Inspire the Possibilities with Bluetooth LE Audio and Auracast

February 16, 2024, 14:10
An Interview with Chuck Sabin, Senior Director, Market Development, Bluetooth SIG.

As reported in my CES 2024 First Impressions and CES 2024 Lasting Impressions articles, we could feel the momentum, with important technologies already making a difference in the audio products that we will soon see on the market. As expected, at the top of the list are the first LE Audio and Auracast-enabled products.

The Auracast Experience at CES 2024 raised much more awareness about the possibilities of the technology and certainly generated a lot of press coverage. But it also prompted several questions from manufacturers who are eager to get to market.

Anticipating that scenario, after attending the Auracast Experience at CES 2024, I scheduled an interview with Chuck Sabin, Senior Director, Market Development for the Bluetooth SIG. The goal was to discuss the Bluetooth SIG's perspective of the roadmap for LE Audio and Auracast.
 
CES2024_ChuckSabin-IMG_2124-TWeb.jpg
Chuck Sabin, Senior Director, Market Development, Bluetooth SIG.

audioXpress: Following the Auracast presentation at the Mobile World Congress last year - which we reported on and was the first public presentation - less than a year later, at CES, you actually showed us some product announcements... Give us the Bluetooth SIG perspective on the market momentum and why you're promoting these Auracast experiences.

Chuck Sabin: We felt we were in a unique position to be able to build up this type of an experience on behalf of our members, because the membership was trying to figure out "how we communicate what is possible, what can we deliver on this?". And that's where we felt we needed to take a position of leadership to be able to give this sort of immersive experience, and to provide a forum in which real products coming into market can actually demonstrate interoperability across the platforms.

To a degree, doing this has also accelerated the market, because - as you saw at the Mobile World Congress, right? - it started off as a very minimal collaboration with Nordic, Qualcomm, NXP, and Intel... I mean, that was really it, right? It was not that many companies.

But then all the other companies saw what we were doing and realized, "Oh, okay, we need to be a part of this. We need to be vocal about what we can do and where we are" to the point where we now have a number of different collaborators that are a part of this - both at the transmitter and silicon level, and at actual product level and delivery. You're starting to see real products implemented into the market. You're seeing people be more vocal about what they're actually delivering on. And hence why we now have a product showcase as well. And we even got companies that were announcing things that they didn't tell us that they were announcing!

It is exciting to see the momentum in the market. Of course, the bigger thing will be when we get the native implementation into the iPhone and into Android. Android's announced it as part of Android 15 (Note: expected to be released in Fall 2024), but we haven't seen what their starting point will be. I suspect it'll be a little bit more focused on the personal sharing scenarios. But if they're able to do those scenarios, then they are able to be an assistant for everything as well. 

You know, we've seen it with Samsung, which updated its headphones, and with the One UI 6.0 release that introduced Auracast into Samsung devices as a native implementation. And not only does that work with the Galaxy Buds 2 Pro, you can actually take the ReSound Nexia hearing aids, which is an announced product as well, and you can pair those with the S23, you can scan for Auracast broadcasts, and you can select a particular one, and you can hear it in that hearing aid. So, the interoperability is there, and I'm pretty sure that those two companies did not necessarily test with each other, but the specification proved itself out by being able to operate properly within that range.

CES2024_ChuckSabin-IMG_2071-Web.jpg
"I think the momentum will really ignite even further once Android and iPhone have native support."

People ask me, "How long do you think it's going to take for this to really take off?" Well, I think it's a five-year cycle, right? Starting with the introduction of the specification; and then there are certain milestones that are hit; and then within five years, 100% of new devices have the latest and newest technology in them.

We saw the same thing with the Bluetooth LE radio. The LE radio initially started off with all the IoT-related functionality for Bluetooth, and sports and fitness and wearables, all the things that you see that people connect to. There was all this momentum building in various ecosystems about wanting to deliver those things, sports and fitness devices, heart rate monitors, and so on. But they were all waiting for, you know, the platforms to build it. And as soon as they did, it took off.

So, I'm pretty confident in my prediction that within five years of spec adoption, which we're now a year plus past, that we will see all the new devices, the platform devices, and then that'll allow for the overall market to just adopt completely.

CES2023-LEAudio-Auracast-Showcase-IMG_2067-Web.jpg
The Auracast Experience at CES 2024 included for the first time a gallery of available Auracast products. Aside from known names such as Samsung and GN Group, there were a surprising number of lesser known brands introducing TV transmitters, Bluetooth speakers, TWS earbuds and headphones, and Bluetooth Auracast audio transmitters of all types, including a model from MoerLab that can be configured as a transmitter or receiver (MoerDuo). Another interesting product was the Cear pavé, a portable cube speaker from Cear, a Tokyo-based company that creates and licenses interesting acoustic signal processing technology.

aX: Talking about that timeframe, what happened between 2020, when you did the LE Audio announcement here at CES, and the public release of the specifications?

CS: Yeah... Specification development is hard (laughs). Why did it take so long to get LE Audio complete? The hearing aid companies ask me about this all the time as well. And the hearing aid companies were the ones that came to the Bluetooth SIG in the first place, right? And they said, look, we need to standardize because we're all over the place. It's the wild west out there, and Google's doing something with ASHA, and Apple has MFi, and we really need to standardize across all the platforms. So, we really want to do this on LE, get the low power and so on...

But when you crack open a specification behind 1.5 billion Bluetooth-enabled receiving devices shipped each and every year, a lot of people get involved, right? There's a lot of interest in ensuring that the specification is going to do what it's supposed to do. And that takes time. It takes time for those companies to work through all the issues associated with the specification.

So, what happened between CES 2020 and when the specification was finally adopted was that we underestimated what it was going to take to test all the components of the specification. And maybe we were overenthusiastic about how quickly that would actually occur. It just took longer than originally anticipated, and it required updates to the specification to get it right.

Because the last thing you want to do is release it and not have it be right. 

CES2024-BluetoothLEAudio-Transmitters-IMG_2052-Web.jpg
At the CES Auracast Experience, there were multiple broadcast transmitters, from USB dongles to TV transmitters from ReSound (GN Group) to the Ampetronic/Listen Auracast transmitter that was officially launched at ISE 2024.
CES2024-Auracastdemo-IMG_1044-Web.jpg
The Auracast Experience at CES 2024 welcomed a great variety of attendees, from key industry executives and manufacturers to the consumer electronics and technology press.

aX: Yes, this is a collaborative process. And you have your members, but probably some have more suggestions and more contributions than others...

CS: In the end, they all want to get it right. You have got to think through all of the associated implications. You have got to get the silicon manufacturers to build the silicon as the initial layer and then you have got to get people to build on top of that with the stacks and so on. 

Bluetooth LE Audio is a group of specifications that make up LE Audio. And LC3 was one of them. That's the codec, and it was the first one released, because people needed it for their early implementations of LE Audio, and the codec is so integral to the overall functionality. You need to have that set so people can start testing their systems against the approved specification. That was our target.

And then everything kind of grew from there to get the overall development complete. 


aX: And that's why the specification was built around one mandatory codec, correct? A question we've been getting constantly is whether there is space for more codecs in the architecture that has been defined.

CS: Yes. The concept of doing additional codecs is possible. I know it's maybe a very simple example, but Fraunhofer has its LC3plus. So LC3plus takes advantage of other parts of the codec that are not standardized as part of Bluetooth. Ultra-low-latency, lossless-type capabilities, and other stuff that they might do for gaming applications and so on. But it is considered a third-party codec, even though it carries roughly the same name.

What that means is that other third-party codecs could potentially be a part of it as well. But LC3 will be the new baseline. And it's a higher bar baseline than SBC, most definitely. And it is also the baseline that enables what you saw here today. The Auracast broadcast capabilities that come through the LC3 codec.

CES2024-Auracast_accessibility-IMG_2053-Web.jpg
The use of Auracast broadcast applications for accessibility in a lecture hall. In this simulated scenario, participants show up a few minutes late and can only find seating at the back of the auditorium where, often, it can be difficult to hear the presenter. However, since this auditorium is set up to support Auracast broadcast audio, attendees can listen at the optimal volume on their own earbuds. And, since multi-lingual options were offered by the venue, participants could choose between hearing the lecture in English or Mandarin. This is an exciting scenario for Bluetooth LE Audio.
CES2024-Auracastdemo-IMG_1048-Web.jpg
The Auracast Broadcast Audio at an airport gate is always one of the experiences that is mostly frequently mentioned in reports from these Bluetooth SIG events. And yet, it is one of the most unlikely scenarios for Bluetooth LE Audio given the technical challenges of supplying reliable coverage in common areas of airports - even in airport lounges.

aX: In that timeline, Auracast comes almost at the latest stage, but the audio sharing idea was there...

CS: Yes, the personal audio sharing idea was there but it had a limited thought process at the time, right? And that's why we've kind of opened it up. If you look at LE Audio, there are three things that we talk about. First are the performance enhancements, allowing higher quality, lower power, lower bandwidth requirements. Getting the higher quality performance out of devices. That includes isochronous channels, multichannel audio capabilities, left, right from the source, all of that comes from the performance enhancements of the architecture of LE Audio.

The second thing that we always talk about is the standardization for hearing aids - a specific hearing access profile and service that's being implemented by the hearing aid companies, specifically to handle and manage that type of resource-constrained device. Ultimately, what we call the Hearing Access Profile and the Hearing Access Service should take over that part of it, because all hearing aids will adopt that sort of platform scenario.

And then the third part was the broadcast capability. Now, broadcast is a part of LE Audio. Auracast is a specific implementation of broadcast. So, you're going to see broadcast used in a variety of different ways. You're going to see it for transmitters to multi-point speakers, to surround sound systems, to whatever, because you've got multichannel capabilities. You can parse the channels directly from the source and so on. 

But Auracast is a defined way of using broadcast for this type of public space orientation.
 

CES2024_ChuckSabin-IMG_2119-Web.jpg
"Broadcast was always a part of LE Audio, and it was always a part of the adoption of the specification. Auracast is just a defined way of using it."

There's no difference in terms of hardware, chips, or anything else. It's basically just a defined way for the market on how to use the broadcast capabilities so that there's universal access to everyone in the world. Primarily, it hinges on the difference between standard quality and high-quality implementations of the broadcast capability. And the standard quality is the universal quality for Auracast. Doesn't mean you can't have an Auracast broadcast at the higher quality level, but you at least must have a similar broadcast at the same time in the standard quality because you have hearing aids and other very resource-constrained devices that you want to be able to access that component.

An example would be, let's say, a movie theater. Go into a movie theater and they might have a high-quality broadcast for augmented audio solutions, right? High-end headphones or whatever it might be. But if you're calling it an Auracast broadcast, you're going to also have a standard quality broadcast so that anybody can get the hearing assistance or the augmented audio experience in hearing aids or whatever else. But it's always using the baseline of the LC3 codec. 

So 16 kHz to 24 kHz is the standard quality and 48 kHz is the defined high quality. You can even go beyond that, but that's beyond where we've thought about with Auracast. And that's for these surround sound systems and other things where they're looking at higher quality and where they've got additional resources to be able to do the broadcast in that higher quality.

So, broadcast can be done at 16, 24, or 48 kHz and beyond, but it's 16/24 that's the defined standard quality for anything that says Auracast. But you can actually do it at 48. And I believe either a phone or a laptop here is actually operating at 48 kHz versus 16/24.
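To illustrate how these quality tiers translate into bit rates, here is a minimal sketch of the LC3 bitrate arithmetic for a few broadcast configurations. The configuration names and octet counts follow the published Basic Audio Profile (BAP) settings as we understand them; treat the exact values as illustrative rather than normative.

```python
# Sketch: LC3 bitrate math for a few Bluetooth BAP broadcast
# configurations. Names like "16_2_1" encode the sampling rate (kHz)
# and a frame-duration index; octet counts are per LC3 frame.
# Values are illustrative, taken from the published BAP tables.

CONFIGS = {
    # name: (sample_rate_hz, frame_duration_ms, octets_per_frame)
    "16_2_1": (16_000, 10.0, 40),   # standard quality (Auracast baseline)
    "24_2_1": (24_000, 10.0, 60),   # standard quality
    "48_2_1": (48_000, 10.0, 100),  # high quality
}

def bitrate_bps(frame_ms: float, octets: int) -> int:
    """Bitrate = bits per frame divided by frame duration."""
    return round(octets * 8 / (frame_ms / 1000))

for name, (rate, ms, octets) in CONFIGS.items():
    print(f"{name}: {rate // 1000} kHz, {octets} octets per {ms} ms frame "
          f"-> {bitrate_bps(ms, octets) // 1000} kbps")
```

Running this reproduces the familiar figures: roughly 32 kbps and 48 kbps for the standard-quality tiers, and 80 kbps for the common high-quality setting.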

ISE2024_ChuckSabin-Listen-IMG_1922-Web.jpg
Chuck Sabin promoting Auracast demonstrations at ISE 2024 in Barcelona. 
ISE2024_ChuckSabin-Listen-IMG_1920-Web.jpg
A different environment, intended to inspire system integration professionals, which will be key for Auracast implementation in public spaces, mostly highly regulated environments.

aX: Going back to this type of Auracast presentation that you started at Mobile World Congress. Did you plan it to address both public perception and the questions from manufacturers?

CS: Yes, to a certain degree. You want to create demand, right? And you want people to envision what is possible. There are two reasons why we didn't go out on the show floor. One is that wireless demos are notoriously bad at CES because of how much RF noise is in the environment. So, we wanted to make sure we put our best foot forward in this type of environment. That's why we're here. The other reason is that for this particular event we were targeting media, analysts, and influencers, as well as companies that are looking to potentially implement this in their devices. So, we are going directly to companies that build and advocate for solutions for consumer applications of the technology.

The way I look at it is that there's a part of Auracast that's designed for hearing accessibility scenarios. And then there's a part of Auracast that speaks to augmented audio solutions, right?

And the hearing loss community wants us to really focus on the hearing access components of it. The message that we continue to deliver back to the hearing access ecosystem is that if we're successful in delivering a broad consumer application of broadcast to all the public spaces out there, you are the significant beneficiary of that delivery. And when you talk about the hearing access side of the market, they target a very, very necessary market, but a very small number of people. If you really want broad adoption of the technology and broad adoption of the capability, you should appeal to the masses and deliver additional design for specific needs as we go along.

Coming to CES was really about trying to identify to people that this isn't just about hearing loss. This is about augmented audio and consumer application and challenges that we all deal with on a regular basis - from an audio accessibility for everyone perspective.

 

aX: What about developers and ODMs?

CS: This experience is created so that they can see the vision of what we're expecting. What we want is to give the developer an expectation of what the market is looking for. Basically to say, "what you develop should end up looking like this." We also need developers to understand that we're talking about universal access to audio for everyone. As long as you have my receiver and your transmitter, it works perfectly. This is about people being able to go into any educational institution and know that it's going to work, right?

From a developers' perspective, we've had programs at the Bluetooth SIG where we've tried to address developers directly. But really, most of their information comes from their suppliers, right? - Qualcomm, Nordic, Silicon Labs, whatever. We try not to step on the suppliers' toes, but we're also trying to give the suppliers the ammunition that says, this is how you use this chipset with the specification to deliver these types of applications.

We definitely care about the developer. But what we want to show the developer is the vision of what we want. As a quick example, we had a person come in - she's a coder, developer around this type of functionality at a prominent Bluetooth SIG member company, and she was like: "It's so good to see how this stuff is going to be used in the market. It's so much nicer to see what the fruit of my labor is going to look like in the market and how it's going to impact people."

So, to a certain degree, we're the motivators, right?
 

ISE2024_Auracastdemo-IMG_1907-Web.jpg
Unlike the demonstrations held in previous shows, which were promoted in more reserved locations avoiding the wireless unpredictability of the show floor, ISE 2024 Auracast demos worked in very real-world conditions, with multiple sources and transmitters available - as shown on the Samsung UI screen.

aX: That brings us to the question of interoperability. How do you address interoperability, not only for Auracast but across LE Audio implementations?

CS: Yeah... so standardization of the specification is a start, right? What we do as an organization is hold what we call UnPlugFests (UPFs) - yes, we call them "unplug fests" because we don't have any cables, everything's wireless. They're our model for helping companies test interoperability in a public way.

We operate those quarterly. It's open to anybody. You just register and show up. And we just provide as much space as necessary for the number of companies that are doing it.

It's essentially a speed dating environment. What are you trying to test? What are the things that you want to test? And that goes into a database of what everyone else wants to test. And they test and they talk, and they figure stuff out. That's our model for helping support interoperability testing of products in the market because it's very hard to test otherwise. Even Apple comes to the UPFs and tests against new products coming out in the market. 

You have companies like Teledyne LeCroy and Ellisys and so on. Some of them provide device libraries as well - they purchase devices, cars, head units, whatever it might be - and you can go in and test against those sorts of devices that are in the market if you can't afford to build your own platform.

Information that comes from the UPF about interoperability actually makes it back into the specification, if there's clarifications that need to be made. The companies that are involved in those UPFs, many of them are also involved in the working groups. If they find a problem, they report it, bring it back to the specification, the specification gets updated, and that helps continue to drive more and better interoperability in the market for the future.

That's our part of trying to help move interoperability into the market.

Again, as I mentioned, many of the companies that are involved in the specification development itself are also involved in those UPFs. And then they get feedback from the UPFs that then ultimately makes its way back into the specification if necessary. Most of the time, it's clarification issues. It's interpretations of the specification. Language is messy, right? And when interpretations happen, then you have to clarify. Clarifications go into the errata in the specification. Anybody who gets the specification has access to both the specification itself and any errata that has been filed. When we version the specifications, that errata gets folded into the next version of the specification. So, our process is designed around continuing to increase and drive additional interoperability into the market.

CES2024_ChuckSabin-IMG_2111-Web.jpg
"All innovation for audio from Bluetooth for the future is going to come in on LE audio."

aX: Do you still believe there are application scenarios for LE Audio that you didn't envision necessarily in the specifications? 

CS: That's a great question. When the LE radio was designed and delivered, I was constantly asked: "What's the killer application?" And I would say, honestly, I can't tell you what the killer application is, because this is a "if we build it, they will come" type of scenario. And that came true, right? We saw things that we never even dreamed that someone would want to have as a connected device. Many of them are here at CES. Many of them we won't even talk about in public. But there are many different types of devices. So, it was basically enabling the developers' desire to build whatever they felt was necessary.

And, we ended up seeing, you know, from tools to toys, toothbrushes, sensor devices, coffee cups, paper airplanes, whatever it was, right? We saw it coming out. 

The great thing about the LE Audio architecture is that it's built in a similar fashion, in that you can build your innovation on top of the architecture that is available to you now. You're not looking at some monolithic application. There's a profile layer now for LE Audio where you can do hearing aids, you can do telephony, you can do gaming. You may look at other profiles for other ways to potentially use audio for a variety of applications. But the architecture of LE Audio provides you - or the organization - that flexibility to do that now and build new stuff.

To me, the architecture is going to allow developers to envision whatever they want to be able to do and do it on the LE Audio architecture. I expect to see a lot of unique stuff come into the market in the future as well.

 

aX: Here at CES we are seeing that with Bluetooth LE Audio-based microphones. Apparently, we're going to see binaural headsets that can record, and they will be wireless...

CS: Yeah, and a lot of that is enabled through the isochronous channel capabilities, multichannel capabilities that can separate audio from voice, being able to do mic recording, different surround system recording capabilities, and so on. There are a lot of talented, intelligent people in the audio world who are going to find a number of unique ways to use these capabilities and this technology, I'm pretty sure. aX


This article was originally published in The Audio Voice newsletter (#457), February 15, 2024.
About Joao Martins
Since 2013, Joao Martins has led audioXpress as editor-in-chief of the US-based magazine and website, the leading audio electronics, audio product development, and design publication, also working as international editor for Voice Coil.