Phonak Audéo Sphere

LE Audio has enhanced capabilities over traditional Classic Bluetooth. It is a completely new standard and should be a big improvement. I expect that when they add LE Audio, they will keep Classic as well, because so many devices don’t support the new standard yet.

WH

2 Likes

Improving connectivity, rechargeable battery capacity, and battery “swappability” are things that are fairly easy to implement. But I remain extremely sceptical that AI can somehow help us with speech intelligibility or magically enhance the “cocktail party effect”. I wouldn’t spend my R&D money there (R&D is a major factor in why HAs cost so much, not the components, and it is us paying for it, whether it delivers or not). Filtering out one voice from a set of similar speakers requires understanding what is being said, context, and a few good guesses about where your speaker of interest is going next. This interpretive activity is purely cerebral; it happens between our ears, not in them. Happy to be proven wrong, though!

1 Like

Are you talking about Auracast or MFi/ASHA when referring to Bluetooth LE / LE Audio? It gets confusing!

Auracast falls under “technically achievable” and connectivity, so yes! My point is that I am very sceptical about AI and the return on its R&D investment.

1 Like

Yeah, but in marketing these days everything has to be AI. AI in hearing aids is, I think, one of those things that will improve iteratively over several generations until suddenly it seems magical.

They are bringing AI on-chip denoising because directional microphones have plateaued, so this is a game changer. Yes, other manufacturers have been using DNN tech, but not in this way.
If it delivers and battery life isn’t terrible, you’re probably looking at a substantial lead for Phonak in signal processing for noise for quite some time.
They have been developing this for a long time, so any response from competitors is years away.

6 Likes

Thanks for (re-)forwarding the link; that article does indeed look very impressive. In the end, hearing is believing…

5 Likes

LE Audio is completely new and has nothing to do with MFi or ASHA.

WH

2 Likes

Hearing is believing, for sure. My colleagues who got to listen to it were blown away, but they were also in a controlled lab environment. I’m looking forward to finding out how it sounds out in the world; we’ll know in a few weeks. AI is everywhere, but denoising is a realistic application that has already been successful in bigger systems, unlike other industries where AI is just a weird gloss. But yes, hearing is believing.

I disagree. Good hearing aid connectivity means better hearing on the phone and in meetings. That’s really valuable for a lot of people.

But I agree with user34, too. Classic Bluetooth has been solid, and it’s good enough for now. Of course we all want everything all at once, but given the choice I’ll take big gains in noise and keep puttering along with Classic Bluetooth for a while.

10 Likes

The problem is that LE Audio is new to most devices, and it will take some time to iron out the bugs. One thing that is needed is firm standards agreed between phone manufacturers and hearing aid companies. That is going to be very hard due to the differences in size and power requirements.
Even though I have a Samsung phone and really like it, Google is the only real choice for hearing aids as far as Android goes. And Apple has lost my support, as it seems to keep making changes to its standard to make its AirPods better without telling the hearing aid companies about the changes until after the fact.
I will be getting a Google Pixel next time.
While I prefer Android, way too many people don’t do any research about phones and then blame compatibility issues on the aids instead of on the real problem.

By the way, I prefer over-the-ear headphones over my hearing aids for most streaming. Why? People don’t see my aids and just start talking, not giving me a chance to stop the streaming, and then they get mad at me for not living my life on their whim. It never fails: they don’t want to talk, so I entertain myself by listening to an audiobook or music, and then they decide they want to say something and are annoyed that I don’t dedicate every minute of my life to them. I have a life too. So I wear the over-the-ear headphones to broadcast what I am doing.

3 Likes

I guess we just have to wait and see what Android 15 will bring us.

Furthermore, BT connectivity is essential for me. Without it I would have difficulty hearing on the phone, etc.

Over the past month I have extensively tested different phones with my Oticon Intent and Resound Nexia. For me, music streaming and hands-free calling were important in this test.

Unfortunately, MFi does not work well here in the house because there is too much interference, making the connection very poor. So I did some research; what follows is purely my own personal experience.

I have tested various devices, from the iPhone 15 Pro and Max to various Android devices such as the Samsung S23+ and Ultra, S24 Ultra, Xiaomi 14, Google Pixel 8 Pro and Sony Xperia 1 VI.

What I find about the iPhone is that the Max has a better MFi range than the Pro version. Perhaps it has to do with the placement of the antennas?

Samsung sounds reasonably stable, but the latest update has indeed thrown a spanner in the works with regard to hands-free calling. For the rest it works quite stably, I must say.

Xiaomi fell out of the running fairly quickly for me. Nice device, but very inconsistent when it comes to LE Audio. Perhaps it will work better with future updates?

The Google Pixel 8 Pro is also not all that bad, but the mediocre hardware internals and the weak modem are just a big negative for me. The device quickly suffers from overheating and problems with LTE / 5G. LE Audio does work stably with the latest firmware version; I can walk through the entire house, including the garden, without dropouts.
Hands-free calling works great. I am also interested in what the Pixel 9 and especially the 10 will bring us.

I am personally most satisfied with the Sony Xperia 1 VI. LE Audio works fairly stably, just like on the Google Pixel. Walking throughout the house is no problem, and hands-free calling works great.

3 Likes

I totally agree that a major issue is people not doing research before purchase and then blaming it on the phone or hearing aid. I’m less convinced that Google is the answer for Android. I think both Samsung and Google have issues (and also strengths), but again do your research if connectivity is important to you…

2 Likes

At this time I don’t think the standards are locked in stone.
The problem I see is the same one that has been around since at least the late 1970s: technology development moves faster than hardware and software can keep up with. It has been out of sync forever, and I don’t see it becoming in sync anytime soon. Development is ahead of hardware, and hardware is ahead of software.

1 Like

Different voices also have different pitches and timbres.

My personal experience after wearing Lumity 90s for 2 years is that they will lock onto the loudest voice if you are in a crowded restaurant with many conversations going on in close proximity at the same time. Basically you can’t hear the person you are talking to at the table, but you can perfectly hear the loudmouth guy at the table behind you…hahaha. Not sure how AI would deal with this kind of scenario. With Phonak’s previous version of Stereozoom, the hearing aids would tell the microphones to go into a narrow forward-focus mode and turn down the rear microphones so that you could hear the speaker you were looking at. Lumity’s new version of Stereozoom seems to be dynamic and will focus on what it thinks you want to hear…but sometimes gets mixed up. Maybe Phonak’s AI chip will leverage the accelerometer to figure out which way your head is pointing and then do a better job guessing.
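Loosely, that older narrow forward-focus mode boils down to classic two-microphone directionality, something like the sketch below (the spacing, sample rate, and signal handling are illustrative guesses on my part, not Phonak’s actual design):

```python
import numpy as np

FS = 48_000   # sample rate in Hz (assumed)
D = 0.01      # front-to-rear microphone spacing in metres (assumed)
C = 343.0     # speed of sound in m/s
TAU = D / C   # time a wavefront needs to travel between the two mics

def front_facing_cardioid(front_mic: np.ndarray, rear_mic: np.ndarray) -> np.ndarray:
    """Combine two omni mics so sound arriving from behind is cancelled.

    A wave from behind reaches the rear mic first and the front mic TAU
    seconds later; delaying the rear signal by TAU and subtracting nulls
    that direction, while sound from the front mostly survives (real
    designs also equalise the resulting high-pass response).
    """
    n = len(front_mic)
    freqs = np.fft.rfftfreq(n, d=1.0 / FS)
    delay = np.exp(-2j * np.pi * freqs * TAU)   # fractional-sample delay
    rear_delayed = np.fft.irfft(np.fft.rfft(rear_mic) * delay, n)
    return front_mic - rear_delayed
```

The dynamic version would then mostly be about deciding, moment to moment, how much of that forward-focused mix to use versus plain omni.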

Thing is…in the restaurant scenario, it is very difficult to sort out what you want to hear. Good example is that you are talking to your dinner guest at the table and the waiter approaches from behind and asks you if you want more wine. How would the AI know to switch focus from your guest to the waiter who is talking from behind? Same thing goes when driving in the car. Perhaps their approach with the AI is to kill the noise and let you hear all the chatter from all directions? This is sorta the Oticon “Open” strategy.

Jordan

6 Likes

Apples to oranges, but color me cynical.
We have been using AI with surveillance cameras for a couple of years. It uses very large models of object images: faces, animals, persons, vehicles, etc. Like most AI applications, it takes a query (an image, in our application) and compares it to that database of objects. In our simple application, we have a choice of models covering different types of animals, vehicles, people, faces, whatever. It takes the input, scans the model, and comes back with a result and a confidence level.
Our application is simple: we only use a model with vehicles and persons. We look for them near gates and entrances and throw out any false triggers caused by headlights (at night), dogs and cats, blowing tree branches and shadows. At night, a local cat with a long shadow has been identified as a person (this really angered the cat).
Using a mediocre video card’s GPU (with 4 GB of memory) on a reasonably fast computer, it takes up to 50 milliseconds to get a result. Without the GPU, it takes hundreds of milliseconds.
So, in my alleged mind, AI’ing your way through party, bar, or meeting voice noise would seem to require a model of some sort to reject all “bad” voices and/or accept all “good” voices, all while keeping close enough sync with the visual of the voice source(s) you want to listen to. Hmmm, maybe tune out your spouse at times?
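For what it’s worth, the “accept good / reject bad” step in our camera setup reduces to something like this (a rough sketch; the class names, threshold, and detection structure here are illustrative, not what our software actually exposes):

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # e.g. "person", "vehicle", "cat"
    confidence: float  # 0.0 .. 1.0, as reported by the model
    bbox: tuple        # (x, y, w, h) in pixels

WANTED = {"person", "vehicle"}
MIN_CONFIDENCE = 0.6   # assumed cut-off; tuned per site to trade misses vs. false alarms

def keep_real_triggers(detections: list[Detection]) -> list[Detection]:
    """Drop anything that is not a sufficiently confident person/vehicle hit,
    which is how headlights, cats and moving shadows get filtered out."""
    return [d for d in detections
            if d.label in WANTED and d.confidence >= MIN_CONFIDENCE]
```

The hearing-aid version would have to do the voice equivalent of that in real time, within the sync and power constraints mentioned above.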
I also think that Phonak’s competitors are working just as hard.

2 Likes

Ha ha. Maybe. My Lumity are just over a year old, so I can’t justify it yet. But I’m interested in what’s new.

3 Likes

Maybe you have set the sensitivity too high for the Speech/Motion Sensor? Or maybe it is possible to give priority to the front speaker in the Target software?

Cocoa hearing aids should be ready before too long; I would say 2027.

I wonder if the combination of Lumity + Roger On will still be a better set up than the Sphere alone…