HearAdvisor - Compare Hearing Aid Performance

The new HearAdvisor website is finally live. We’re planning a proper launch with a press release, etc., next week, but in the meantime, I would really appreciate you guys and gals taking a look at the site to see what you like, what you hate, and what we can do to improve things. I think you’ll find the leaderboard, audio files, and gain curves particularly interesting. Thanks in advance!

PS. Oticon Intent will be added next week after we’ve tested it. It’s on the way to the lab now.

Here’s a teaser:

Update. Here’s a copy of our whitepaper: OSF

6 Likes

I am surprised, or maybe even confused, by some of your results. The ReSound Omnia 9, Beltone Achieve 17, and Jabra Enhance Pro 10 are identical hearing aids with 17 channels (scored at 4.0). The Jabra Select 300 is the same as well, except it has only 12 channels like the Omnia 5 (scored at 4.3).

Additionally, the ReSound Nexia is a newer product which does have better sound in noise and better noise reduction, but that is not reflected in your testing (scored at 3.9). Also, the tuned streaming of the Nexia 9s, which is the same hearing aid as my Jabra Enhance Pro 20s, sounds like nothing compared to the awesome results of my properly tuned EP20 HAs. Did you not tune the music program, and particularly the streaming function?

Something seems off with these results.

I must say though that this is a beautiful site with great information. Certainly a lot of work was put into it! Congratulations on getting it up with so many hearing aids to review, listen to and compare. Thank you for this treasure trove!

1 Like

My 20 years of experience has taught me that the hearing aid experience is 40% the aids, 40% the audiologist’s fitting (which is 75% listening to me, the patient), and 20% my willingness to be patient and give the above a chance. By the way, that has led me to wear Oticon aids for over 14 years now.

4 Likes

Sorry, but the info seems worthless to me. As soon as I see an OTC win as the “best” aid, with two more OTC aids in 2nd and 3rd, you totally lost me.

4 Likes

I have to agree with @RSW that I also find the results confusing. The problem is probably with me, but I do need an explanation. @RSW has already presented a comparison of ReSound/Jabra products.

As an additional example, the Sony CRE-E10, which is a rebranded version of the Signia Active Pro with their X technology from a couple of generations ago, gets the highest score at 5.0. The Signia Styletto with AX technology, from the generation prior to their current technology but more recent than the X, scores 4.2. The Signia Pure Charge & Go IX with their latest technology scores 3.9. The Signia Silk, also with the new IX technology, was even worse at 3.7. (Yes, I know that it’s a different kind of hearing aid, but still . . .) What’s going on here? Is the manufacturer repeatedly releasing new hearing aids that perform less effectively than the previous generation?

And this trend doesn’t seem to be restricted to Signia and ReSound. The Oticon More scores 4.6. A year later, Oticon released the upgraded Real, and it drops to 4.4. Not a big difference, but shouldn’t the new generation be at least as good as the previous one, if not a little better?

And then there’s Widex. The Moment was released in 2022 and scores 4.3. In 2024 they released the Moment SmartRIC, and it drops slightly to 4.2. Not a big deal, since the basic technology hasn’t changed, but it’s the same pattern as Oticon.

Next is Starkey. The Evolv, released in 2022, scores 4.1. The new and improved Genesis, released in 2023, scores 3.5. Not even close!

I do realize that sometimes a manufacturer may give up a little in one area to gain improvement in a problem area. That may explain the difference between the More and the Real, where wind suppression and a sudden-noise stabilizer were added; those improvements were not necessarily reflected in the lab results. Or the Widex Moment and SmartRIC, which introduced a shape change without any new technology. But that doesn’t explain the overall trends - especially where there is a big drop-off from the older generation to the newer one.

I remain confused.

3 Likes

I think you’re all missing the point. Sony is best by far. We should go get 'em and buy Sony stock. I mean, the rating is based on objective reviews…

Sony may work for some, but I have no intention of buying any over-the-counter hearing aids. I have had a hard enough time finally finding an audiologist who can properly fit my hearing loss.

Great feedback!! I knew sharing it here first was the right idea. Some background: the idea is to bounce this off real people to firm up a set of FAQs for the Leaderboard page, and I feel like we’ve already made a ton of progress. Let me respond to some specific points and questions. My cofounders will probably make an appearance later, when they’re available, to clarify anything I’ve missed.

We are confused by the difference between the ReSound and the Jabra too, but the results are real, and we’ve performed retests to ensure consistency. I can say that the initial-fit condition is most heavily weighted in our SoundScore, to reflect the fact that the majority of hearing aid fittings do not include fine tuning to REM targets; the provided amplification is typically closer to what is prescribed by the manufacturer (often referred to as “first fit”). So it is possible that Jabra has a different first-fit gain calculation approach than ReSound. There are also small differences in the tech, like GN reserving their 360 all-around directionality (directional in one ear and omni in the other) exclusively for ReSound, as I understand it. So the products are not identical. I’ll let Steve chip in with any further comments when he becomes available later.
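To make the weighting idea a bit more concrete, here is a minimal illustrative sketch in Python. The weights and sub-scores are made up for illustration; the actual SoundScore formula and weights are not spelled out in this thread.

```python
# Illustrative only: the real SoundScore formula and weights are not published here.
# The sketch just shows how weighting the initial-fit condition more heavily than the
# tuned condition can change how two devices rank against each other.

def sound_score(initial_fit: float, tuned: float,
                w_initial: float = 0.6, w_tuned: float = 0.4) -> float:
    """Combine two hypothetical sub-scores (0-5 scale) into one composite score."""
    return round(w_initial * initial_fit + w_tuned * tuned, 1)

# Two hypothetical devices: A has a strong first fit, B only shines after REM tuning.
device_a = sound_score(initial_fit=4.5, tuned=4.4)   # -> 4.5
device_b = sound_score(initial_fit=3.8, tuned=4.8)   # -> 4.2
print(device_a, device_b)  # A outranks B despite B's better tuned performance
```

The point of the example is only that a device with a better out-of-the-box (first fit) response can outrank one that measures better after tuning, if the initial-fit condition carries more weight.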

There are so many ways to set up an evaluation in terms of speaker positioning, speech and noise channels, and other sound-presentation factors, and then there are many factors at the ear level, like the acoustic setup of the device (dome type, etc.) and microphone positioning behind the ear. All I can say is that HearAdvisor is the only lab subjecting every hearing aid on the market to the exact same acoustical challenge situations and following the exact same device setup procedure (see the decision trees in our whitepaper for more on that), and that our results are real. Not pointing the finger at GN here, but manufacturers are known to take liberties in their test setups when showing differences between the previous and new generation, to give the later generation an edge. However, it’s also very possible that ReSound in good faith established a perfectly real difference in their test setup that we simply aren’t seeing in ours.

I will leave that for Steve, as he’s the Lab Director and is responsible for all the fine tuning. We leave as little as possible to subjectivity, and we use a Matlab program as a real-time assistant for consistent REM target matching in the Tuned condition, but I’m not sure if this applies to streaming.
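For anyone curious what “target matching” means mechanically, here is a minimal sketch of the general match-to-target loop. This is not the Matlab assistant mentioned above; `measure_real_ear_response` and `apply_band_gains` are hypothetical placeholders for probe-microphone and fitting-software calls, and the band list, tolerance, and pass count are illustrative assumptions.

```python
# A minimal sketch of an iterative REM match-to-target loop (illustrative assumptions only).

BANDS_HZ = [250, 500, 1000, 2000, 3000, 4000, 6000]
TOLERANCE_DB = 3.0          # hypothetical "close enough" window around each target
MAX_PASSES = 5

def match_to_targets(targets_db, measure_real_ear_response, apply_band_gains):
    """Nudge per-band gain until the measured real-ear output sits within tolerance of target."""
    gains = {f: 0.0 for f in BANDS_HZ}                 # adjustments relative to first fit
    for _ in range(MAX_PASSES):
        measured = measure_real_ear_response()          # dict: band (Hz) -> measured dB SPL
        errors = {f: targets_db[f] - measured[f] for f in BANDS_HZ}
        if all(abs(e) <= TOLERANCE_DB for e in errors.values()):
            return gains                                # every band within tolerance
        for f, err in errors.items():
            gains[f] += err                             # push each band toward its target
        apply_band_gains(gains)                         # write adjustments back to the device
    return gains                                        # best effort after MAX_PASSES
```

The value of automating this as a real-time assistant is consistency: every device gets driven toward the same prescriptive targets by the same procedure, rather than by eye.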

Thank you!

100%, and that is exactly what we hope to communicate by showing the difference that Tuned can make to sound performance versus first fit!! People must find a provider that performs REMs and knows what they are doing if they want to get maximum benefit.

Remember that the target audiogram for all fittings is the N3 audiogram (mild sloping to moderate hearing loss), so it should come as no surprise that an OTC device manufactured by one of the world’s leading prescription hearing aid companies, and offering a more occluded fitting than many of the other devices tested, would perform well. Tala has the same benefit regarding occlusion. However, we do intend to include a long FAQ about the downsides of occlusion and the other factors that need to be considered when looking at our SoundScore. On the Lexie B2, again, this is a device intended for mild to moderate loss, and Bose have simply done a better job of providing audibility in the Initial Fit (first fit) condition than prescription manufacturers have, and as I said before, this condition is more heavily weighted in our SoundScore.

Now I’ve simply run out of time to respond, so I will ask Steve to clarify any points I’ve made and respond to your questions, @billgem.

Thank you all for the great questions.

2 Likes

I actually did buy the “Sony” CRE-E10, and I liked them very much! A very good pair of hearing aids, which not only did well with speech recognition but were surprisingly good with speech in noise as well. I returned them, figuring that if they were this good, the more programmable Signia hearing aids with more advanced technology would have to be even better. Apparently not.

Thanks for your feedback everyone.

This is a great point and question. We do not tune devices, or alter the ear tips, for music listening. Our programming focuses on speech clarity, as this is the primary concern of most people. There are many variables here, and we chose a path closest to what most people will experience (e.g., relying on manufacturer defaults). Our whitepaper includes a decision tree for this, but I’m happy to answer any other questions.

1 Like

We were very confused by this too, as hearing aid companies invest a lot in R&D. However, it stands to reason that every product launch is likely not as revelatory as the marketing may suggest (especially given the ever-shorter interval between launches). Something we have noticed that may explain this is a decrease in the default gain settings of the manufacturers’ proprietary fitting algorithms. Around 3000 Hz is the most important frequency range for speech intelligibility, so less gain there will result in lower scores.

Here is the default gain we measured for Starkey Evolv AI (Top) and Genesis AI (Bottom):


Here is the default gain we measured for Signia Styletto AX (Top) and Pure C&G (Bottom):


This trend of lower default gain around 3000 Hz can be seen in many successive product launches. We are still looking into this trend, so thanks for mentioning it.
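To illustrate why a few dB less default gain around 3 kHz can move a speech-weighted score noticeably, here is a small hedged sketch. The band-importance weights and audibility numbers below are made up for illustration (loosely in the spirit of SII-style band-importance functions); they are not HearAdvisor’s actual metric.

```python
# Illustrative sketch: why less default gain around 3000 Hz hurts a speech-weighted score
# more than the same cut elsewhere. Weights and dB values are hypothetical.

IMPORTANCE = {500: 0.10, 1000: 0.20, 2000: 0.25, 3000: 0.30, 4000: 0.15}

def weighted_audibility(band_audibility_db):
    """Sum of per-band audibility (dB above threshold, capped at 30) times band importance."""
    return sum(IMPORTANCE[f] * min(max(a, 0), 30) for f, a in band_audibility_db.items())

older = {500: 20, 1000: 22, 2000: 24, 3000: 24, 4000: 18}   # older generation's default gain
newer = dict(older, **{3000: 18})                            # 6 dB less at 3 kHz only

print(weighted_audibility(older))   # 22.3
print(weighted_audibility(newer))   # 20.5 -> drops by 0.30 * 6 = 1.8 points
```

Because the 3 kHz band carries the largest weight in this toy example, trimming default gain there alone is enough to shift the composite score, which mirrors the pattern described above.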

2 Likes

Thanks, Steve! Much appreciated.

I’ll look forward to hearing what you learn about this trend.

We need an irony emoji. Or is there one that I missed?

My concern with this type of rating website is that the ratings need lots of disclaimers.

The OTC Sonys may be the best OTC aids. They may be the very best aid for some people, but I doubt they are more capable than prescription aids, and yet that’s what the rating says, IMO. But ‘better,’ ‘worse,’ ‘more or less capable’ haven’t even been well-defined.

I’d like to see experts avoid comparative ratings. Choosing a piece of tech like a hearing aid simply can’t be reduced to a single number. The variables are too diverse.

Consider Consumer Reports. At various points in my life I’ve been an expert in 35 mm cameras and hi-fi equipment. None of my friends agreed with CR’s ratings of the products we were expert in; we used our own judgment. But we all used CR for purchases of items we knew nothing about. How does that make sense?

What CR sometimes does that IS useful - and what HearAdvisor can do, too - is report what users say about their choices, as in: ‘We surveyed N people, of whom m, l, o, p people had ___ level of hearing loss. X number of people gave these ratings for those parameters. Y gave these other ratings.’

The forum doesn’t aggregate data, but members provide a lot of good qualitative and comparative info when they explain why they rate the HAs they’ve tried the way they do.

Ratings sites treat the products they cover as commodities. But many items only seem like commodities. There are differences between products. Those differences may be meaningless to most people, but they may be meaningful to lots of us.

No one can test all models of all HAs under all conditions, but people come in lots of shapes and sizes, and people have lots of differences in what they like and dislike. How can you think a single number can describe a HA?

3 Likes

Don’t forget that objective reviews can be done with flawed methods. I don’t know anywhere near enough about HAs or proper testing methodology to say whether that is or isn’t the case here, but I am extremely skeptical that an OTC is “objectively the best option”.

1 Like

This is absolutely something I overlooked, and it makes a ton of sense. Now I’m wondering if my not realizing that was a me issue (e.g., skim-reading something I shouldn’t have) or if that needs to be made more prominent on the site :slight_smile:

Despite my general skepticism and complaining, I think this has the power to be a very useful resource, and I really respect the work that’s been put in here.

1 Like

I think what you seem to be missing here is that we’re merely reporting on the sound performance of the devices, but this is ONLY one dimension that should be considered when purchasing a hearing aid. If it is the absolute most important thing to you (and you are not concerned with all-day use, comfort, occlusion, usability factors, etc) and you have mild to moderate hearing loss, then yes, we are saying the Sony OTC is a really good option…

2 Likes

Hi there @philbob57

Interesting points. Of course, as this project is over a year old now, we’ve spent a lot of time thinking about these very considerations. In terms of the big picture, I don’t agree that user feedback is more valuable than rigorous, objective scientific evaluation. Keep in mind that I’m the guy who started the world’s first real hearing aid review platform and runs this forum, so I’m keenly aware of the value of user feedback, but also of its limitations.

As you rightly pointed out, people come in all shapes and sizes, and this means user experiences with hearing aids will vary widely. Add to this the effect of individual hearing loss and the level of experience and skill of the expert involved, and you have a recipe for chaos. Not to mention the dreaded world of Bluetooth, connectivity, firmware, etc. While there will definitely be some common patterns of praise or complaints when it comes to specific models (I’m excited about the potential of AI/LLMs for tackling this, and have been involved in academic research on the topic), I would argue that this type of analysis is far more useful for understanding general usability issues than for assessing the efficacy of the sound presented.

It’s my belief that you would struggle to get useful information about sound performance for a particular patient population (like mild to moderate loss) without a large group of patients with the same hearing loss, fitted with the same models of hearing aids, using the same fitting protocol, by the same experts, in a true randomized, double-blinded experiment. As you can imagine, that is very unlikely from a funding perspective, and given the delays in performing and reporting this type of research, it would likely yield out-of-date information, probably only good for comparing last-generation models.

I would argue that something is better than nothing here (even heavily disclaimered), as there is a dearth of truly scientific data about hearing aid sound performance, and yes, there are very big real-world differences between the devices on the market. Our work has helped to identify a number of devices that should not even be on a consumer’s consideration list, and that is probably the biggest value a project like this adds. That is simply not possible with any type of long-time-window academic research… and the truth is, no one wants to fund this type of research anyway. Why would they, when it only puts their marketing claims under scrutiny?

One more thought. One of the big issues with subjective feedback about sound performance is that people often mistake amplification efficacy (what we care about) for sound quality. HearAdvisor’s SoundScore is heavily weighted toward the hearing benefit one would receive in quiet and noisy environments while using the hearing aid (with a mild sloping to moderate hearing loss). So please do keep this in mind, and we’ll be sure to make a big point of it in our upcoming FAQs.

First, thanks for this website and forum.

We’ll see about the review website. If it’s useful, it should experience a growing flow of traffic.

1 Like

The plan is to eventually get our Expert Choice badge on product boxes so people know they can trust a hearing aid to at least provide adequate basic hearing aid functionality. Sadly, this is needed, as consumers are being swindled daily, buying poorly functioning devices through Amazon, Walmart, and Facebook ads… These badges are absolutely needed, and I have no doubt that you will soon see the badge in Best Buy and anywhere else you might purchase hearing aids. To be clear, this project was born to deal with the increasingly confusing marketplace spawned by the OTC rules. When it was just prescription aids, this was far less necessary.