AirPods Pro 2 as hearing device

OK, yeah, I can see that if your AirPods can hardly fit into your ear opening in the first place, then this approach wouldn’t work out well for you, as you said. Good thing you have a flat loss in your left ear; I can see how that would make the AirPods’ Transparency mode work better for you. If I were you, there would be no need to attempt to wear both at the same time either.

But hopefully this suggestion can help out someone who has a much heavier loss like myself and who doesn’t have an issue fitting their AirPods into their ear openings. Thanks for pointing out that other caveat about fit that I hadn’t thought about.

1 Like

I’m finding that approach works OK for me (with my RIC HAs and the Widex open round tips) if I don’t need ANC. You don’t get a completely full seal, according to the AirPods fit test, and bass leakage/attenuation is perceptible when comparing A-B. They are not quite as comfortable as they are without the RIC wires, nor nearly as comfortable as the HAs alone. Streaming sound and stability are much better, though.

I imagine Transparency and ANC aren’t intended to be combined with HAs, though I don’t see any big issues if one forgets to turn those features off. You could mute the HA mics, but at that point it would just be an alternative to putting them in a case or leaving them behind.

Good point that you may not get a completely full seal anymore with the AirPods when wearing both. However, in my case I’m still able to get a Good Seal result on the AirPods fit test. So I think this will vary with each person’s individual fit. I think I’m lucky that my ear canals are big enough to seat the hearing aids’ receivers deep enough into the canal that there’s still room for the AirPods’ tips to make a good seal at the opening of my ear canals.

And if you no longer get as good a seal as you did without the hearing aids, there’s also a higher chance that it won’t be as comfortable for you anymore with both on. It still feels comfortable to me, though, but then I haven’t worn both for an hour or more yet; I only just thought of this and haven’t had a particular need to wear both at the same time for a long stretch.

You’re right that wearing the aids together with the pods removes the need to put the pods in Transparency mode, because the aids are already giving you that. You may also detect a short echo if their latencies are different enough. Even if the latencies match, the double amplification may make the ambient sound louder than you want. And even if you don’t mind that, you can probably save some AirPods battery by turning Transparency mode off.

As for the ANC, you’re right that there’s no logic in using ANC while wearing the pods with the aids, because the whole point of wearing the aids is to hear ambient sounds clearly. But if you want the convenience of instant quietness for a moment (or even for a longer period) without having to remove the aids, muting the aids’ mics should do the trick, because there should already be enough occlusion with both of them in. And if that’s still not quiet enough for your taste (say, sitting in a very noisy airplane), turning on the pods’ ANC to double up on the quietness may help, too.

The aids’ wires causing a small leak in the pods’ seal is not necessarily bad: it lets in some natural ambient sound for the ear to hear (where the loss is not bad) and reduces occlusion. The integrity and bass of the streaming sound may be affected a little by the leak, but it’s still a good compromise to accept, and you can always adjust the tonal balance of the streamed sound to partially compensate.

I’m not sure how streaming sound and stability are improved with this approach, or what stability refers to here. Would you care to elaborate? Thanks!

In this instance, the APP2 ear tip is in contact with the RIC wire strain relief. If it were just the wire, it would be a bit more comfortable.

The APP2 latency for Adaptive Transparency seems pretty small, as I didn’t notice echoing in combination with PureSound. That said, I didn’t critically evaluate that aspect.

As for stability, I get some alternating left/right stuttering in MFi media streaming when I’m walking with my iPhone 12 Pro in my pocket.

I have the stock Widex carrying case, but I would like a smaller one that would better fit in my pocket.

It’s SO frustrating that Apple doesn’t let you use audiogram-based Headphone Accommodations with an iPad. I have trouble believing this is anything other than a choice they are making not to allocate the resources to program it.

2 Likes

If one looks at this through the lens of Apple’s ecosystem and accommodations philosophy, it really doesn’t make sense, especially considering that it shouldn’t take many resources for them to bring iOS features to iPadOS.

It’s doubly frustrating that Audiogram for Transparency has to be turned back on, in Settings or the secondary menu of Control Center, after one has connected the APP2 to a PC or iPad.

Perhaps there’s a dogma at play with respect to Health data privacy and the Audiogram data. The iPhone has a place for that (Health app, data stored locally only); the iPad does not.

1 Like

I know that my version of the iPad doesn’t support the Health app. I thought somebody on this forum might have said that newer iPads do support the Health app, but I may be remembering that wrong, or the person who said it may be wrong.

A quick Google seems to confirm that the Apple Health app is not supported on the iPad; it’s only supported on the iPhone, iPod Touch, or Apple Watch. I guess Apple deems those more portable devices appropriate for tracking personal health data, whereas the iPad isn’t portable enough to serve as a health-tracking device.

I think Apple’s take that hearing loss is a health issue is not entirely correct. Hearing loss is kind of like a vision issue: you use eyeglasses or contact lenses to correct your vision, just as you use hearing aids to correct hearing loss (to some degree). It’s usually a static loss in the short term, even though it may progress in the long term. So there’s really no need to track this kind of thing in the Health app, because it’s not something that fluctuates on a day-to-day or week-to-week basis anyway. Making the audiogram depend on the Health app is a misplaced and unnecessary limitation.

1 Like

Your comment about hearing loss got my attention.

For me, the ability to hear is a life-and-death need at work and even when walking my dog. I use my hearing aids with my iPhone to transmit and receive phone calls. I work in construction.

I believe hearing loss is a health issue.

I will not argue with you there, because defining it as a health issue is a matter of semantics that everybody interprets differently depending on the context of the discussion. So I totally respect that, to you, it’s a health issue.

I was ONLY talking about it in the context of tracking hearing loss as a health parameter in the Apple Health app. Almost everything else the Health app tracks, like your heart rate, blood pressure, exercise activity, etc., can vary from day to day depending on the situation, so those varying parameters are worth tracking in an app such as the Health app, where you can observe any trend worth observing.

But hearing loss does not change that quickly; maybe over the years, but surely not daily or even weekly. So it does not need to be in such an app to be observed regularly for trend monitoring. Sure, if you go by semantics, it’s a health issue that belongs inside that app. But if going by that semantics means putting the audiogram inside that app and creating a silly limitation, where any other device that doesn’t support that app is unable to extract the audiogram to help adjust the AirPods’ amplification, then it’s really silly.

I would rather not have my audiogram narrowly pigeonholed into such an app simply for the sake of semantics, at the expense of not being able to use my AirPods with audiogram accommodation on the iPad, or on non-iOS devices altogether.

1 Like

I appreciate your response.

My needs are simple. I need to be able to hear better and understand speech better, especially in noisy environments.

I get listening fatigue mid way through the afternoon.

I wish the process didn’t take so long. The setup, I mean.

I agree that hearing and vision impairments are primarily health/safety issues, and that it’s on-topic to track them in the iOS Health app (though vision isn’t tracked there). Frequency of variance isn’t really a gating criterion for inclusion in the app.

I can’t think of a good reason to exclude Health from iPadOS, as that’s also a good platform to look at the metrics, even if it doesn’t access all of the necessary Apple sensors to record the data. Also, much of that data can be hand-entered.

By the way, I looked at CPU usage on my iPhone 12 Pro (using CPU-x) while running the Music app, with and without Headphone Accommodations for Media. I didn’t perceive any difference there in utilization, so I suspect this isn’t an issue of battery/CPU load for the Apple Watch.

So far, my perception of audiogram-accommodated Transparency mode and Media is similar; I tested the same song from the Music app in iOS and from the Apple Music app for Windows (there’s a preview available; it streams 256 kbps AAC) with JBL 305p MkII monitors in a near-field listening position, and the perceived audibility differences with and without Accommodations were pretty close. Are the Media Accommodations being applied by the source or the sink?

That said, my Widex Moments in Music mode sound better than APP2 Transparency, to a point; the Moment BTE mics are shadowed by my pinnae, so near-field treble response is more muted than in APP2 Transparency.

Another notable thing is that iOS uses 3D scans of the user’s pinnae in its spatial audio algorithms, and the APP2 has pinna-located mics, while the Moment “digital pinna” algorithm seems to be user-invariant and the RIC version doesn’t have the pinna-located mic advantage. +1 for APP2 Transparency with respect to sound localization.

One thing to note: Transparency Accommodation often has to be manually toggled on/off/on to actually be applied after switching back from Media streaming from the iPhone or another device. It’s a nuisance, and may also lead users to perceive that Transparency Accommodation doesn’t work.

If putting the audiogram in the Health app didn’t prevent it from being accessible via the iPad (and, for that matter, non-iOS devices), then I couldn’t care less that it’s kept in the Health app. Personally, I think it should be uploaded to the AirPods themselves so that it’s accessible whenever the AirPods are used with ANY Bluetooth device, iOS or non-iOS.

I guess the reason Apple gives for not putting the Health app on the iPad is that it’s not as portable as the iPhone, Apple Watch, or iPod Touch. But WHY does the Health app have to be available exclusively on portable devices? That doesn’t make much sense to me.
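
In the meantime, a rough source-side correction is technically possible on any platform. The Swift sketch below is purely illustrative and NOT Apple’s Headphone Accommodations processing: it uses made-up audiogram numbers and a simplistic half-gain heuristic, just to show that a playback app could turn audiogram thresholds into per-band EQ boosts before the audio ever reaches the Bluetooth link, whatever the earbuds are.

```swift
import AVFoundation

// Made-up example audiogram: test frequency in Hz, threshold in dB HL.
let audiogram: [(hz: Float, dbHL: Float)] = [
    (250, 20), (500, 30), (1000, 40), (2000, 55), (4000, 65), (8000, 70)
]

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: audiogram.count)

// Crude half-gain heuristic: boost each band by half its threshold, capped at 20 dB.
for (band, point) in zip(eq.bands, audiogram) {
    band.filterType = .parametric
    band.frequency = point.hz
    band.bandwidth = 1.0              // roughly one octave wide
    band.gain = min(point.dbHL / 2, 20)
    band.bypass = false
}

engine.attach(player)
engine.attach(eq)

// Route the player through the EQ and out to whatever output is connected,
// AirPods or any other Bluetooth sink.
let format = engine.outputNode.outputFormat(forBus: 0)
engine.connect(player, to: eq, format: format)
engine.connect(eq, to: engine.mainMixerNode, format: format)

// To hear it: schedule an audio file on `player`, then
// try engine.start(); player.play()
```

Real accommodation obviously does a lot more than a static EQ, but the point is that the correction doesn’t have to live exclusively inside one OS’s Health app to be useful.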

I envy you for having only a mild-to-moderate hearing loss. Apparently the AirPods work much better for folks with mild-to-moderate hearing loss like yourself than for folks with moderate-to-severe or profound loss like myself, specifically when it comes to Transparency mode.

I think you’re the first person to share a comparison between the Transparency and streaming performance based on personal experience. The three AirPods reviews I watched from three different audis who did REM on the Transparency performance (and found it not quite up to target compared to hearing aids, even for a mild-loss patient) never offered any insight as to whether they think the AirPods perform to target when streaming.

I personally find the Transparency mode performance way underwhelming for my kind of heavy hearing loss, while the streaming performance for me is barely adequate but still enjoyable, because I do get (barely) enough highs, plus plenty of good bass to make up for it.

Apple has just announced that the Health app will be available in the next iPadOS. The Health app allows us to upload our audiograms to set up the AirPods Pro.

4 Likes

Yes, I saw, too, that iPadOS 17, soon to be released, will support the Health app needed for loading audiograms for Bluetooth streaming to AirPods and Beats. That’s great news!

2 Likes

Owners of iPads that will be supported in iPadOS 17 can now install a beta version of the OS and take immediate advantage of features that will be available when iPadOS 17 is officially released, probably this fall.

As part of the terms of service for a beta user, I’ve agreed to not post details about the beta OS publicly, so I’ll just generally say that it’s stable as beta software goes and that I’m not sorry I installed it… if you catch my drift in the context of this thread.

To install it, do the following:

  1. Back up your iPad with iPadOS 16 to iCloud or to a local computer. For those with Windows PCs, you use iTunes for Windows to do this, so if you don’t have iTunes for Windows, you’d have to download it from the Microsoft Store (it’s free), install it on a PC, and link it to your Apple account first. You can back up an iPad using Finder on macOS Catalina or newer. Backing up the iPad is a vitally important step that should not be skipped, because if you don’t like the beta, you can’t roll your iPad back to non-beta iPadOS 16 without either doing a clean install (which would wipe out all the data and apps on your iPad) or using an iCloud or local computer backup made from your iPad while it was still on iPadOS 16. This OS has generally been considered stable, so wanting to roll back may be unlikely, but it is always possible that a beta OS will not work with a particular app you really need.

  2. If you’re not registered for beta software with your iPad, use your iPad to open a browser and go here to sign up: Apple Beta. You can’t see or install the beta OS on your iPad without first registering with Apple and agreeing to specific terms of service related to use of beta software.

  3. After registering for the beta OS, on the iPad go to Settings > General > Software Update and choose the beta version of iPadOS 17.

  4. Download and install it. On my iPad at the time I did it, this took about 20 minutes.

  5. If you have already entered your audiogram into the Health app on your iPhone, and you have enabled iCloud backup of Health app data from the iPhone, you can sync your iPhone audiogram settings to the iPad. If you don’t already allow iCloud backup of Health app data but are willing to, on the iPhone go to Settings > [tap your Apple ID at the top] > iCloud > Show All > Health > change “Off” to “On” and give it some time to sync up to iCloud. Once your audiogram is in iCloud, when you open the Health app on your iPad, it will ask for your consent to sync iPad Health app data with iCloud. Agree, and shortly thereafter your audiogram will automatically be loaded into the iPad Health app and be usable from there (the code sketch after these steps shows what that stored audiogram looks like to an app). If you don’t want your Health data in iCloud, you can always manually enter your audiogram into the iPad Health app under Health Categories > Hearing.

  6. You still have to enable the first use of the audiogram on the iPad with your compatible AirPods or Beats buds or headphones. First power up the AirPods or Beats and put them in or on your ears; you can’t do this unless you’re wearing them while they’re powered up. Then, the same way you enable an iPhone to use your audiogram with compatible AirPods or Beats, go on the iPadOS 17 device to Settings > Accessibility > Audio & Visual > Headphone Accommodations, which needs to be set to On. Then click Custom Audio Setup at Settings > Accessibility > Audio & Visual; if you don’t see “Custom Audio Setup” there, go to the top of the left-hand column of Settings and enter “Custom Audio Setup” in the Search window to reach the settings you need to change. Select your current audiogram, and click Use This Audiogram > Custom > Use Audiogram > [customize any settings you want there] > Done. Now anytime you put the AirPods or Beats in or on your ears and play audio from the iPad, it will automatically customize the playback to your audiogram.
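
For anyone curious about what the audiogram actually is once it’s in the Health app: it’s ordinary HealthKit data that any app can ask permission to read on a device that has the Health stack. The Swift sketch below is only an illustration; it assumes an app that already has the HealthKit entitlement and usage description configured, and it is not how Apple’s own Headphone Accommodations feature reads the data internally. It simply prints the per-frequency thresholds of the most recent audiogram.

```swift
import HealthKit

let healthStore = HKHealthStore()
let audiogramType = HKObjectType.audiogramSampleType()

// Ask to read audiograms, then print the most recent one.
func printLatestAudiogram() {
    healthStore.requestAuthorization(toShare: nil, read: [audiogramType]) { granted, error in
        guard granted, error == nil else {
            print("Health access not granted: \(String(describing: error))")
            return
        }

        let newestFirst = NSSortDescriptor(key: HKSampleSortIdentifierStartDate, ascending: false)
        let query = HKSampleQuery(sampleType: audiogramType,
                                  predicate: nil,
                                  limit: 1,
                                  sortDescriptors: [newestFirst]) { _, samples, _ in
            guard let audiogram = samples?.first as? HKAudiogramSample else {
                print("No audiogram found in Health data.")
                return
            }
            // Each sensitivity point is a test frequency with left/right thresholds in dB HL.
            for point in audiogram.sensitivityPoints {
                let hz = point.frequency.doubleValue(for: .hertz())
                let left = point.leftEarSensitivity?.doubleValue(for: .decibelHearingLevel())
                let right = point.rightEarSensitivity?.doubleValue(for: .decibelHearingLevel())
                print("\(Int(hz)) Hz  L: \(left ?? .nan) dB HL  R: \(right ?? .nan) dB HL")
            }
        }
        healthStore.execute(query)
    }
}
```

You’d call printLatestAudiogram() from somewhere in your app (a button action, say). The takeaway is just that the audiogram is regular, queryable Health data, so once iPadOS gets the Health app, the same data should be available there as well.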

Here’s more public detail about features included in iPadOS 17 Beta: iPadOS 17 Beta Is Here and It's Already Making Things Better - CNET

3 Likes

Did Airfoil run on Win11 without crashing?

I’m on Windows 11 and I run Airfoil with my iPhone 12 Pro Max just fine, without crashing. I just need to add a -3 second delay to the VLC video to slow the video down by 3 seconds so it syncs up with the 3-second audio delay going through the iPhone and then the AirPods. Same with YouTube videos: I had to add a -3 second delay in the Audio/Video Sync add-on extension to sync the video to the delayed audio.

Do note that they don’t sell Windows licenses for Airfoil anymore. But if you have an old license from before they quit supporting it, that should work OK.

1 Like

I’ve tested Airfoil for Windows on two Windows 11 machines and it won’t run due to an issue with the .NET framework. It works fine on Windows 10.

Which release of Windows 11 are you using? Perhaps it worked on earlier releases of Windows 11 but not the latest?

Did you have to change any configuration, or did it just work?

I had to go to Windows Security → Firewall & network protection, click on Allow an app through firewall, and then select the Airfoil app (see screenshot below).

Below is my Windows 11 version

[screenshot: Windows 11 version information]

Once I had it up and running, I sometimes have to select System Audio instead of the VLC app in order for the audio to come through. I also had to go to Control Panel → Sounds, open the Properties of the default Speakers device, and turn off the Enable audio enhancements option in order to get high enough audio quality through Airfoil. That Enable audio enhancements option is meant to make the laptop’s tiny speakers sound more bassy, but it messes up the integrity of the audio going through Airfoil.

1 Like