To give some background, the Widex Moment app has a function where users can create a custom program using A-to-B comparisons. That is, the app listens to the sound of the environment the user is in and then devises two sets (A and B) of HA settings for the user to trial. The user selects whichever of A or B gives better hearing.
Based on the user’s selection, the app then devises another two sets of HA settings for the user to trial, and again the user selects which set works better for them.
This procedure repeats itself until the user indicates they are happy with their selection. The user can then save this selection of settings as a custom program.
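The loop described above can be sketched in code. This is purely a hypothetical illustration of the A-to-B comparison procedure: the parameter names, value ranges, and the way the app actually generates candidate settings are all assumptions, not Widex's real algorithm.

```python
import random

# Hypothetical HA parameters and ranges -- placeholders, not Widex's real ones.
PARAMS = {"bass": (-4, 4), "mid": (-4, 4), "treble": (-4, 4)}

def random_settings():
    """Generate one candidate set of HA settings (stand-in for the app's logic)."""
    return {p: random.randint(lo, hi) for p, (lo, hi) in PARAMS.items()}

def run_comparison_loop(prefer, max_rounds=10):
    """Repeat A-vs-B trials until max_rounds, keeping the preferred settings.

    prefer(a, b) stands in for the user's choice: it returns "A" or "B".
    In the real app the loop ends when the user says they are happy; here a
    fixed round count plays that role.
    """
    best = random_settings()            # settings "A" for the first trial
    for _ in range(max_rounds):
        candidate = random_settings()   # settings "B" for this trial
        if prefer(best, candidate) == "B":
            best = candidate            # the user's pick carries forward
    return best                         # saved as the "custom program"

# Example: a user who always prefers more bass.
program = run_comparison_loop(lambda a, b: "B" if b["bass"] > a["bass"] else "A")
```

Each round pits the current winner against a fresh candidate, so the saved program is simply the settings that survived every comparison the user made.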
Widex labels this functionality as “hearing aid AI”. I don’t think the smarts are in the HA itself; my guess is that the app sends a sample of the sound environment over the Internet to a central computer at Widex HQ, which compares it to previous samples and then sends back a set of A and B settings to the app. (I will do a trial with my iPhone disconnected from the Internet to see what happens.)