Oticon launches Opn hearing aids and Velox platform

Regarding Oticon’s claim that it processes data 50 times faster than their previous platform: I know computer chips have made tremendous progress, but that really is an incredible leap. Any idea how they did it? They give a little information about their chip, but comparable information isn’t readily available for other hearing aids.

It may be that the previous versions (unspecified) did not try to fully exploit the processor power available to them. If that were the case, then a doubling or quintupling of processor speed might allow such a claim. The same logic applies to the processing algorithm employed: if the software is more efficient, it contributes to the 50x claim. I say this without an iota of real evidence, and I doubt any will be forthcoming.

The newer device has 11 cores vs. the 2 in the previous generation, 8 of which process sound. It’s the number of simultaneous operations done at 1.4 V that is the clever part.
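A toy way to picture several cores working on sound "simultaneously", purely as a sketch (the FFT band split, the worker count, the gains, and every function name below are my own assumptions for illustration, not anything from Oticon's actual firmware):

```python
# Toy sketch: split one audio block into bands and let a pool of worker
# processes apply a per-band gain in parallel, then sum the bands back together.
# Band count, gains and block size are made up purely for illustration.
import numpy as np
from multiprocessing import Pool

BLOCK = 128      # samples per processing block (arbitrary)
N_BANDS = 8      # stand-in for "8 cores processing sound"

def process_band(args):
    """Apply a simple per-band gain; a real aid would run compression etc."""
    band_signal, gain_db = args
    return band_signal * 10 ** (gain_db / 20)

def split_into_bands(block, n_bands):
    """Crude FFT-based band split, just to have something to parallelise."""
    spectrum = np.fft.rfft(block)
    edges = np.linspace(0, len(spectrum), n_bands + 1, dtype=int)
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        masked = np.zeros_like(spectrum)
        masked[lo:hi] = spectrum[lo:hi]
        bands.append(np.fft.irfft(masked, n=len(block)))
    return bands

if __name__ == "__main__":
    block = np.random.randn(BLOCK)          # pretend microphone input
    gains_db = [6.0] * N_BANDS              # flat 6 dB gain, made up
    bands = split_into_bands(block, N_BANDS)
    with Pool(N_BANDS) as pool:             # one worker per "core"
        processed = pool.map(process_band, zip(bands, gains_db))
    output = np.sum(processed, axis=0)      # recombine the bands
    print(output.shape)                     # (128,)
```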

I’m not sure of the actual specs of the die, but it’s claimed that Oticon have basically designed the chip with the end product in mind from the whiteboard up. That means they aren’t making any generic compromises or using standardised layouts, so I’m guessing the performance leap is partly due to improvements in the underlying process technology and partly due to the engineering freedom to design a bespoke chip for the purpose.

Thanks! Interesting that they used an odd number of cores. From the research I’ve done, I also find it interesting that they use a larger process than most modern CPUs (65 nm instead of the 14-16 nm used in newer smartphones and computers). So it sounds like a lot of the increase is because they weren’t really pushing processing power before, but now that they are using a new algorithm that can utilize (and needs) more processing power, they’ve stepped up their game.

Figuring out anything about the CPUs other manufacturers use is like reading tea leaves. I assume Signia uses a 6-core CPU, but that’s an assumption based on their Rexton product going from QuadCore to TruCore (6 cores).

I wonder if hearing aid hardware won’t eventually commoditize like it has in computers and smartphones, where everyone uses standardized CPUs and product differentiation comes down to software and other features. I’m also wondering, since hearing aids and smartphones seem to be developing a symbiotic relationship, whether some of the CPU load (and hence battery demand) couldn’t be shifted to the smartphone. Just rambling. It will be interesting to see where the industry goes.

Just guessing, but: 1 core for IoT etc., 2 cores for the main CPU, and 8 for DSP.

65 nm should be way less costly to manufacture, but it would have greater energy requirements.

According to the Oticon white paper on the Velox platform built for the OPN, there are 7 cores used for sound processing, 2 cores for wireless processing, and 2 cores for the MCU. There are 3 ICs: one for DSP (9 cores), one for the TwinLink radios, and one front end.
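For what it’s worth, that breakdown also reconciles with the 11-core total mentioned upthread (albeit with 7 sound cores rather than 8):

```python
# Core count from the Velox white paper figures quoted above.
cores = {"sound processing": 7, "wireless": 2, "MCU": 2}
print(sum(cores.values()))   # 11 -> matches the 11-core total mentioned upthread
```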

The 50x performance increase claim is relative specifically to their Inium Sense platform, which powers the Alta2, Nera2, and Ria2 lines. They said that the Velox platform operates at close to 500 MIPS and 1,200 MOPS, which is 50x more programmability compared to Inium Sense. I read that the Signia Primax (currently their top of the line) can do 250 MIPS (compared to 500 MIPS on the OPN). The Signia Primax chip has 20 million transistors; the Oticon OPN has 64 million.

They say a typical hearing aid takes 0.02 MIPS to sample the input at 20 kHz within one channel with one gain applied, just to give an idea of the amount of processing power this requires. They say that the OpenSound Navigator (the DSP algorithm used on the OPN) uses about 3 MIPS at all times.
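Running the quoted numbers gives a feel for those figures: 500 MIPS at a 50x ratio implies an Inium Sense baseline of roughly 10 MIPS, the transistor counts differ by about 3x, and at the 20 kHz rate the MIPS figures translate into a per-sample instruction budget. This is just back-of-the-envelope arithmetic on the numbers quoted above, not additional published specs:

```python
# Back-of-the-envelope arithmetic on the figures quoted above.
velox_mips = 500                      # quoted for the Velox platform
claimed_ratio = 50                    # the "50x" claim vs. Inium Sense
primax_mips = 250                     # quoted for the Signia Primax
velox_transistors = 64e6              # Oticon OPN
primax_transistors = 20e6             # Signia Primax
sample_rate = 20_000                  # 20 kHz, as quoted

print(velox_mips / claimed_ratio)               # 10.0 MIPS implied for Inium Sense
print(velox_mips / primax_mips)                 # 2.0x the quoted Primax MIPS
print(velox_transistors / primax_transistors)   # 3.2x the Primax transistor count

def instructions_per_sample(mips):
    """Convert a MIPS figure into instructions available per 20 kHz sample."""
    return mips * 1e6 / sample_rate

print(instructions_per_sample(0.02))  # 1.0     -> the one-channel, one-gain example
print(instructions_per_sample(3))     # 150.0   -> OpenSound Navigator's steady load
print(instructions_per_sample(500))   # 25000.0 -> the full ~500 MIPS budget
```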

Here they are (shitty focus on my camera :P). They sound better than power domes so far; let’s see.

Razer Mamba?

^deathadder elite :stuck_out_tongue: