Hearing aids help the hard of hearing connect with the world around them. We look at what technology is on the horizon to improve these connections, and even enhance sound for those with normal hearing.
Since 1898, hearing aids have transformed the silent world of the deaf and hard of hearing, improving communication and quality of life for millions of people. In recent years, they have metamorphosed from large, clunky beige devices into slimline, discreet high-tech aids hidden in the ear canal.
But what of the future? Are hearing aids poised for an artificial intelligence (AI) revolution, and will newer microchips responsible for sound processing enable hearing at normal or super-human levels? Here, three experts discuss some of the latest innovations and what is coming.
One of them is Dr Jorge Mejia. For 20 years, he has worked as a researcher at Australia’s National Acoustic Laboratories (NAL), developing algorithms for hearing aids and helping people navigate the complexity of hearing loss.
“Whatever we do, it needs to get to the clinicians and their clients, and change something fundamental in a positive way about how they live their lives,” he says. “The fact we have identified ways to improve people’s ability to communicate and make life easier, that is the reason we wake up and every day come to the office and do what we do. Feedback from our clients is meaningful and important for us to continue to be motivated to achieve the things we need to.”
An electrical and computer engineer with a PhD in medical engineering science, Dr Mejia began his career working in satellite development and robotics. He then joined NAL, the research division of Hearing Australia based at the Australian Hearing Hub at Macquarie University, where he is head of the signal processing department.
“In the early days one of the problems was feedback cancellation (whistling) which remains a challenge today, but NAL gave me that challenge to look into and that’s how I became involved in hearing health,” he recalls. “That was in the early 2000s and since then I’ve been working at different stages of research from ideation to product development and commercialisation.
“One of the major initiatives in NAL in recent years has been the drive to enhance translational research and commercialisation.”
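To make the feedback problem concrete: whistling occurs when amplified sound leaks from the receiver back into the microphone and is re-amplified in a loop. A common remedy is an adaptive filter that learns the leak path and subtracts its estimate from the microphone signal. Below is a minimal sketch using a normalised LMS (NLMS) filter; it illustrates the general technique, not NAL’s algorithm, and the simulated feedback path and gain are assumptions.

```python
import numpy as np

# Minimal sketch of acoustic feedback cancellation with a normalised LMS
# (NLMS) adaptive filter. Illustrative only: real hearing aids also add
# decorrelation (e.g. slight frequency shifting), because the input is
# correlated with the aid's own output, which biases the estimate.

rng = np.random.default_rng(0)
fs = 16_000                      # sample rate (Hz)
n = fs * 2                       # two seconds of audio
speech = rng.standard_normal(n)  # stand-in for the incoming signal

true_path = np.array([0.0, 0.02, 0.05, 0.03, 0.01])  # assumed leak: receiver -> mic
gain = 4.0                       # forward gain of the aid
taps = len(true_path)

w = np.zeros(taps)               # adaptive estimate of the feedback path
out_buf = np.zeros(taps)         # recent receiver (loudspeaker) samples
mu, eps = 0.1, 1e-8              # NLMS step size and regulariser

for i in range(n):
    leak = true_path @ out_buf          # sound leaking back into the mic
    mic = speech[i] + leak              # mic picks up signal plus leak
    est = w @ out_buf                   # adaptive filter's leak estimate
    err = mic - est                     # feedback-cancelled signal
    # NLMS update: nudge w toward the true feedback path
    w += mu * err * out_buf / (out_buf @ out_buf + eps)
    out = gain * err                    # amplified output to the receiver
    out_buf = np.roll(out_buf, 1)
    out_buf[0] = out

print("estimated path:", np.round(w, 3))
print("true path:     ", true_path)
```

After convergence the filter’s estimate tracks the true leak path, so the cancelled signal approaches the clean input and the whistle never builds up.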
AI-driven technologies
The organisation is invested in exploring the integration of AI applications in hearing health, with research focused on leveraging AI to enhance diagnostics, treatment and rehabilitation for people with hearing impairment.
“We believe in harnessing the power of AI to revolutionise hearing healthcare, making it more accessible, personalised, and effective. The aim is to develop cutting-edge AI tools and technologies that can assist audiologists in providing better care and improving outcomes for patients,” Dr Mejia explains.
He says the industry as a whole places significant emphasis on AI, which has already been used in different shapes and forms.
“There are many stages where patients interact with clinicians and with technology, so most of the big manufacturers today have incorporated some form of AI into hearing aids to tap into this,” he says.
In the future, he predicts hearing aids will be characterised by continuous innovation and integration of AI-driven technologies.
“We envision hearing aids that are smaller, more discreet, and more powerful, with AI algorithms that can dynamically adapt to the user’s unique listening preferences and auditory needs,” he says.
Industry experts say AI and machine learning (ML) will also enable more personalisation and situational awareness in hearing aids, requiring less manual adjustment from the wearer. For example, AI will be able to track the direction a wearer turns their head towards a sound and maximise what they hear from that direction.
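As a toy illustration of how a head-direction estimate could steer what the wearer hears, the sketch below implements a two-microphone delay-and-sum beamformer aimed at a chosen azimuth. The microphone spacing and geometry are assumptions; commercial aids use far more sophisticated adaptive beamformers.

```python
import numpy as np

# Illustrative sketch: a two-microphone delay-and-sum beamformer that a
# head-direction estimate could steer. Geometry and numbers are assumed.

fs = 16_000          # sample rate (Hz)
d = 0.01             # assumed front/back mic spacing on one aid (m)
c = 343.0            # speed of sound (m/s)

def steer_delay(azimuth_deg: float) -> float:
    """Inter-mic delay (seconds) for a source at the given azimuth,
    where 0 degrees is straight ahead of the (turned) head."""
    return d * np.cos(np.radians(azimuth_deg)) / c

def delay_and_sum(front: np.ndarray, back: np.ndarray,
                  azimuth_deg: float) -> np.ndarray:
    """Align the back mic to the front mic for the steered direction,
    then average: sound from that direction adds coherently, sound
    from elsewhere does not."""
    delay_s = steer_delay(azimuth_deg)
    n = len(front)
    freqs = np.fft.rfftfreq(n, 1 / fs)   # bin frequencies in Hz
    # fractional-sample delay applied in the frequency domain
    back_aligned = np.fft.irfft(np.fft.rfft(back) *
                                np.exp(-2j * np.pi * freqs * delay_s), n)
    return 0.5 * (front + back_aligned)

# e.g. the wearer turns toward a talker 30 degrees to the right:
rng = np.random.default_rng(1)
front, back = rng.standard_normal(1600), rng.standard_normal(1600)
focused = delay_and_sum(front, back, azimuth_deg=30.0)
```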
Dr Mejia says audiologists can best prepare for this future by staying informed of technological advancements, participating in professional development activities and collaborating with industry partners to ensure they can effectively incorporate innovations into clinical practice.
The AI-powered hearing aids which already exist have advanced features including ML-based sound processing, automatic environmental adaptation, and wireless connectivity, he explains.
Some incorporate sensors for health monitoring and ML for adaptive sound processing while others have AI-driven features like binaural directionality for spatial awareness and environmental optimisers for real-time sound optimisation.
Another AI-specific hearing aid launched in 2024 is the “biggest leap forward ever in hearing technology,” according to its manufacturer, with AI analysing and automatically optimising sound more than 80 million times an hour. A processor designed to work like the human brain leverages the neuroscience of the ear-brain connection and information processing to create better sound quality, the manufacturer says. The device also incorporates health and wellness features, and is claimed to be the first to integrate sensors, track and encourage social engagement, and count steps.
Another company says it has been using ML in its hearing aid technology for more than two decades. Its latest AI application uses an ML model trained on thousands of real-world sound recordings tagged to indicate different environments. These hearing aids adapt automatically, selecting the best settings for every listening situation and zooming in on a single voice in a noisy environment.
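The underlying pattern is classify-then-adapt: summarise a short frame of audio into features, classify the listening environment, and map the class to a settings preset. The sketch below shows that pattern with toy features, invented class names and random stand-in weights; a production model is trained on the large tagged corpora described above.

```python
import numpy as np

# Minimal sketch of the classify-then-adapt pattern. Features, classes
# and weights are assumptions for illustration only.

CLASSES = ["quiet", "speech", "speech_in_noise", "music"]
PRESETS = {
    "quiet":           {"gain_db": 10, "noise_reduction": 0.0, "directionality": "omni"},
    "speech":          {"gain_db": 15, "noise_reduction": 0.2, "directionality": "omni"},
    "speech_in_noise": {"gain_db": 15, "noise_reduction": 0.8, "directionality": "beam"},
    "music":           {"gain_db": 12, "noise_reduction": 0.1, "directionality": "omni"},
}

def features(frame: np.ndarray) -> np.ndarray:
    """Three toy features: level, spectral centroid, spectral flatness."""
    spec = np.abs(np.fft.rfft(frame)) + 1e-12
    level = 20 * np.log10(np.sqrt(np.mean(frame ** 2)) + 1e-12)
    centroid = np.sum(np.arange(len(spec)) * spec) / np.sum(spec)
    flatness = np.exp(np.mean(np.log(spec))) / np.mean(spec)
    return np.array([level, centroid, flatness])

rng = np.random.default_rng(2)
W = rng.standard_normal((len(CLASSES), 3))   # stand-in for trained weights
b = np.zeros(len(CLASSES))

def classify(frame: np.ndarray) -> str:
    """Linear classifier here; in practice a trained neural network."""
    scores = W @ features(frame) + b
    return CLASSES[int(np.argmax(scores))]

frame = rng.standard_normal(1024)            # one 64 ms frame at 16 kHz
env = classify(frame)
print(env, "->", PRESETS[env])               # the aid applies this preset
```

Running this once per second, as described below, is what lets the aid switch presets behind the scenes instead of asking the wearer to do it.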
AI advantages
The first advantage of incorporating AI in hearing aids, Dr Mejia says, is that AI algorithms can adaptively adjust to various listening environments, optimising sound processing and improving speech understanding in noisy situations.
“Secondly, AI-powered features such as sound classification and noise reduction enhance user experience by reducing cognitive load and improving comfort,” he adds. “Additionally, AI enables advanced connectivity options, allowing for seamless integration with other smart devices such as phones, laptops and telehealth platforms. AI in hearing aids can improve clarity of speech and assist users in self-managing the technology.
“Thirdly, there are (phone) apps you can query as to whether there’s a problem with your hearing aid or listening situation, and the app will tell you ways to fix the issue.”
Manufacturers are leveraging large language models to do this but, while these tools are out there, people cannot always take advantage of the technology. NAL is developing tools and solutions to support this ecosystem, from diagnostics to treatments.
“While AI is increasingly becoming a standard feature in hearing aids, we don’t foresee it ‘taking over’ entirely,” Dr Mejia adds. “Instead, AI technologies will likely be integrated into a range of hearing aid models at different price points and for different user needs.
“While premium models may offer more sophisticated AI capabilities, basic functionalities will still be available in more affordable options. Ultimately, the goal is to ensure that all patients have access to AI-enhanced hearing solutions that deliver optimal performance and improve their quality of life.”
Dr Mejia points out that the ecosystem is not only about technology but about people and their social interactions with others. People come forward because of a hearing problem which might translate to, “I need a better job, I need to talk to my wife and understand her; she complains that sometimes I don’t get why she’s so angry. I want to hear my children or grandkids. Sometimes I don’t know what they’re talking about.”
“There are many things we need to understand and we’re at the early stages of tapping into the power of AI to do that.
“AI can play a significant role when you have a conversation with someone with a hearing impairment. That person, for example, might not hear the spoken emotions you express in your voice. AI can analyse this and the context. It could potentially, and this is not far in the future, give you an additional cue to help you realise that this communication is getting a bit heated, you need to alter the way you communicate because it’s going in a certain direction.”
Personalisation of technology
Hearing aid processors are also being trained to personalise the environment to the way the wearer wants to hear the sounds around them. Whether in a conversation, in a restaurant, in a quiet room or watching TV, wearers previously adjusted settings on their device manually. Now AI in hearing aids does this automatically: ML methodologies analyse the environment every second, work out where the wearer is and adjust the aids behind the scenes.
Instead of the patient focusing on adjusting settings, they can focus on the conversation or what is in front of them, freeing up more cognition and concentration.
“AI is taking away the complications and streamlining the process – that’s what we call personalisation of technology,” Dr Mejia explains.
To achieve this, hearing aids are linked to apps which, apart from remote hearing aid adjustment, can provide real-time monitoring of hearing health metrics and access to personalised rehabilitation exercises. Some aids and hearables also track health data, much like an Apple Watch.
Advanced chip technology
Hearing aid chips are also becoming more complex as AI architecture is embedded into them, and Dr Mejia expects this complexity to increase exponentially. While these tiny processors are remarkable in what they can do, AI often requires far more processing power than they can currently supply.
“We are hitting some limitations in what we can do with hearing aid processors, what we can put into these microchips,” he says.
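To see why, consider a rough back-of-envelope budget. A hearing aid must process sound within a few milliseconds on a power budget of roughly a milliwatt, which caps how large an on-chip network can be. The sketch below runs the arithmetic; every number in it is an illustrative assumption, not the specification of any real device.

```python
# Back-of-envelope sketch of why on-chip AI is hard in a hearing aid.
# All numbers below are illustrative assumptions, not device specs.

sample_rate = 24_000        # Hz
frame = 48                  # samples per processing block
latency_ms = frame / sample_rate * 1e3          # 2 ms per block
blocks_per_s = sample_rate / frame              # 500 blocks every second

dnn_macs_per_block = 50_000     # assumed size of a tiny on-chip network
macs_per_s = dnn_macs_per_block * blocks_per_s  # 25 million MACs/s

energy_per_mac_j = 1e-12        # ~1 pJ/MAC, optimistic low-power silicon
dnn_power_w = macs_per_s * energy_per_mac_j     # ~25 microwatts

print(f"block latency: {latency_ms:.1f} ms")
print(f"DNN compute:   {macs_per_s / 1e6:.0f} M MAC/s")
print(f"DNN power:     {dnn_power_w * 1e6:.0f} uW of a ~1 mW total budget")
```

Even under these optimistic assumptions, the network must stay tiny by data-centre standards, which is the limitation Dr Mejia describes.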
At the online Future of Hearing Healthcare Conference in May 2024, Dr Mejia discussed digital tools that enable patients to take greater control of their hearing health and participate more actively in their treatment journey.
He says NAL researchers are harnessing the power of AI to create tools for precise speech-in-noise assessments, enhancing the accessibility of hearing screening tools for adults and children. They’re using AI-powered assistive technologies to facilitate self-management of hearing aid devices. One example uses a form of AI, a neural network, for more precise diagnosis of sensorineural hearing loss in any language.
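As a purely hypothetical sketch of what such a model might look like, the snippet below feeds the six standard audiogram thresholds through a tiny neural network to produce a probability of sensorineural loss. The architecture, weights and inputs are invented for illustration; the article does not describe NAL’s actual model.

```python
import numpy as np

# Hypothetical sketch of a small neural network over audiogram data.
# Architecture, weights and labels are invented for illustration.

FREQS_HZ = [250, 500, 1000, 2000, 4000, 8000]   # standard audiogram points

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(3)
W1, b1 = rng.standard_normal((8, 6)) * 0.1, np.zeros(8)  # stand-ins for
w2, b2 = rng.standard_normal(8) * 0.1, 0.0               # trained weights

def p_sensorineural(thresholds_db_hl: np.ndarray) -> float:
    """Tiny MLP mapping six hearing thresholds (dB HL) to a probability."""
    h = relu(W1 @ (thresholds_db_hl / 120.0) + b1)   # normalise to ~[0, 1]
    return float(1 / (1 + np.exp(-(w2 @ h + b2))))   # sigmoid output

# e.g. a sloping high-frequency loss, common in sensorineural cases:
audiogram = np.array([15, 20, 30, 45, 60, 70], dtype=float)
print(f"P(sensorineural) = {p_sensorineural(audiogram):.2f}")
```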
User-intent sensors
“Another innovation hearing professionals can expect to see more of in future hearing aid technology is user-intent sensors,” says Gold Coast audiologist and founder of Value Hearing, Mr Christo Fourie. He believes the most impactful hearing aid in 2024 is the Oticon Intent because of its new chipset, the Sirius, which contains world-first 4D user-intent sensors.
Fourie says this type of technology is about giving the client information that is as accurate as possible about the environment through the auditory system, including spatial awareness and natural hearing, without over-compressing or over-processing sound waves.
“It’s the first time it’s actually been in a hearing aid, processing the sound directly. With most other hearing aids, the AI is about adaptively changing the settings of the hearing aid based on the environment. This is the only hearing aid that has deep neural networks – a form of AI – on the chip in the hearing aid and that is an advanced, technically complicated process to achieve.”
A better chip increases the processing speed, which means the accuracy of the sound coming into the ear is better. Distortion and compression caused by the limits of what the chip can process are reduced, and the amount of information coming in is also improved, Fourie explains. The more powerful the chip, the more AI algorithms can be added.
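One way to picture a DNN “processing the sound directly”, as opposed to merely switching settings, is per-frame spectral masking: a small network looks at each incoming frame and decides how much of each frequency band to let through. The sketch below shows the shape of that computation with random stand-in weights; a real on-chip network is trained to preserve speech and suppress noise.

```python
import numpy as np

# Minimal sketch of per-frame spectral masking, one form of a DNN
# processing audio directly. Weights are random stand-ins; a trained
# network would pass speech bands and attenuate noise bands.

fs, n_fft = 16_000, 256
rng = np.random.default_rng(4)
W = rng.standard_normal((n_fft // 2 + 1, n_fft // 2 + 1)) * 0.05

def denoise_frame(frame: np.ndarray) -> np.ndarray:
    spec = np.fft.rfft(frame * np.hanning(len(frame)))
    log_mag = np.log1p(np.abs(spec))          # network input features
    mask = 1 / (1 + np.exp(-(W @ log_mag)))   # per-band gain in (0, 1)
    return np.fft.irfft(spec * mask, len(frame))

noisy = rng.standard_normal(n_fft)            # one 16 ms frame
clean_estimate = denoise_frame(noisy)
```

Running a matrix of this size on every 16 ms frame is exactly the kind of workload that demands the more powerful chip Fourie describes.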
“People are noticing the clarity and naturalness with the new chip because it’s all about reducing the listening effort by giving the brain all these subtle cues which are typically lost in processing,” he says. “The numbers we’re fitting are creeping up even though it’s more expensive.”
Bluetooth LE Audio is also coming to that chipset, he says.
‘Super abilities’ to communicate
According to Dr Mejia, hearing aids will be relevant for decades to come. “In fact, the idea that hearing aids are only for people with hearing problems is going to be less of a thing in the future.”
With the popularity of earbuds skyrocketing, hearables are the next big thing. The technology gives people with normal hearing the chance to enhance this sense by interconnecting with other technologies.
One example is Auracast broadcast audio, a new Bluetooth technology that lets people focus on a conversation in a noisy restaurant, tune in to a TV screen across a loud sports bar, hook directly into a public address system at an airport, or connect straight to a musician at a concert while blocking background noise. A small transmitter allows electronic devices to broadcast audio directly to the wearer.
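Conceptually, Auracast works like tuning a radio: a transmitter puts a named audio stream on the air and any number of nearby receivers can tune in, with no pairing between transmitter and receiver. The sketch below models just that selection step; the class and field names are invented for illustration and are not the Bluetooth LE Audio API.

```python
from dataclasses import dataclass

# Conceptual model of Auracast's one-to-many broadcast: receivers scan
# for named streams and tune in, like choosing a radio station. Names
# and fields are invented; this is not the Bluetooth LE Audio API.

@dataclass
class Broadcast:
    name: str           # human-readable name shown to the user
    broadcast_id: int   # identifies the stream on air
    encrypted: bool     # public venues may broadcast in the clear

def pick_broadcast(scanned: list[Broadcast], wanted: str) -> Broadcast:
    """A receiver (hearing aid, earbud or headphone) scans nearby
    broadcasts and tunes in to the one the user selects by name."""
    for b in scanned:
        if b.name == wanted:
            return b
    raise LookupError(f"no broadcast named {wanted!r} in range")

# e.g. three streams on air in a sports bar and an airport gate:
in_range = [Broadcast("Screen 1 - Football", 101, False),
            Broadcast("Screen 2 - Tennis", 102, False),
            Broadcast("Gate 12 announcements", 200, False)]
chosen = pick_broadcast(in_range, "Screen 2 - Tennis")
print(f"tuned to broadcast {chosen.broadcast_id}: {chosen.name}")
```

Because any number of receivers can join the same stream, one transmitter serves a whole room, which is what makes the mass-market venues described below viable.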
While it was developed to let those with normal hearing target sound, the hearing impaired will also benefit. Several companies are already using Auracast in hearing aid devices and everyone in the industry is looking to adopt it, Dr Mejia says.
“Soon we are going to see this being more uniformly used because the potential to create the opportunity not just for hearing impaired people, although they would benefit a lot from this, but for all of us, is going to be immense,” he says.
“This integration of technology to our lifestyle will become more normalised and future technologies are going to create super abilities to communicate and remain connected. Language barriers will no longer be there. My normal ability to hear sounds may not be sufficient so this will enhance it.”
Mr Andrew Bellavia, an American thought leader in hearables and hearing health and founder of Aurafuturity market and branding consultancy, spoke about Auracast at two conferences in May 2024: the Australian College of Audiology (ACAud) conference in Cairns and the Hearing Health and Technology Matters’ (HHTM) Future of Hearing Healthcare Virtual Conference. He is also co-host of HHTM’s This Week in Hearing podcast.
“Auracast is already in Australia – you can buy a Samsung TV that streams Auracast or if you have an older TV, you can put an Auracast streamer on it,” he told delegates. “You can share audio to multiple speakers, two people can watch a movie and both listen through headphones.
“Soon it will be in public spaces where people will get the most impact including hearing-impaired people. Multi-screen venues will be next because of the mass market potential. At a noisy sports bar you will be able to tune in to the audio on every screen.
“People say designing for accessibility is for everyone but this is the opposite – designing for everyone is designing for accessibility because the mass marketing possibilities of Auracast will drive more installations and that will benefit the hearing impaired.”
Fourie believes Auracast will eventually replace the telecoil because Auracast transmitters can be installed in any public setting, and anyone with headphones, earbuds or hearing aids will be able to access nearby audio.
“Your partner might be using headphones and you might have hearing aids and you are watching the same movie but both can listen to the audio through different devices from that same transmitter, your tablet or phone. That will open up a whole lot of connectivity to people,” he explains.
“Auracast is a new protocol that creates a much more usable form that addresses most of the concerns we had with older Bluetooth technologies that weren’t designed for hearing aids and did not take hearing aids into account.”
Fourie cautions, however, that the importance of a good fitting should not be underestimated when it comes to hearing aids.
“There’s a lot of focus on products but I believe a poor product fitted with best practices can outperform a good product fitted poorly,” he says. “I think audiology worldwide has hurt itself by focusing on the product, rather than what audiologists can do because if we have a product we are comfortable with, knowing its ins and outs, understanding how it works for different clients, following best practices, we can make that work very well.”
Dr Mejia tends to agree. “Despite all the hype about AI creating automation, at the end of the day, human factors are still important. Audiologists and clinicians around the world play an important role in keeping people motivated, on the right pathways towards what they need to achieve.
“It’s not just about having a cool piece of technology in your ears. It’s also about understanding how the technology is impacting you and how to best use it.”