Artificial Intelligence Brings More Clarity to Hearing Devices

Hearing aids are benefiting from AI technologies, but the AI talent needed to build and support these devices is increasingly hard to find.

Hearing aids are perhaps one of the most personal examples of real-time technology in action. Now, they are getting even more powerful, thanks to the introduction of artificial intelligence (AI). However, the talent required to build and support such systems is currently in short supply.

AI isn’t necessarily a new concept for hearing aids, Dr. Chris Heddon of Resonance Medical pointed out in a recent post. “Before the market for AI researchers became white hot, hearing aid companies had been working on various AI and machine learning approaches for quite some time,” he states. However, lately, AI technology has been getting more ubiquitous, and prices for tools and solutions have been dropping dramatically. “The removal of specific technological constraints, combined with the hearing aid industry’s need to address new and disruptive service delivery models, indicates that the time to bring AI to the hearing care market is now.”

See also: Time to put AI in its proper perspective

For starters, Widex recently announced its Evoke hearing aid, which employs AI and machine learning to give “users the ability to employ real-time machine learning that can solve the tricky hearing problems that users face in their daily lives.” Evoke’s smartphone AI app, called SoundSense Learn, is designed to help end users adjust their hearing aids precisely in the moment, something no human can replicate to the same degree of accuracy. As the company puts it: “Most hearing aids give users the ability to customize their sound experience by adjusting frequency bands to boost or cut bass, middle or high tones. Adjusting frequencies works well in many situations once the initial settings have been set by a skilled audiologist. However, some situations are so complex that hitting the right combination of adjustments can be difficult.”
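
As a rough illustration of the kind of frequency-band adjustment Widex describes, the sketch below splits an audio signal into low, mid, and high bands and applies an independent gain to each. The band edges, filter order, and example gains are illustrative assumptions, not details of any Widex product.

```python
# A minimal sketch (not Widex's implementation) of multi-band gain adjustment:
# split audio into low, mid, and high bands and boost or cut each independently.
# Band edges, filter order, and example gains are illustrative assumptions.
import numpy as np
from scipy.signal import butter, sosfilt

def adjust_bands(audio, sample_rate, gains_db=(0.0, 0.0, 0.0)):
    """Apply an independent gain (in dB) to low, mid, and high frequency bands."""
    band_edges = [(100, 500), (500, 2000), (2000, 6000)]  # assumed edges in Hz
    shaped = np.zeros_like(audio)
    for (low, high), gain_db in zip(band_edges, gains_db):
        sos = butter(4, [low, high], btype="bandpass", fs=sample_rate, output="sos")
        shaped += sosfilt(sos, audio) * 10.0 ** (gain_db / 20.0)
    return shaped

# Example: boost the high band by 6 dB in a synthetic two-tone signal.
fs = 16000
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 200 * t) + 0.5 * np.sin(2 * np.pi * 4000 * t)
adjusted = adjust_bands(signal, fs, gains_db=(0.0, 0.0, 6.0))
```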

The SoundSense Learn app is connected to the Evoke hearing aids and “uses machine learning to guide the users in optimizing the settings to their exact needs,” according to Widex. “The app gathers a variety of anonymous data such as how often they turn the volume up or down, which sound presets they use and how many custom settings they create – including those made with SoundSense Learn.” SoundSense Learn pairs a machine learning algorithm with reinforcement learning, which lets the algorithm learn in the moment. “The algorithm learns an optimal setting every time a user finds the sound to be a little below expectations in a given sound environment. It learns these settings by simply asking the user to compare two settings that are carefully picked by the algorithm. This allows it to learn an optimal setting in a new environment very fast.”
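
The pairwise-comparison loop Widex describes resembles preference-based (dueling-bandit-style) learning. The sketch below is a deliberately simplified illustration of that idea, assuming a hypothetical grid of candidate band-gain settings, a simple pair-selection rule, and a win/loss score update; it is not Widex’s actual SoundSense Learn algorithm.

```python
# A rough sketch of preference-based ("compare two settings") learning in the
# spirit of what the article describes. The candidate grid, the pair-selection
# rule, and the win/loss score update are illustrative assumptions, not
# Widex's actual SoundSense Learn algorithm.
import itertools
import random

# Hypothetical candidate settings: gain in dB for (bass, mid, treble).
candidates = [{"bass": b, "mid": m, "treble": t, "score": 0.0}
              for b, m, t in itertools.product([-6, 0, 6], repeat=3)]

def pick_pair(pool):
    """Choose two settings to play to the user: current best vs. a random challenger."""
    best = max(pool, key=lambda c: c["score"])
    challenger = random.choice([c for c in pool if c is not best])
    return best, challenger

def record_preference(winner, loser, step=1.0):
    """Shift scores toward the setting the user preferred."""
    winner["score"] += step
    loser["score"] -= step

# One round of interaction; in the real app the choice comes from the listener.
setting_a, setting_b = pick_pair(candidates)
user_prefers_a = True  # placeholder for the user's in-the-moment judgment
if user_prefers_a:
    record_preference(setting_a, setting_b)
else:
    record_preference(setting_b, setting_a)

best_setting = max(candidates, key=lambda c: c["score"])
print(best_setting)
```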

Ultimately, not all of the AI technology is packed inside the hearing devices themselves; rather, the devices are outfitted with sensors and receivers that run signals through smartphone apps. Another company, Starkey Hearing Technologies, announced it has added AI capabilities to its hearing aids. The aids connect, via Bluetooth, to a mobile app that adjusts and remembers personal settings. This includes geotagged memories that automatically switch modes when the GPS feature in your smartphone detects you are in a tagged location. “For example, a ‘home’ memory activates when you arrive at home,” according to Starkey.
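
The geotagged “memory” behavior Starkey describes amounts to a geofence check: when the phone’s location falls inside a tagged area, the app activates the preset saved for that place. The sketch below illustrates that logic with assumed coordinates, radii, and preset names; it is not Starkey’s implementation.

```python
# A small sketch of geotagged "memory" switching: when the phone's GPS position
# falls inside a tagged location, activate the preset saved for that place.
# The coordinates, radii, and preset names are made-up examples, and this is
# not Starkey's implementation.
import math

TAGGED_MEMORIES = [
    {"name": "home", "lat": 44.9800, "lon": -93.2700, "radius_m": 150, "preset": "home"},
    {"name": "office", "lat": 44.9700, "lon": -93.2600, "radius_m": 100, "preset": "work"},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    earth_radius_m = 6371000.0
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * earth_radius_m * math.asin(math.sqrt(a))

def active_preset(lat, lon, default="general"):
    """Return the preset for the first tagged location containing the user, else a default."""
    for memory in TAGGED_MEMORIES:
        if haversine_m(lat, lon, memory["lat"], memory["lon"]) <= memory["radius_m"]:
            return memory["preset"]
    return default

print(active_preset(44.9801, -93.2701))  # near "home" -> switches to the home preset
```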

As seen with these new products, AI-powered intelligent hearing aid systems require three specific elements, Heddon points out:

  • “Hearing aids with energy-efficient wireless connectivity, which gives them access to external computing power;”
  • “the ability for a hearing care professional to securely program a hearing aid from a distance, which gives the user access to the highest level of hearing care at all times—even in real-world environments;” and
  • “mobile phones with sufficient computing power to run AI on-device, which support the dynamically responsive intelligent hearing aids and provide the additional benefits of protecting user privacy and reducing the mobile device power consumption associated with cellular connection to a cloud-based server (which would have been needed if the AI was run in the cloud rather than on the user’s mobile device).”

Another important element required to advance AI-powered hearing aids is skilled developers who can program and tune these systems — talent that is currently in short supply, Heddon adds. “A connected, intelligent hearing aid requires the support of AI researchers who select appropriate algorithms for optimizing the performance of hearing aids, as well as specialized developers who know how to efficiently program high performance AI into mobile platforms.”

About Joe McKendrick

Joe McKendrick is RTInsights Industry Editor and an industry analyst focusing on artificial intelligence, digital, cloud and Big Data topics. His work also appears in Forbes and Harvard Business Review. Over the last three years, he served as co-chair for the AI Summit in New York, as well as on the organizing committee for IEEE's International Conferences on Edge Computing. Follow him on Twitter @joemckendrick.
