The COVID pandemic disrupted the daily lives of people everywhere, but many people with hearing loss found the isolation especially severe. “When everyone wears a mask, they are completely incomprehensible to me,” said Pat Olken of Sharon, Massachusetts, whose hearing aids were inadequate. (A new cochlear implant has helped her a lot.)
So when her grandson’s bar mitzvah moved to Zoom early in the pandemic, long before the service offered captions, Olken turned to Otter, an app created for transcribing business meetings. Reading along as the ceremony’s speakers talked made the app “a huge resource,” she said.
People with hearing loss, a group of about 40 million adults in the United States, have long embraced technologies to help them navigate the hearing world, from Victorian-era ear trumpets to state-of-the-art digital hearing aids and cochlear implants.
But today’s hearing aids can cost over $5,000, are often not covered by insurance and do not work for everyone. Nor do the devices restore hearing the way glasses immediately correct vision; instead, hearing aids and cochlear implants require the brain to interpret sound in a new way.
“The solutions out there clearly do not meet the needs of many people, based on cost, access, many different things,” said Frank Lin, director of the Cochlear Center for Hearing and Public Health at Johns Hopkins University. And this is not just a communication problem: researchers have found correlations between untreated hearing loss and higher risks of dementia.
Cheaper, over-the-counter hearing aids are on the way. But for now, only about 20% of those who could benefit from a hearing aid use one.
Captions, on the other hand, are generally much easier to access. They have long been available on modern TVs, and they increasingly appear in video conferencing apps such as Zoom, on streaming services such as Netflix, in social media videos on TikTok and YouTube, and at movie theaters and live arts venues.
In recent years, smartphone apps have emerged, including Otter; Google’s Live Transcribe; Ava; InnoCaption, for phone calls; and GalaPro, for live theatrical performances. Some target people with hearing loss and use human reviewers to make sure the captions are accurate.
Others, such as Otter and Live Transcribe, rely on what is called automatic speech recognition, which uses artificial intelligence to learn and process speech. ASR can struggle with accuracy and lag when transcribing speech in real time. Built-in bias can also make transcripts less accurate for the voices of women, people of color and deaf people, said Christian Vogler, a professor at Gallaudet University who specializes in accessible technology.
Jargon and slang can also be obstacles. But users and experts say ASR has improved considerably.
Although welcome, none of these solutions are perfect. Toni Iacolucci of New York said her book club could still be draining even when she used Otter to transcribe the conversation. The captions were not always accurate and did not identify individual speakers, which could make following along difficult, she said.
“It worked a little,” said Iacolucci, who lost her hearing almost two decades ago. When she returned home, she would be so tired from trying to follow the conversation that she had to lie down. “It just takes so much energy.” She received a cochlear implant a year ago that has significantly improved her ability to hear, to the point where she can now talk one-on-one without captions. Captions still help in group discussions, she said.
Otter said in a statement that it welcomes feedback from the deaf and hard-of-hearing community and noted that it now offers a paid software assistant that can attend virtual meetings and transcribe them automatically.
Transcription delays can present other problems, among them a concern that conversation partners may grow impatient. “Sometimes you say, ‘Sorry, I just have to look at my captions to hear,’” said Richard Einhorn, a musician and composer in New York. “That doesn’t mean I’m unaware that it’s sometimes a nuisance to other people.”
Other issues crop up. When Chelle Wyatt of Salt Lake City went to a doctor’s office, the Wi-Fi there was not strong enough for her transcription app to work. “We gestured, and I wrote things down, and I made sure to get a written report afterward so I knew what was said,” she said.
Movie theaters provide sound-amplifying devices, as well as glasses and individual screens that display a film’s captions. But these are not always comfortable, and sometimes they are poorly maintained or simply do not work. Many people with hearing loss want more movies to show captions on the big screen itself, just as they can at home.
A new law that went into effect in New York on May 15 requires movie theaters to offer on-screen captions for up to four showtimes per movie per week, including during the most popular moviegoing hours, Friday nights and weekends. Hawaii passed a state law in 2015 requiring two captioned screenings per movie per week. AMC, the large theater chain, also says it screens some captioned films in about a third of its U.S. locations.
Captions are also becoming more available for live performances. Several Broadway theaters promote a smartphone app that displays captions during shows, and some offer individual handheld devices that do the same. Theaters also present some performances with “open captions” that everyone can see.
During the pandemic, the shift to remote work and school made video conferencing services a survival tool, but captions came only after a major push. Zoom added live transcription to its free service in October 2021, though the meeting host must turn it on. Google Meet made live captions available to everyone for free in May 2020. Microsoft Teams, the workplace collaboration app, introduced them in June of that year.
“We need captions everywhere, and we need people to be more sensitive,” Olken said. “The more I advocate for myself, the more others benefit.”