This Startup Has Created AI-Powered Signing Avatars for the Deaf

More than 70 million deaf or hard-of-hearing people globally use sign language, but there’s an acute shortage of interpreters. Silence Speaks is a British startup that wants to bridge that gap with an AI-powered sign language avatar capable of translating text to sign language.

Communication problems can be devastating and isolating for the deaf, especially in environments with background noise and crosstalk. It’s often impossible for deaf and hard-of-hearing folks to follow conversations in train stations, hospitals, school classrooms, and busy offices. Even people with cochlear implants may pick up only a few words from each sentence in those settings and can struggle with tone.

Developed by and for the deaf community, Silence Speaks can accurately translate text into British Sign Language (BSL). There are more than 150,000 BSL users in the UK. The model was trained with datasets covering regional dialects, contextual language, and emotional tone to power an AI-generated avatar that goes beyond simple direct translation to convey intent and emotion.

WIRED has a policy of not sharing a story with a company before publication, but we sent an edited version of this piece to Silence Speaks so that the company could generate an AI avatar based on me for a sample video. There were no changes to the text.

Accessibility Barriers

Silence Speaks CEO Pavan Madduru, whose background includes pioneering AI architecture for chatbots at Vodafone, founded the company just over three years ago. He got the idea after spending time with an engineer friend who was deaf. When the two traveled together, Madduru saw how the lack of sign language support made life difficult for his friend and resolved to tackle the problem.

He hired three full-stack engineers, all deaf sign language users, to begin work on translation technology to harness the growing potential of AI. When he started the project, the team tried converting sign language to text, hoping to enable real-time translation of any sign language to any other sign language. But this proved trickier than expected.

While American Sign Language (ASL) is widely used in the US, BSL is quite different, and there are more than 200 sign languages around the world. Silence Speaks started with BSL but soon realized that regional slang and accents extend to sign language and impact signing speed. During the trial, they discovered signers in Edinburgh were 50 percent faster than folks in Southampton or Birmingham.

After a rethink, Madduru and his team approached the problem differently, developing technology capable of translating text into BSL and ASL. The current version runs in the cloud and takes time to generate the signing from text input, but the company is working toward real-time on-device translation. Madduru suggests this is still at least a year away, partly because the team has set high standards for accuracy. But real-time sign translation could effectively give deaf folks an interpreter in their pocket.

Photograph: Silence Speaks

A future version that can translate speech, text, and sign language in any direction is a truly tantalizing prospect. For now, the focus is to build a strong foundation, prove that the technology works, and show it has a role to play in immediately improving accessibility for the deaf. The platform includes built-in captioning functionality for deaf users who don’t sign or are still learning sign language.

Transport Trials

Silence Speaks is getting a huge lift from Transport for London, the agency responsible for more than 250 stations and all the Underground lines in the UK’s capital. It plans to roll out the Silence Speaks AI-powered sign language avatar to make trains more accessible for the deaf and hard of hearing by offering visual displays for train announcements. Passengers can also scan QR codes with their phones to get sign language videos with the latest information.

A greeting card company also uses Silence Speaks technology to create birthday and other event cards. Each card contains a QR code that the recipient can scan to stream a special signed video message on their phone.

One of the cool things about the tech is that licensees can create their own characters to serve as signing AI avatars. Characters can be photorealistic or cartoonish. For this article, I submitted a single photograph to create my avatar for the videos. While AI usually struggles with hands, my digits look perfect for accurate signing, but my face is downright weird (weirder than usual). My kids find it creepy, but it’s not a million miles away, and companies will likely spend longer refining their chosen characters. There is even scope to use popular characters like Spider-Man or Wonder Woman as sign interpreters.

Getting on Board

Silence Speaks has been gaining momentum ahead of another investment round. Chloe Smith, former secretary of state for Science, Innovation, and Technology, who led the 2022 legislation to recognize British Sign Language in UK law, has joined as chair of the company’s board.

“People who use sign language can often be excluded, and that’s wrong,” she tells WIRED. “We hope that our app will help deaf communities and hearing communities to communicate better together.”

One of the main applications is for the deaf and hard of hearing in the workplace, where accessibility can be challenging. The Royal National Institute for Deaf People reports that 85 percent of deaf professionals have suffered workplace exclusion. Smith is also president of the Chartered Institute of Personnel and Development, the leading professional body for HR and people development in the UK. She suggests that technology like Silence Speaks has a huge role to play in breaking down barriers and extending the frontier of inclusion.

Education is another sector crying out for better accessibility. There are around 50,000 deaf kids in the UK education system, and the vast majority of them attend mainstream schools where resources are stretched. Without proper support, these kids are in danger of being left behind, but few schools will provide a sign interpreter, and where they do it’s often only for part of the time. This technology could massively improve classrooms for signing kids.

Silence Speaks isn’t the only new technology targeting the deaf and hard-of-hearing community. As smart glasses start to take off, companies like XRAI have attempted to subtitle life for the deaf and hard of hearing, and Nuance now offers audio glasses with built-in microphones and speakers that amplify sound and act as hearing aids. In 2024, Google showed off augmented reality glasses that displayed real-time captions of a conversation on the lenses, and even Apple’s popular AirPods Pro 2 got a free software update that enabled hearing aid functionality to boost certain frequencies. Different people will prefer different approaches, and there’s no reason these technologies can’t coexist.

As Silence Speaks develops its technology, authenticity is key; it’s something everyone at the company mentions when I speak with them. The involvement of deaf engineers, deaf communities, and linguists in the design is central to the company as it works to build an intuitive system that feels authentic to users, something that isn’t always true of this kind of technology.
