Google has shared details about what is new in Android Q. One of the most notable additions, a feature called Live Caption, makes smartphones more accessible to people who are deaf or hard of hearing.
According to Google, Live Caption is one of several firsts in software driven by on-device machine learning. For the 466 million people around the world who are deaf or hard of hearing, captions are more than a convenience – they make content more accessible. Google says it worked closely with the Deaf community to develop a feature that would improve access to digital media.
With a single tap, Live Caption automatically captions media that is playing audio on your phone. It works with videos, podcasts, and audio messages across any app, even content you record yourself. As soon as speech is detected, captions appear, without ever needing Wi-Fi or mobile data, and without any audio or captions leaving your phone.
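Live Caption itself is a system-level feature, and Google's announcement does not describe a developer API for it. Apps that render their own captions can, however, already honor the user's system-wide caption settings through Android's existing CaptioningManager API. The Kotlin sketch below is a hypothetical illustration of reading those preferences (the function name logCaptionPreferences is ours), not a description of how Live Caption works internally.

```kotlin
import android.content.Context
import android.view.accessibility.CaptioningManager

// Hypothetical helper: reads the user's system caption preferences so an app
// can style its own in-app captions consistently. This uses the pre-existing
// CaptioningManager accessibility API (API 19+), not the Live Caption engine.
fun logCaptionPreferences(context: Context) {
    val captioningManager =
        context.getSystemService(Context.CAPTIONING_SERVICE) as CaptioningManager

    if (captioningManager.isEnabled) {
        // The user has turned on captions in Settings > Accessibility.
        val style = captioningManager.userStyle   // colors, edge type, typeface
        val scale = captioningManager.fontScale   // preferred text-size multiplier
        val locale = captioningManager.locale     // preferred caption language (may be null)
        println("Captions enabled: fontScale=$scale, locale=$locale, foreground=${style.foregroundColor}")
    } else {
        println("System captions are disabled")
    }
}
```

An app might call a helper like this when it starts playing media, and fall back to its own default caption styling if system captions are disabled.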
The Verge reported that the Live Caption transcription appears in a black box that can be moved anywhere on your phone's screen, wherever it is most convenient. The feature works even when the volume is turned down or muted, though transcriptions cannot be saved for later review.
You can find Live Caption, and other Android Q features, in the Android Q beta. The beta is available for 21 devices from 13 brands, including all Pixel phones.
Live Caption will make it easier for people who are deaf or hard of hearing to communicate with friends and family over video chat. The same feature will also be useful to anyone who wants to watch a video in public without the sound disturbing people around them.