One of the big themes of the Google I/O 2019 opening keynote was inclusivity. A new feature in Android Q aims to improve inclusivity for people who are deaf or hard of hearing by offering instant captions for just about any audio or video played on a phone.
Called Live Caption, the feature uses AI to transcribe speech played back on a smartphone into fast, accurate captions. The beauty of it is that the feature works with any app, whether it plays audio or video, and whether the content is streamed from a server, played from local storage, or generated on the fly by a human.
Live Caption works with podcasts, videos, and audio and video chat apps like Duo. The demo we saw on the stage of the Google I/O keynote seemed very smooth and impressive, though obviously real-world results may vary.
Live Caption will be accessible with one tap – users will be able to activate it by tapping a new icon that appears when changing the system volume. Everything is processed locally on the device, meaning you won't need to worry about third parties listening in on your conversations.
Captions are shown in a black window overlaid on top of the normal interface. They are not saved for later, so you will only see them while the corresponding audio is playing.
While deaf users may benefit the most from this cool new feature, Live Caption has the potential to be useful to lots of other users in a variety of situations. It even works when the audio is turned all the way down, allowing users to consume content without disturbing anyone around them.
Live Caption is a new accessibility feature baked into Android Q. You'll need to enable it from the settings before using it, and it's not yet clear whether all OEMs will include the feature in their Android Q devices.
Stay tuned for more from Google I/O.