For years I carried a spare Android phone in my purse so I could access its excellent auto-caption capability, at first through an app called Live Transcribe and later as a feature built into the operating system. I always wondered when Apple would catch up, and it finally has, recently launching Live Captions (Beta) in iOS 16.
An excerpt from my article on this topic is below. Read the full post at FindHearing on HHTM.
Apple’s Live Captions Still in Beta
Like Android’s Live Transcribe, Live Captions provides auto-captioning for any audio content, including FaceTime calls, streaming video, and in-person conversations. So far, it is available only in English. When not in use, it retreats to the background, residing on the screen as a small circle you can move around with your finger to keep it out of the way. To use captioning, tap the circle and it expands into a rectangle that displays a few lines of text. Expand it again to fill the full screen with text.
Performance is Mixed
I tried Live Captions in a variety of settings, and so far its performance has been mixed compared to other speech-to-text solutions I have used. Live Captions had particular trouble decoding speech in background noise, but this is true of most speech-to-text applications. I expect the accuracy and synchronization with the audio will improve over time.
On FaceTime calls, the captions worked well, but they appeared only when my conversation partner spoke, which I found confusing since most captioning programs display speech from all participants.
Activating Live Caption is Easy
For more on how to activate Live Captions, continue reading on HHTM.