# Live Captions

By [AVC](https://avc.xyz) · 2026-01-08

---

I picked up my new Meta "Display" smart glasses a few days ago and have been playing with them a bit since.

The one feature that feels really important and powerful to me is "live captions."

Most people are familiar with [closed captioning](https://en.wikipedia.org/wiki/Closed_captioning) on TVs and in theaters, where speech is transcribed into text and shown at the bottom of the screen.

The Display smart glasses offer this feature on the lower portion of the lenses. You can turn it on to better understand someone speaking your native language, and you can use it for real-time translation of foreign-language speakers.

I tried to take a photo of live captions running in my Display glasses but could not figure out how to do that. So here is an image from Meta's marketing assets:

![](https://storage.googleapis.com/papyrus_images/39b23a4c0b7e6826e4c3089d33257f9aa2ba01b89dfbe0743323ef04e40fc572.webp)

I have had moderate hearing loss for at least a decade and struggle to hear in loud environments (busy restaurants, large events, sports arenas, etc). I hear fine in most places but certain environments give me real problems.

I've tried traditional hearing aids a few times and they have not worked well for my issues. I am a personal investor in one startup making advanced AI-powered hearing aids delivered in eyeglass frames. The first units should be ready soon, and I am excited to try them and will blog about them then.

But it is also possible that a solution for people like me is live captioning. I am already familiar with captioning, and we use it frequently on our TV at home, even for English-language shows. So the idea of using the same approach to deal with hearing loss is very interesting to me. I plan to take the Display glasses with me the next time I dine at a loud restaurant or attend a big event and see how they work.

I am also excited about using the foreign language translation when we travel overseas. It won't help me speak back in a foreign language but it will certainly help me understand.

Like most technologies when they arrive, live captioning feels "early." The UX around turning it on and off is clunky. For it to work well, it needs to just know when I need it and when I don't. The current version of live captioning only works if you look directly at the speaker. These issues and others make it suboptimal in its current form. But I am confident all of that will improve in time.

Even so, experiencing live captioning in my smart glasses has been an "aha" moment for me. I think the possibilities of this technology are quite powerful and important.

---

*Originally published on [AVC](https://avc.xyz/live-captions)*
