I picked up my new Meta "Display" smart glasses a few days ago and have been playing with them a bit since.
The one feature that feels really important and powerful to me is "live captions."
Most people are familiar with closed captioning on TVs and in theaters, where speech is transcribed into text and shown at the bottom of the screen.
The Display smart glasses show this feature on the lower portion of the lenses. You can turn it on to better understand someone speaking your native language, and you can use it for real-time translation of foreign-language speakers.
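For readers curious about what is going on under the hood, here is a minimal sketch of the capture, transcribe, display loop that live captioning implies, written with the open-source SpeechRecognition Python library. This is purely illustrative; Meta's actual on-device pipeline is not public, and the glasses certainly do not run anything like this script.

```python
# A minimal live-captions loop: capture microphone audio, transcribe it,
# and print each caption. Illustrative only; assumes the open-source
# SpeechRecognition library (pip install SpeechRecognition pyaudio).
import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.Microphone() as source:
    # Calibrate for background noise, which matters in loud restaurants
    recognizer.adjust_for_ambient_noise(source, duration=1)
    print("Captions on. Listening...")
    while True:
        # Grab up to ~5 seconds of speech at a time
        audio = recognizer.listen(source, phrase_time_limit=5)
        try:
            # Any ASR backend could be swapped in here; this uses the
            # free Google web speech API that ships with the library
            caption = recognizer.recognize_google(audio, language="en-US")
            print(caption)  # the glasses would render this on the lenses
        except sr.UnknownValueError:
            continue  # nothing intelligible in this chunk; keep listening
```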
I tried to take a photo of live captions running in my Display glasses but could not figure out how to do that. So here is an image I took from Meta's marketing assets:

I have had moderate hearing loss for at least a decade and struggle to hear in loud environments (busy restaurants, large events, sports arenas, etc.). I hear fine in most places, but certain environments give me real problems.
I've tried traditional hearing aids a few times and they have not worked well for my issues. I am a personal investor in one startup making advanced AI-powered hearing aids that are delivered in eyeglass frames. I am excited to try them when the first units are ready, which should be soon, and I will blog about them then.
But it is also possible that a solution for people like me is live captioning. I am already familiar with captioning, and we use it frequently on our TV at home, even for English-language shows. So the idea of using the same approach to deal with hearing loss is very interesting to me. I plan to take the Display glasses with me the next time I dine at a loud restaurant or attend a big event and see how they work.
I am also excited about using the foreign language translation when we travel overseas. It won't help me speak back in a foreign language but it will certainly help me understand.
Like most technologies when they arrive, live captioning feels "early." The UX around turning it on and off is clunky. For it to work well, it needs to just know when I need it to come on and when I don't. The current version of live captioning only works if you look directly at the speaker. These issues and others make it suboptimal in its current form. But I am confident all of that will improve in time.
But experiencing live captioning in my smart glasses has been an "aha" moment for me. I think the possibilities of this technology are quite powerful and important.
12 comments
Hi Casters. I wrote today about my favorite feature of the Meta Display smart glasses, called Live Captions https://avc.xyz/live-captions
I really don't understand smart folks like you, who support decentralized tech/money, being willing to feed your life into centralized AI. It is literally the most dangerous aspect of AI, and far more dangerous than centralized money. We have a very powerful decentralized solution and we're looking for devs to join us. Here's an intro/demo: https://www.youtube.com/live/ptt0yr9jWG4?si=mGZ-6RLx5BREfW1D
I do understand the power of the tech for your use case, as you outlined in your post, and I definitely love what is possible, but I would love to see you encouraging devs to start decentralizing anything AI-related.
Very interesting use case. Among many other things, I'd love to better understand what Japanese people are saying when I walk into a restaurant.