We were rather impressed when we reviewed the second-generation Ray-Ban Meta Smart Glasses, which have steadily improved over time thanks to Meta AI. With the glasses, even something as mundane as grocery shopping can be transformed by AI.
But the latest update is the most significant yet: three new features, announced at Meta Connect in September, make the glasses more useful. However, two of the features are only available to those in the Early Access program, and all three are currently limited to the US and Canada.
As detailed on the Meta blog, the three upgrades arrive in the v11 software update. The only one available outside of Early Access is Shazam integration.
If you hear a Christmas song on heavy rotation that you would like to hear more of, you can ask, “Hey Meta, what is this song?”
Again, this is only available in North America for now, so the rest of the world will have to rely on the Shazam apps for Android and iOS.
The other two perks require membership in the Early Access program. The first is Live AI, which adds video to the glasses' Meta AI. Once activated, Meta AI can “see” what you are looking at and converse naturally about what is happening in front of you.
Meta thinks this could be especially useful when your hands are full (think cooking or gardening) or when you are out and about. For example, you will be able to ask Meta AI questions about what you are looking at, such as how to prepare a meal from the ingredients in front of you. There is a battery cost, however: Meta says a full charge allows about 30 minutes of Live AI use.
Still, this is an exciting development, and Meta says it will improve over time: “Eventually, the live AI will give useful suggestions at the right time, even before you ask,” the post reads.
Finally, and most exciting to me, is live translation, which promises to help people understand a foreign language without having to learn it. When enabled, if someone speaks to you in French, Italian, or Spanish, you will hear a real-time translation through the glasses' open-ear speakers or see it as text on your phone.
Of course, we've seen this sort of thing before. The first-generation Pixel Buds attempted live translation seven years ago, and Samsung's Galaxy AI does something similar with live calls. But given that Meta's Ray-Bans are designed to make the technology almost invisible, this feels a bit more natural than either.
Again, these last two AI features require participation in the Early Access program. If you are in North America, you can sign up here, but the process is described as joining a waitlist, suggesting that acceptance is not guaranteed.
Still, these features will eventually roll out to all users, and Meta hints that more are on the way. The post concludes cryptically: “In 2025, we will be able to bring you more software updates and maybe even a surprise.”