Monday, December 23, 2024

Meta brings live AI, live translations and Shazam to its smart glasses


Meta just announced that its Ray-Ban smart glasses are getting three new features: live AI, live translations and Shazam. Live AI and live translation are available exclusively to members of Meta's Early Access program, while Shazam support is available to all users in the US and Canada.

Both live AI and live translation were first demonstrated at Meta Connect 2024 earlier this year. Live AI lets you converse naturally with the Meta AI assistant while it continuously observes your surroundings. For example, if you're browsing the produce section of a grocery store, you could theoretically ask Meta's AI to suggest recipes based on the ingredients in front of you. Meta says users will be able to run the live AI feature for roughly 30 minutes at a time on a full charge.

Meanwhile, live translation lets the glasses translate speech in real time between English and Spanish, French or Italian. You can either hear the translations through the glasses themselves or view transcripts on your phone. You have to download the language pairs beforehand, as well as specify which language you speak versus the one your conversation partner speaks.

Using Shazam is a bit simpler. Just prompt Meta AI when you hear a song, and it should be able to tell you what you're listening to. You can watch Meta CEO Mark Zuckerberg demo the feature in this Instagram reel.

If you don't see these features yet, make sure your glasses are running software version 11 and that your Meta View app is on version 196. If you're not yet in the Early Access program, you can apply via this website.

The updates arrive just as Big Tech is positioning AI assistants as the backbone of smart glasses. Just last week, Google announced Android XR, a new operating system for smart glasses, and specifically pitched its Gemini AI assistant as the killer app. Meanwhile, Meta CTO Andrew Bosworth just published a blog post opining that "2024 was the year AI glasses took off." In it, Bosworth also argues that smart glasses may be the best possible form factor for a "truly AI-native device" and the first hardware category "fully defined by AI from the ground up."
