The Ray-Ban Meta smart glasses are about to get a serious upgrade, thanks to Meta's AI smarts finally learning to see and hear.
Meta, formerly known as Facebook, has announced some exciting new features for its Ray-Ban Meta smart glasses that will make them more useful and interactive. The company is testing a new 'multimodal' AI assistant that can respond to your queries based on what the glasses see and hear through their camera and microphones.
Multimodal AI at play
The multimodal AI assistant can suggest outfits, translate text, caption images, and describe objects you point the glasses at. Meta CEO Mark Zuckerberg showed off some of these capabilities in an Instagram reel, asking the glasses to recommend pants that would go well with a shirt he was holding. The assistant gave him two options and described the shirt's colour and pattern.
Meta CTO Andrew Bosworth demonstrated in another video that the assistant can also handle more common AI tasks, such as translation and summarisation. He also showed it accurately identifying a piece of California-shaped wall art and offering some information about the state.
Use cases
Imagine this: you're at a chic boutique, holding a stunning emerald blouse but stumped on the perfect trousers to pair it with. There is no need to whip out your phone and scroll through endless Pinterest boards. Just whisper, 'Hey Meta, what trousers would rock with this?' Your glasses scan the blouse, analyse its colour and style, and boom! You get instant fashion advice, like a personal stylist whispering secrets in your ear.
Think it's just about clothes? Think again! These glasses are your eyes and ears to the world, powered by Meta's AI brain. Struggling to understand a menu in a foreign language? Ask your glasses to translate it. Lost in a museum? They can tell you about the exhibits you're looking at.
Even that funky wall art at your friend's place? Just point and ask, 'What is that supposed to be?' The glasses, like a tiny oracle, will decode the artistic mystery. But wait, there's more! Remember those cool captions you write for Instagram? These glasses can help you craft them on the fly, analyse your photos, and suggest witty one-liners.
Limited early access for now
Of course, this AI superpower isn't for everyone just yet. Meta is playing it cool with an early-access test limited to a select group of tech-savvy users in the US. But word of the feature is already spreading like wildfire.
Imagine a world where your glasses become your concierge, style guru, language interpreter, and on-the-go information hub. It's like having Siri, Alexa, and Google Assistant rolled into one, perched right on your nose.
The multimodal AI assistant is still a work in progress and has some limitations. It can only recognise what you see by taking a photo, which is then analysed in the cloud. After making a voice request, you have to wait a few seconds to hear the response. You also need a specific voice command to trigger the photo capture and the query: you have to say, 'Hey Meta, take a look at this and...' followed by your question.
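To make that flow concrete, here is a minimal, purely hypothetical sketch of the 'take a look at this' loop in Python. None of the function names or behaviour below come from Meta's actual software; they simply illustrate the steps described above: a trigger phrase, a photo capture, a cloud round trip, and a response that arrives a few seconds later.

```python
# Hypothetical sketch of a "look and ask" flow: a trigger phrase starts a
# photo capture, the photo and question go to a cloud vision model, and the
# answer comes back after a short wait. These are stand-in functions, not
# Meta's real APIs.

import time
from dataclasses import dataclass


@dataclass
class GlassesQuery:
    question: str   # e.g. "what trousers would go with this?"
    photo: bytes    # still image captured by the glasses' camera


def capture_photo() -> bytes:
    """Stand-in for the glasses taking a photo of what you're looking at."""
    return b"\xff\xd8...fake-jpeg-bytes...\xff\xd9"


def ask_cloud_assistant(query: GlassesQuery) -> str:
    """Stand-in for the cloud round trip that analyses the image.

    This is where the few-second wait mentioned above would come from:
    uploading the photo, running a vision model, and returning the answer.
    """
    time.sleep(2)  # simulate network and model latency
    return f"Answering '{query.question}' based on the photo you just took."


def handle_voice_command(utterance: str) -> str | None:
    """Only the specific trigger phrase starts a multimodal request."""
    trigger = "hey meta, take a look at this and "
    if not utterance.lower().startswith(trigger):
        return None  # ordinary voice commands don't capture a photo
    question = utterance[len(trigger):]
    query = GlassesQuery(question=question, photo=capture_photo())
    return ask_cloud_assistant(query)


if __name__ == "__main__":
    reply = handle_voice_command(
        "Hey Meta, take a look at this and tell me what trousers would match."
    )
    print(reply)
```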
The photos and the responses are stored in the Meta View app on your phone, where you can review them later. This can be useful for keeping a record of what you learnt or saw through the glasses. The multimodal AI assistant could become a handy companion for exploring the world, shopping, learning, or just having fun with the Ray-Ban Meta smart glasses.