Meta's Ray-Ban glasses will now suggest clothes for you
The company announced on Tuesday that new AI-vision features are now available in beta
Meta's long-anticipated AI-powered vision features for the Ray-Ban Smart Glasses are finally arriving, enabling the onboard AI to perceive its surroundings through the glasses' 12 MP camera and microphone.
The company announced on Tuesday that these new AI-vision features are now available in beta. While having a mobile AI capable of seeing and hearing offers significant utility, Meta's leadership chose to showcase the new feature in what can be described as an awkward and clichéd manner, reports Gizmodo.
Meta's CEO and frontman, Mark Zuckerberg, is not renowned for his fashion sense. Unlike Steve Jobs' iconic black turtlenecks or Sam Bankman-Fried's daily shorts-and-sleep-shirt ensemble, Zuckerberg's attire lacks any signature style. On Tuesday evening, Zuckerberg chose to highlight the Smart Glasses' upcoming AI capabilities by bringing them into his cluttered closet, where he held up a large navy blue polo shirt adorned with throwback rainbow stripes.
In what might be one of the most trivial questions posed to a vision-enabled AI since ChatGPT introduced a similar feature in September, Zuckerberg asked his glasses, "Hey Meta, look and tell me what pants to wear with this shirt." Meta's chatbot responded with a robotic "It seems to be a striped shirt," followed by a recommendation for dark-washed jeans or solid-colored pants. Anyone actually trying to coordinate an outfit around that gaudy vintage polo, and wondering what complements rainbow stripes, might find the AI's answer remarkably unhelpful.
Zuckerberg also shared another video on his Instagram, demonstrating how the AI could translate a meme from Spanish to English. The AI voice still sounds somewhat stilted and robotic, but Meta has already introduced celebrity-voiced AI personas for users to interact with on Facebook Messenger and WhatsApp, so the company will likely keep refining the feature to make it sound more human.
One of the most valuable additions to the Ray-Ban glasses is their ability to access real-time information through Bing-powered search. Meta announced that real-time search would roll out gradually to all US users in the near future. While Gizmodo has not yet tested these new features, Meta suggested several potential uses for the vision-enabled AI, such as generating captions for photos taken during a hike or describing an object the wearer is holding. Vision-based AI has more profound applications, such as providing real-world narration for blind or low-vision users, but Meta appears to be taking a more modest approach with this initial beta release.
The company initially introduced this AI integration at its most recent Meta Connect conference, where it also showcased the Quest 3 VR headset. Previously, the onboard microphone could only pick up simple commands, letting users interact with Meta's conversational AI chatbot. While I don't claim to have a keen sense of fashion, the AI's vague style advice doesn't seem especially helpful for me, or even for Zuckerberg himself.