Chris began by introducing Meta’s flagship wearable, the Ray-Ban Meta AI glasses. “I’m super excited about that. As a matter of fact, I’m just wearing one of them,” he said. Designed in collaboration with Ray-Ban, these glasses combine fashion with functionality.
“Obviously the Ray-Ban brand is really good from a brand perspective… It looks fashionable, doesn’t make me look geeky, and is power-packed with a lot of technology,” Chris explained.
The glasses come equipped with a 12-megapixel camera, built-in microphones, and speakers, allowing users to take pictures, listen to audio, and interact with the device without the need for headphones or a phone. But the real magic lies in the AI.
The glasses incorporate what Chris calls multimodal AI. “The reason why we call it multimodal AI is because it uses various inputs and output mechanisms. So, it uses the camera to see things around, it uses the mic to hear things around and then it gives you contextual information.”
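To make the idea concrete, here is a minimal sketch of how a multimodal assistant loop might be organized. Everything in it (the Observation class, the answer stub, the fake sensors) is a hypothetical illustration, not Meta's actual software:

```python
# Illustrative sketch only: a hypothetical multimodal assistant loop.
# None of these names come from Meta's software; they just show the idea
# of combining camera and microphone inputs into one contextual query.

from dataclasses import dataclass


@dataclass
class Observation:
    image: bytes   # what the camera "sees"
    audio: bytes   # what the microphone "hears"


def answer(observation: Observation, question: str) -> str:
    """Stand-in for a multimodal model: it would fuse the image, the
    audio, and the user's question into one contextual reply."""
    return f"Contextual answer to {question!r} using {len(observation.image)} image bytes"


def assistant_loop(camera, microphone, question: str) -> str:
    # Gather every available modality before asking the model, rather
    # than treating vision and speech as separate features.
    obs = Observation(image=camera(), audio=microphone())
    return answer(obs, question)


if __name__ == "__main__":
    # Fake sensors so the sketch runs on its own.
    print(assistant_loop(lambda: b"jpeg-bytes", lambda: b"pcm-bytes",
                         "What monument am I looking at?"))
```

The point of the pattern is that the camera and microphone are not separate features: both feed the same model call, which is what makes the response contextual.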
One of the standout features? Live translation. “I don’t know Spanish and I just recently went to Mexico with my family. I just turned on my live translation. That person was talking to me in Spanish, and the glasses were translating the Spanish language to me in English,” Chris shared. The glasses can also translate written text in real time, such as menus: the wearer simply looks at the text and asks what it says.
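A common way to structure a feature like this is a three-stage pipeline: transcribe the incoming speech, translate the text, then speak the result. Meta hasn't published how the glasses implement it, so the sketch below is a generic illustration with placeholder functions:

```python
# A minimal sketch of the live-translation flow, under the assumption of
# a classic three-stage pipeline (listen -> translate -> speak). The
# functions are placeholders, not Meta's implementation.

def transcribe(audio: bytes, language: str) -> str:
    """Placeholder speech-to-text stage."""
    return "¿Dónde está la estación?"  # pretend this was heard in Spanish


def translate(text: str, source: str, target: str) -> str:
    """Placeholder machine-translation stage."""
    return "Where is the station?"  # pretend this came from a model


def speak(text: str) -> None:
    """Placeholder text-to-speech stage, played through the speakers."""
    print(f"(spoken) {text}")


def live_translate(audio: bytes) -> None:
    # Each chunk of incoming speech flows through the whole pipeline,
    # so the wearer hears English shortly after the Spanish is spoken.
    heard = transcribe(audio, language="es")
    speak(translate(heard, source="es", target="en"))


if __name__ == "__main__":
    live_translate(b"pcm-audio-chunk")
```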
He believes wearables like smart glasses are the next frontier for AI because they stay with users constantly, allowing for real-time, contextual interaction. From identifying historical monuments to helping users navigate foreign environments, the glasses are built to be a constant, intelligent companion.
Currently, Meta is not serving ads through the glasses, but the marketing potential is clear. “Our vision is we want to breadcrumb consumers through this journey… At this point of time, there is no marketing element to it. However, in the future… it being contextual, having location-based information… providing recommendation relevant at the right time, at the right place.”
Chris described a future where AI agents integrated into wearables could make real-time recommendations or even book travel and rides. Looking further ahead, the glasses could interact with installed apps via voice commands. For example, getting a ride would be as easy as: “You can say, hey Meta, I’m over here. It knows my location. Call me an Uber… it just interfaces with the app all in the background, calls an Uber and there I have an Uber in front of me.”
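One way such an agent could work is by mapping a recognized intent to an app action and executing it in the background. The sketch below illustrates that pattern; the intent names, keyword matching, and ride-hailing hook are all hypothetical stand-ins, since the glasses' real app interfaces are not public:

```python
# Illustrative sketch of the "agent in the background" idea from the
# quote above. All names here are hypothetical stand-ins.

from typing import Callable, Dict

# Registry mapping a spoken intent to the app action that fulfills it.
HANDLERS: Dict[str, Callable[[dict], str]] = {}


def handles(intent: str):
    """Decorator that registers a handler for a named intent."""
    def register(fn: Callable[[dict], str]):
        HANDLERS[intent] = fn
        return fn
    return register


@handles("book_ride")
def book_ride(context: dict) -> str:
    # A real agent would call the installed ride-hailing app's API here,
    # passing along device context such as the current location.
    return f"Ride requested to pick you up at {context['location']}"


def dispatch(utterance: str, context: dict) -> str:
    # Toy intent detection: a production assistant would use a language
    # model here, not keyword matching.
    if "uber" in utterance.lower() or "ride" in utterance.lower():
        return HANDLERS["book_ride"](context)
    return "Sorry, I didn't catch that."


if __name__ == "__main__":
    print(dispatch("Hey Meta, call me an Uber",
                   {"location": "48.8584N, 2.2945E"}))
```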
Interaction with the glasses is primarily voice-based, with a tap on the temple arm as a secondary input. “Right now, the way I’m also interacting with you is a voice because I can ask, ‘Hey, Meta, take a picture.’ Or let’s say, ‘Hey, Meta, what’s the weather? What’s the location?’ It has got mic, so it can hear me.”
Meta’s ambitions go beyond smart glasses. Chris shared that the company is also investing in mixed reality headsets. “We don’t even call it VR. It’s not VR anymore. It’s mixed reality… It has evolved… you can see through things around you and then you can overlay objects in real time.”
He explained that while VR started with immersive gaming, the new generation of devices blends virtual and real worlds. Meta’s vision, as showcased last year with Orion at Connect 2024, is to integrate augmented reality features into its future glasses via a heads-up display in the lens. “That glass will have a display, which means I can see maps and navigate, or I can see the picture I have taken. So, I don’t need a phone to pull up over here to even see the picture I just took.”
Meta’s vision for wearables is bold, human-centered, and rooted in real-world utility. Whether helping a visually impaired person shop independently or translating a foreign menu in seconds, its technology is changing the way we interact with the world around us.
As Chris puts it, “It’s just about this whole constellation of wearables and the smart AI tech features in it.” And that constellation is only getting brighter.