
Meta recently held its Q1 2025 earnings call, revealing key insights into its operations as the firm pushes deeper into AI and AR innovation. Meanwhile, Google is emerging as a wildcard XR contender with its Android XR framework, suggesting the Google Glass promise is returning in a new form.
With other technology leaders vying for the AR smart glasses market, long-standing mainstream stalwart Meta seems keen to retain its grip on XR. The firm is particularly invested in AR smart glasses, a space in which it is finding great success thanks to its Ray-Ban Meta partnership with EssilorLuxottica.
In the recent earnings call, the importance of Meta’s smart glasses mission was on full display, with CEO Mark Zuckerberg noting:
Ray-Ban Meta AI glasses have tripled in sales in the last year and people who have them are using them a lot. We’ve got some exciting new launches with our partner EssilorLuxottica later this year as well that should expand that category and add some new technological capabilities to the glasses.
However, despite this success and Meta’s optimism, CFO Susan Li noted that within the Reality Labs segment, “Q1 revenue was $412 million, down 6% year-over-year due to lower Meta Quest sales, which were partially offset by increased sales of Ray-Ban Meta AI glasses.”
Li also noted that Reality Labs expenses were $4.6 billion, “up 8% year-over-year, driven primarily by higher employee compensation.” Meanwhile, Reality Labs’ operating loss came in at $4.2 billion; the segment has historically lost money, but this appears to be a long-term investment in the future of computing and Meta’s innovation goals.
Li added:
Moving to Reality Labs. We’re seeing very strong traction with Ray-Ban Meta AI glasses, with over 4x as many monthly actives as a year ago, and the number of people using voice commands is growing even faster as people use it to answer questions and control their glasses. This month, we fully rolled out live translations on Ray-Ban Meta AI glasses to all markets for English, French, Italian, and Spanish. Now, when you are speaking to someone in one of these languages, you’ll hear what they say in your preferred language through the glasses in real time.
Meta, Smart Glasses, and Moving the XR Needle
Ray-Ban Meta smart glasses recently received a big, long-awaited AI upgrade in the UK, and the device continues to prove popular as the market matures. Meanwhile, at Meta Connect 2024, Zuckerberg unveiled a prototype of the much-hyped Orion device, a pair of experimental AR smart glasses.
During the event, Zuckerberg called the device “the most advanced glasses in the world and our first prototype of full holographic AR.” He also unveiled a significant breakthrough: neural interface interaction. Via a wrist-worn device that picks up the electrical signals the brain sends to the hand, Orion lets users control the glasses with subtle gestures.
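Meta has not published how the wristband interprets those signals, but the general shape of a surface-EMG interface is well documented in the research literature. The Python sketch below is purely illustrative, not Meta’s implementation: it windows a raw one-channel muscle signal, extracts classic time-domain features, and maps them to a small gesture set. Every name, rate, and parameter here is hypothetical.

```python
import numpy as np

# Purely illustrative: Meta has not disclosed Orion's wristband pipeline.
# A surface-EMG interface typically windows the raw muscle signal,
# extracts simple time-domain features, and feeds them to a classifier
# trained to recognise a small set of hand gestures.

WINDOW = 200                               # samples per window (hypothetical)
GESTURES = ["pinch", "swipe", "rest"]      # hypothetical command set

def features(window: np.ndarray) -> np.ndarray:
    """Classic time-domain EMG features for a one-channel window."""
    mav = np.mean(np.abs(window))               # mean absolute value
    rms = np.sqrt(np.mean(window ** 2))         # root mean square
    zc = np.sum(np.diff(np.sign(window)) != 0)  # zero-crossing count
    return np.array([mav, rms, zc])

def classify(window: np.ndarray, weights: np.ndarray) -> str:
    """Linear scorer standing in for a trained gesture model."""
    scores = weights @ features(window)
    return GESTURES[int(np.argmax(scores))]

# Demo: score a synthetic burst of activity with random (untrained) weights.
rng = np.random.default_rng(0)
emg = rng.normal(0.0, 1.0, WINDOW)          # stand-in for raw EMG samples
weights = rng.normal(size=(len(GESTURES), 3))
print(classify(emg, weights))               # prints one of GESTURES
```

In a real system, the weights would come from a model trained on labelled recordings, and the gesture output would feed the glasses’ input layer; the point here is only the window-features-classifier structure.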
Following the reveal, CTO Andrew Bosworth gave an interview explaining Orion’s journey; he stated:
We just revealed Orion, our AR glass prototype. Building full AR glasses like these with a wide field of view display, wireless AI built-in, running all the core experiences you’d expect, has been something we’ve been working on for nearly a decade now. When we first started this project, teams crunched the numbers, and they thought our chances of building one of these were 10%. So we overcame a lot of odds to build these glasses.
This device is still years away. However, integrating Orion research into Ray-Ban smart glasses and mixed reality enterprise solutions is a significant area of interest for avid XR followers.
In March, two more Meta leaders discussed the future of AR smart glasses, a field in which Meta aims to take the lead with its current products and prototypes. In their media appearances, Alex Himel, Vice President of Wearables, and Mark Rabkin, Vice President of VR, both emphasised the significance of the technology.
Rabkin explained that Meta’s line-up runs from AR Ray-Ban smart glasses to MR Quest headsets: a “whole spectrum of devices to help people augment their life and improve their life in the best way possible, given what they need.”
Himel added:
The thing about AI right now is that it feels like something that is infinitely capable, but people often don’t know where to start – it’s an empty text box or a microphone you can say anything into, and it’s like, ‘where do I start?’ The announcements we had last year are really meant to take specific use cases and make them work great – for example, you can look at a phone number and say, ‘call it.’ AI is probably the thing I’m most excited about; it has the ability to open up a session with AI where you’re constantly chatting back and forth and it’s constantly aware of what you’re looking at. I feel like the potential of AI is pretty huge, and we’re excited to really lean into it.
Himel continued: “When we talk about augmented reality, we talk about looking at the physical world and seeing digital stuff on top of that – whether that’s a digital television instead of having to buy a real one, or a virtual picture frame on your desk – and we showed that it was possible, which I think is a major milestone.”
There will be a lot of change in the XR space in the run-up to 2026, and with it, perhaps, a shift in industry leaders. As smart glasses rise, so does a new race to the head of the XR pack.