2025 is the year when everyone is trying to make us wear glasses. Not sunglasses: smart ones, with a screen, with assistants, with AI inside. There's Meta with its Ray-Bans running at half capacity. There's Xreal (formerly Nreal), which has teamed up with Google to develop AR experiences on Android XR. And then there's the ghost of Samsung, which according to rumors is ready to enter the field at any moment. In the midst of all this excitement, Snap announced (with great emphasis and very few details) that its next AR Specs, "light" and "immersive," will arrive in 2026. The announcement came at AWE 2025, the international trade fair where everything, from the badges to the air conditioning, seems to mean "augmented reality."
According to Snap, it will be a revolution in the way we experience technology: a new type of personal computer that doesn't sit on the desk or even in your pocket, but on your nose.
Featherweights, heavy promises
According to CEO Evan Spiegel, the new Specs will be the turning point. It is not yet known how much they weigh or what shape they have, whether they will look more like glasses or an escape-room visor, but one thing is certain: for Snap, they represent the future of human computing. Thin. Wearable. Connected. And, hopefully, more elegant than the current version, which seems designed more for Halloween than for a happy hour in the park.
In addition to being lightweight, they will also be intelligent. Integration with Google's Gemini and OpenAI's ChatGPT promises "multimodal experiences." In short: you look at something, you describe it, and they respond. Hopefully, correctly.

Snap, glasses that talk. Now we have to see if they listen too.
Snap is no stranger to ambition. Spiegel said in 2019 that smart glasses were “a decade away from mainstream.” Here we are in 2025, and something has changed. But not enough.
The competition is already up and running. Meta is working on a new generation of Ray-Bans with an integrated display to show notifications, directions, and messages in real time. In parallel, it is pushing ahead with Project Orion, a device halfway between a headset and glasses that combines the power of VR hardware with the portability of sunglasses.
Google is also moving: its partnership with Xreal could bring Android XR glasses to market within the next two years. Prototypes are in testing, and the market is getting ready.
“What are those?” — “They are AI”
Snap OS, the operating system that will power the new Specs, will pair cameras with generative AI. The idea is to replicate (and perhaps surpass) what Meta is already attempting: augmented reality capable of reacting to what you see. Looking at a pair of shoes and asking "What brand are these?" is one of the most banal examples, but also one of the most requested.
The promise is a visual assistant that interprets the environment, responds, suggests, and sometimes translates. And maybe, if there's time, even tells you where to find a bathroom. The problem is that these experiences have so far oscillated between brilliant and frustrating. If Snap can really find the balance, it could take home a big slice of the market.
Snap, the future is light. But it must bear the weight of reality.
Snap believes in it. Spiegel has been saying so for years. Users, at least the more tech-savvy ones, hope so. But so far, real AR glasses (thin, comfortable, wearable every day) remain a mythological creature. Even the most advanced prototypes always seem on the verge of arriving. Then they arrive… halfway.
2026 could be the year. But Snap will have to prove that its Specs are light in more than just the marketing copy. Because the market is moving. And if you show up late with your glasses, you risk finding the future already sitting at the table.