Smartphones have revolutionized our existence in less than two decades, transforming us from simple phone users into always-connected digital cyborgs. And what's next? The next big revolution is already on the horizon: AI glasses, devices worn on the nose and powered by artificial intelligence.
Will this really be the future? And if it does come, what will it look like? We have tried to imagine it, and it is not just fantasy.

Technological Convergence: When AI Meets Augmented Reality
The idea of smart glasses is not new, but the integration of generative AI is a complete game changer. While early attempts (such as the ill-fated Google Glass) ran into technological limitations and social resistance, today we are witnessing the convergence of several factors that could finally make this vision a reality.
The miniaturization of processors, the evolution of batteries, advances in digital optics and, above all, the explosion of conversational artificial intelligence are creating the perfect conditions for a revolution that could be even more disruptive than that of smartphones.
The players

Mark Zuckerberg of Meta recently stated that AI glasses could replace smartphones by 2035, describing this as a process that “will take at least a decade”. Ray-Ban Meta Smart Glasses already integrate live AI features, real-time translation and music recognition via Shazam, demonstrating that the technology is maturing.

Samsung and Google have announced a partnership to develop AR glasses based on Android XR, aiming to compete directly with Meta's future Orion. Apple, after the expensive Vision Pro, is developing more affordable smart glasses to launch by the end of 2025, while innovative startups like Brilliant Labs have already launched the Frame AI glasses at $350.
Use Cases: A Day with AI Glasses
The morning: The intelligent awakening

It's 2037. Good morning! You wake up and, putting on your AI glasses, you immediately see the day's appointments, the weather, and the most relevant news, personalized to your interests, superimposed on your field of vision. The AI analyzes your tone of voice and instantly suggests an energizing or relaxing playlist depending on your mood.
AI Glasses at Work: Amplified Productivity

During meetings, AI glasses translate conversations in real time, provide contextual information about participants, take notes automatically, and even suggest relevant questions based on analysis of colleagues’ body language and tone of voice.
For technical professionals, they make it possible to see schematics, data, and documents right in the field of view while working hands-free, a huge leap in productivity. A surgeon can access vital patient data; a mechanic can see manuals and repair tips overlaid right on the engine being repaired.
In social life: enhanced connections

AI glasses could revolutionize social interactions. When meeting someone, you could discreetly see information about your mutual connections, shared interests, or even simple reminders of previous conversations. For those with social anxiety, or who have trouble remembering names and faces, this could be a liberating technology.
As we anticipated in our hypothetical “smart supermarket” scenario, imagine entering a store where a robot immediately recognizes you thanks to AI glasses, providing you with a personalized list based on your previous purchases, while your autonomous cart follows you suggesting complementary products.
In learning: immersive education

Walking through Rome, you might get detailed historical explanations of every monument you look at. Studying a foreign language, every object in your field of vision might show the translation. Children might learn math by seeing problems materialize in the air before them.
In shopping: “Ingenious” consumption

When you walk into a store, the glasses could automatically compare prices online, show product reviews, check compatibility with your existing devices, or suggest more sustainable alternatives. You could also virtually “try on” clothes by looking in the mirror, or see how a piece of furniture would look in your home through augmented reality.
Career Transformations: New Work Horizons in the Age of AI Glasses
Medicine and health
In the medical field, AI glasses could bring about an unprecedented revolution. Doctors could diagnose diseases by instantly analyzing visual symptoms, access real-time medical databases during visits, or guide surgeries with millimeter precision through augmented reality overlays.
For patients, the glasses could constantly monitor vital signs, remind them to take medications, and provide immediate assistance in medical emergencies by automatically alerting emergency services.
A concrete example is ENVision Glasses, which use AI to help blind and visually impaired people “see” the world through detailed voice descriptions, object recognition, text reading, and recognition of familiar faces.
Education and training
Education could also be completely transformed. Students could explore ancient Pompeii by walking the streets of today, see chemical molecules dance before their eyes, or receive personalized AI tutoring while studying. Professional training could become fully immersive, allowing complex procedures to be practiced in simulated but realistic environments.
Industry and manufacturing
In factories, workers could receive step-by-step instructions overlaid directly on machines, view real-time performance data, and receive immediate alerts about potential hazards. Predictive maintenance would become more precise as AI constantly analyzes the health of machines through embedded sensors.
Art and creativity
Architects and designers could visualize their designs in full scale in physical space, modify them in real time, and collaborate with remote colleagues as if they were in the same room. Artists could paint in the air, sculptors could model virtual forms that are then materialized by 3D printers.
Social Impacts: The Lights and Shadows of the AI Glasses “Revolution”
Democratization of information and “new” human relations
AI glasses could democratize access to information in unimaginable ways. People with visual impairments could “see” through AI-generated audio descriptions, while those with hearing impairments could read real-time transcripts of every conversation.
Education could become accessible everywhere: a child in a rural area could receive the same quality of education as one in a metropolis, simply by looking at the world through intelligent lenses.
However, this technology raises profound questions about human relationships. If we can instantly access information about everyone we meet, what happens to spontaneous discovery and mystery in relationships? The risk is that we live in a bubble of AI-filtered information, losing the ability to form independent judgments about people.
Privacy and Surveillance: The Price of Connection
The issue of privacy becomes even more critical. Always-on AI glasses mean always-on cameras, always-listening microphones, and always-analyzing algorithms. Who controls this data? How is it used? The risk of unprecedented mass surveillance is real.
Furthermore, the ability to secretly record conversations and private scenes could create enormous social and legal tensions. We could see the emergence of “technology-free” spaces where smart glasses are banned.
Technical Challenges: The Obstacles AI Glasses Must Overcome
Autonomy and power supply
One of the biggest hurdles is battery life. Smartphones barely last a full day, and AI glasses would need to be even more energy efficient, given their small size and need to be always on.
Solutions could include solar cells integrated into lenses, continuous wireless charging, or innovative batteries based on new materials. Some companies are experimenting with kinetic charging, which uses head movements to generate power.
Data Processing and Latency
To function effectively, AI glasses must process massive amounts of visual and audio data in real time. This requires either very powerful processors in miniature spaces, or ultra-fast connections to the cloud for remote processing.
5G and, in the future, 6G will be crucial, but even with fast connections, latency could be problematic for applications that require instantaneous responses.
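
As a rough illustration of why this matters, here is a minimal sketch of the kind of latency-budget check an AR pipeline might perform when deciding whether to analyze a frame on-device or in the cloud. All numbers and function names are illustrative assumptions, not measurements or APIs from any real product:

```python
# Hypothetical latency-budget sketch: every figure below is an illustrative
# assumption, not a measurement from any real device or network.

FRAME_BUDGET_MS = 16.7          # a 60 fps AR overlay must refresh every ~16.7 ms
ON_DEVICE_INFERENCE_MS = 12.0   # assumed small, power-constrained on-device model
CLOUD_ROUND_TRIP_MS = 35.0      # assumed 5G round trip to an edge server
CLOUD_INFERENCE_MS = 8.0        # assumed larger, more accurate cloud model


def choose_processing_path(frame_budget_ms: float = FRAME_BUDGET_MS) -> str:
    """Pick a processing path based on a simple per-frame latency budget."""
    cloud_total = CLOUD_ROUND_TRIP_MS + CLOUD_INFERENCE_MS
    if ON_DEVICE_INFERENCE_MS <= frame_budget_ms:
        return "on-device"   # fits the frame budget: the overlay stays in sync
    if cloud_total <= frame_budget_ms:
        return "cloud"       # network is fast enough for this frame rate
    return "degrade"         # neither fits: lower the frame rate or skip frames


if __name__ == "__main__":
    print(choose_processing_path())  # -> "on-device" with the numbers assumed above
```

With the assumed numbers, even a fast 5G round trip blows past a 60 fps frame budget, which is why time-critical overlays are expected to lean on on-device processing while heavier analysis is offloaded.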
User Interface and Control
How do you control a device that doesn’t have a keyboard or touchscreen? Methods could include voice commands, hand gestures, eye movements, or even neural interfaces. Each solution has advantages and disadvantages in terms of convenience, accuracy, and social acceptance. We’ll see in the coming years.
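
Purely to make the multimodal-control problem concrete, here is a small sketch of an input dispatcher that maps voice, gesture, and gaze events to actions. The event names, bindings, and handlers are hypothetical, not any vendor's actual API:

```python
# Hypothetical multimodal input dispatcher for AI glasses.
# Event names, bindings, and handlers are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, Tuple


@dataclass
class InputEvent:
    modality: str   # "voice", "gesture", or "gaze"
    payload: str    # e.g. a recognized phrase or gesture label


def open_notes(_: InputEvent) -> None:
    print("Opening notes overlay")


def dismiss_overlay(_: InputEvent) -> None:
    print("Dismissing overlay")


# Map (modality, payload) pairs to actions; a real system would also weigh
# recognition confidence and social context (is speaking aloud appropriate here?).
BINDINGS: Dict[Tuple[str, str], Callable[[InputEvent], None]] = {
    ("voice", "take a note"): open_notes,
    ("gesture", "pinch"): open_notes,
    ("gaze", "look away 2s"): dismiss_overlay,
}


def dispatch(event: InputEvent) -> None:
    """Route a recognized input event to its bound action, if any."""
    action = BINDINGS.get((event.modality, event.payload))
    if action:
        action(event)


if __name__ == "__main__":
    dispatch(InputEvent("gesture", "pinch"))  # -> "Opening notes overlay"
```

The design question the sketch hints at is exactly the open one: which bindings feel natural, reliable, and socially acceptable enough to become the default.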

Economic Implications of AI Glasses: A New Digital Economy
The destruction of the smartphone market
If AI glasses were to completely replace smartphones, we would witness the greatest technological disruption since the birth of the Internet. Meta is explicitly aiming for this goal in order to free itself from dependence on third-party platforms, such as app stores that impose 30% commissions on app revenue.
The strategy also reflects hard economic data: during Meta's latest earnings call, Zuckerberg announced quarterly revenue of $48 billion, with the Reality Labs division generating $1 billion thanks to the success of the Meta Quest 3S and Ray-Ban Meta Smart Glasses, while still requiring massive investments.
New business models
AI glasses could create entirely new economies. Imagine stores that pay to appear in your visual searches, personalized AI tour guides, or “memory enhancement” services that help you remember every detail of your life (literally recalling it before your eyes).
Advertising could become even more intrusive and personalized, but also more useful and contextual. Instead of banners on websites, you could see offers overlaid directly on the products you’re looking at in the real world.
Many jobs could be radically transformed or eliminated. Tour guides could be replaced by AI, but new professions could emerge such as “experience designers” who create content for augmented reality, or “AI trainers” who specialize in teaching artificial intelligences how to interact naturally with humans.
Ethical Considerations: Navigating the Future Responsibly
Digital Equity
As with any new technology, there is a risk that AI glasses will amplify existing inequalities. Those who cannot afford them may find themselves disadvantaged in a world where access to real-time information becomes essential for social and economic participation.
It will be crucial to develop accessible versions of this technology and ensure that the benefits are distributed equitably across society.
The authenticity of the experience
In a world where AI can filter, modify, and interpret everything we see, what remains authentic? Are we at risk of losing touch with “unmediated” reality, constantly living in an edited version of the world?
It may be necessary (and I bet it will happen) to develop a new form of “digital hygiene” that includes moments of voluntary disconnection from AI “filters.”
Manipulation and control
AI glasses could become incredibly powerful tools of manipulation. Authoritarian governments could use them to control the information citizens receive, while companies could manipulate perceptions to influence purchasing behavior.
Regulating this technology will be critical, but also incredibly complex given the global and rapidly evolving nature of the technology sector.
The Time Factor of AI Glasses: When Will It Become Reality?
Experts' predictions
The signs are increasingly concrete. Meta has set out a precise roadmap: a first generation in 2024 (with the first Ray-Ban Meta), a second version in 2026, and a third in 2028 that should mark the technology's definitive breakthrough. The internal project “Nazare” envisions standalone AR glasses with a complete augmented reality experience, 3D graphics, a 70-degree field of view and a weight of about 100 grams.
Samsung and Google have announced that their AR glasses, based on Android XR, could arrive in the second half of 2025. Apple is working on prototypes that should be ready by the end of 2025, with commercial launch planned for 2026.
Concrete projects under development
Meta Orion: Meta's most advanced AR glasses, described by CTO Andrew Bosworth as “the most advanced piece of technology on the planet in its domain.” They will enable holographic calls and fully immersive interactions.
Project Moohan: Samsung's headset developed with Google that acts as a bridge to future AR glasses, combining features of the Meta Quest 3 and Vision Pro.
Android XR: Google's platform designed specifically for headsets and glasses, which will support Gemini Assistant for natural conversations and device control.
Qualcomm: The chipmaker is collaborating with Samsung and Google to create “companion” glasses for smartphones, focusing on a design indistinguishable from normal sunglasses.
But will it really happen? The harbingers of the era of AI glasses
As mentioned, there are signs everywhere. To expand on those already cited: Meta has invested over a billion dollars in the development of AR glasses, and its Ray-Ban Meta Smart Glasses already show surprising conversational AI capabilities, real-time visual analysis and integration with services like Shazam. Apple is developing prototypes after the Vision Pro experience, Google unveiled Project Astra with AI glasses at Google I/O 2024, and startups like the aforementioned Brilliant Labs are democratizing access with affordable devices.
DARPA is working to miniaturize night vision goggles to the size of normal glasses, while companies like EyeJets promise to project information directly onto the retina, eliminating the need for external displays.
On the “weak signals” front, an interesting demographic factor could accelerate adoption: according to recent studies, approximately 3 billion people worldwide suffer from myopia, a number that could rise to 4 billion by 2035. Prolonged smartphone use is considered one of the causes, paradoxically creating a perfect market for smart glasses.
Setting the Stage: How to Adapt to the Changes AI Glasses Will Bring
For the “common people”
How can we prepare for this revolution? First, by familiarizing ourselves with current augmented reality and AI technologies. Experimenting with voice assistants, AR apps on smartphones, and AI tools can help us understand the potential and limitations of these technologies.
It is also important to develop critical thinking about the information we receive digitally, a skill that will become even more crucial in a world of constant augmented reality.
For companies
Companies should start thinking about how their products and services could be reimagined for a world of AI glasses. This could mean developing new user interfaces, rethinking marketing strategies, or creating entirely new business models.
For society
At a societal level, we need to start serious conversations about regulation, ethics, and fairness. These discussions need to involve not just technologists and policymakers, but society as a whole, because the decisions we make today will determine how this technology shapes our future.
AI Glasses, Towards an Augmented Future
Unless the next gadget from Sam Altman and Sir Jony Ive defines the standard and form factor for decades to come (a “necklace”? A pin?), AI-powered glasses are potentially the next great technological frontier, with the power to radically transform the way we interact with the world, work, learn, and connect with each other.
The concrete projects of Meta, Samsung-Google, Apple and other players in the sector demonstrate that these are no longer futuristic fantasies but ongoing technological developments, as I have told you in other articles, like this one.
The prophetic words of Isaac Asimov on the convergence between human and machine, recently recalled, seem to materialize before our eyes: “Robots will become organic while humans will transform into machines.” AI glasses could represent exactly this point of convergence.
Like any technological revolution, they will bring extraordinary benefits but also significant challenges. The key will be to develop this technology responsibly, with a careful eye on the ethical, social and economic implications.
The future could literally be before our eyes, filtered through intelligent lenses that understand the world better than we do ourselves. The question is not if it will happen, but when and how. And most importantly, how we can ensure that this revolution serves humanity as a whole without amplifying existing divisions.
We'll see. Literally.
AI Glasses, More Sources and Insights
Current projects and developments:
- Meta and the future of AI glasses – HDblog
- Samsung and Google Partnership for AR Glasses – Digitech News
- Apple's Smart Glasses Strategy – AI Magazine
- Android XR Platform – Google Blog
Technologies and innovations:
- Brilliant Labs Frame AI Glasses – HDblog
- ENVision Glasses for the blind – Vision Dept
- Qualcomm Partnership – CNBC
Future scenarios and analysis:
- 6 Future Scenarios of AI – Near Future
- Meta AR glasses roadmap – Near Future
- Asimov's Predictions – Near Future