The Next Computer? Your Glasses!

Published on 18 April 2025 at 09:53

You're in a negotiation with your boss — maybe it’s about your salary. You’ve come prepared. You know your worth, your contributions, and what the market pays. But instead of a straight answer, the usual excuses arrive: “We’ve had a tough quarter,” “The budget’s tight,” or “Let’s revisit this after summer.” Still, something doesn’t sit right.

Right then, discreetly, your glasses flash a notification. Your AI assistant has just analyzed the financial report and quietly whispers: “Company surplus reported in the latest annual report.”
You smile, raise an eyebrow, and calmly ask: “Could you elaborate a bit on that, preferably with specific numbers?”

You’re no longer alone in the room. Welcome to the future. Welcome to the age of AI glasses.

Glasses That See More Than You Think

In his recent TED Talk, “The Next Computer? Your Glasses!”, Shahram Izadi shares a vision of smart glasses as our next big interface — not just to see emails or messages, but to truly understand and interact with the world around us. These aren’t devices to scroll news feeds — they’re intelligent companions, analyzing, guiding, and reacting in real time.

This is no longer science fiction. It’s already here — and evolving rapidly.

Meetings Where AI Listens

Picture a business meeting. Four people at a table, a PowerPoint on the screen, and someone dominating the conversation. But their claims sound... shaky. With AI-powered glasses, your invisible co-pilot listens too.
In real time, it checks facts, compares sources, and sends you subtle cues:

  • “Claim unsupported by recent market data.”

  • “Suggest referencing Q4 trends.”

You become the best-informed person in the room — without saying a word.

Digital Intuition

Imagine an AI that doesn’t just understand what is said — but how it's said. Your boss tells you, “We really value your work.” But your glasses pick up that the tone, facial expressions, and micro-behaviors don’t match the words.
A discreet message appears: “Emotional mismatch detected. Possible insincerity.”

This is digital intuition — a capability humans often rely on unconsciously, and one that AI is beginning to measure and make visible.

It doesn’t replace your gut feeling. It amplifies it.

Salary Negotiation 2.0

Back in that salary negotiation, your AI assistant doesn’t just listen — it strategizes.

It might suggest:

  • “Redirect the conversation toward future responsibilities to pivot from current budget constraints.”

  • “Mention performance benchmarks exceeded in last quarter.”

  • “Pull up salary comparison from industry database.”

You’re now negotiating with a silent but brilliant partner at your side — one who never forgets, and never flinches.

How Do These Glasses Actually Work?

Today’s AI glasses are lightweight (some models weigh under 40 grams), controlled by voice commands, and equipped with built-in displays, microphones, cameras, and real-time access to cloud-based AI. Examples include:

  • Ray-Ban Meta Glasses – camera and AI-powered insights.

  • XREAL Air – a large virtual display projected in front of your eyes.

  • Halliday Glasses – semantic and situational understanding.

They offer live translation, whisper coaching, environment scanning, and smart recommendations — all while keeping you in control.
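
To make that architecture concrete, here is a minimal conceptual sketch in Python of the loop such glasses might run: capture a short audio chunk on the device, hand it to a cloud-based AI for the heavy analysis, and show only a brief cue on the built-in display. Every name below (capture_audio, ask_cloud_assistant, show_cue) is a hypothetical placeholder, not any vendor’s actual API.

```python
# Conceptual sketch of the loop described above: listen on the glasses,
# send the audio to a cloud AI, and show a short cue on the display.
# All function names are hypothetical placeholders, not a real vendor API.

from typing import Optional


def capture_audio(seconds: float) -> bytes:
    """Stand-in for the glasses' microphones: record a short audio chunk."""
    return b"\x00" * int(16000 * seconds)  # fake silence instead of real audio


def ask_cloud_assistant(audio: bytes) -> Optional[str]:
    """Stand-in for the cloud AI: transcribe, fact-check, summarize.

    Returns a short cue, or None when there is nothing worth showing.
    """
    if not audio:
        return None
    return "Claim unsupported by recent market data."  # canned example cue


def show_cue(text: str) -> None:
    """Stand-in for the built-in display: show a brief, discreet cue."""
    print(f"[HUD] {text}")


def assistant_loop(turns: int = 3) -> None:
    """Run a few listen -> analyze -> display cycles."""
    for _ in range(turns):
        audio = capture_audio(seconds=5.0)  # 1. capture happens on the frames
        cue = ask_cloud_assistant(audio)    # 2. heavy lifting happens in the cloud
        if cue:                             # 3. interrupt only when useful
            show_cue(cue)


if __name__ == "__main__":
    assistant_loop()
```

The design point this sketch illustrates is the division of labour that keeps the frames light: the glasses only capture and display, while the demanding AI work happens in the cloud.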

When Someone “Mansplains” or “Guessplains”

Imagine someone saying: “Everybody knows Gen Z doesn’t want to work full time.”
Your AI catches it instantly: “Statistical data suggests otherwise. SCB (Statistics Sweden) report shows a majority seeking full-time roles. Shall I display the graph?”

You no longer need to challenge the loudest voice in the room alone. You have backup — instant, calm, factual backup.

Ethical Boundaries

Of course, this brave new world comes with big ethical questions:

  • Can you record or analyze people without consent?

  • How will these tools affect workplace trust?

  • What happens when everyone’s wearing invisible truth detectors?

There will need to be clear policies, laws, and cultural norms. But make no mistake — this technology is here to stay. And it will be a competitive advantage for those who use it with integrity.

AI as an Equalizer

One unexpected potential: AI glasses could be a huge step toward equality.

People who don’t speak up as easily in meetings, who are often interrupted, or who struggle with confidence now have an assistant who ensures their ideas are backed by evidence and delivered strategically.

AI doesn’t just make loud voices louder — it gives quiet brilliance a megaphone.

The Co-Pilot for Life

Smart glasses aren’t just for boardrooms. They’re becoming:

  • Career mentors – reminding you of goals, giving strategic prompts.

  • Social coaches – sensing mood shifts, offering empathy cues.

  • Well-being allies – recognizing stress and suggesting breathing exercises.

  • Learning companions – translating, defining, illustrating instantly.

  • Bridge-builders – translating languages and decoding cultures.

This is no longer a screen. It’s a shared mind — yours, enhanced.

From Tech to Trust

But here’s the catch: Will we trust something so close to our thinking?

We must. But wisely.

These tools should amplify our best selves — not replace them. Our human intuition, compassion, and ethics must stay at the core.

The real power of AI glasses lies not in what they see, but in how they help us see more clearly — including ourselves.

Conclusion: The Glasses See – But You Choose

Next time you’re facing a difficult conversation, a complex meeting, or a personal decision — imagine not being alone.
Imagine a silent partner who sees through the fog, offers calm, neutral facts, and helps you breathe through the moment.

You’re still the pilot. But now you have a co-pilot who’s brilliant, tireless, and always has your back.

Maybe the greatest gift of these glasses isn’t what they show — but the confidence they give you to say what you’ve been thinking all along.

 

By Chris...


The next computer? Your glasses

Picture this: you’re wearing a normal-looking pair of glasses, but they give you the ability to quickly summarize a book, translate between languages or remember where you left your keys. In a live demo of unreleased technology, computer scientist Shahram Izadi unveils Google’s new Android XR platform, which aims to give users the power of AI via smart glasses and headsets. He’s joined onstage by two colleagues, giving a glimpse of the future of “extended reality” (XR) devices — smart, seamless and right before your eyes.

 

Link: TED Talk – Shahram Izadi

