7 min read

Meta Connect 2025 promised a big reveal for Meta’s new smart glasses, including an upgraded Ray-Ban Meta, the wristband-controlled Ray-Ban Display, and the Oakley Meta Vanguard.
But the live demos didn’t go as planned. On stage, cooking influencer Jack Mancuso asked the glasses a simple recipe question. The AI skipped ahead, claiming base ingredients were already combined, and the demo stalled.
Even a WhatsApp video call between CTO Andrew Bosworth and CEO Mark Zuckerberg was disrupted; a bug prevented the notification from appearing, so the call couldn’t be accepted on the glasses.

Andrew Bosworth later explained on Instagram that the issue wasn’t Wi-Fi at all. When Mancuso triggered “Live AI,” it started the feature on every single Ray-Ban Meta in the building, not just the demo unit. Meta’s development server, only set up to handle a small demo load, was overwhelmed.
Bosworth called it a self-inflicted DDoS, meaning the server was flooded with too much traffic. It wasn’t a network issue; it was a planning mistake. This technical misstep highlights how scaling demos in real environments can expose hidden weaknesses.
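The failure mode is essentially capacity arithmetic. As a rough illustration (the numbers below are invented, not Meta's actual figures), a minimal sketch of a server provisioned for a handful of demo units suddenly receiving requests from every device in the building might look like this:

```python
# Hypothetical sketch of a "self-inflicted DDoS": a development server sized
# for a small demo receives a Live AI request from every pair of glasses in
# range at once. All numbers are illustrative, not Meta's real capacity.

def server_overloaded(active_devices: int, capacity: int) -> bool:
    """Return True if concurrent requests exceed what the server can handle."""
    return active_devices > capacity

# Rehearsal: a few units on stage, well within capacity.
print(server_overloaded(active_devices=5, capacity=20))    # False

# Live event: the wake phrase triggers every device in the building.
print(server_overloaded(active_devices=300, capacity=20))  # True
```

The point is that nothing in the server itself broke; the request count simply crossed a threshold that rehearsals never approached.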

The failed WhatsApp video call had a different cause. The glasses went to sleep at the exact moment the call came in. When Zuckerberg woke the display, the notification didn’t appear. Bosworth described it as a “race condition” bug, a timing issue that caused unpredictable results.
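To see why timing alone can lose a notification, here is a toy model (not Meta's actual code, and deliberately simplified) of the kind of race Bosworth described: a notification that arrives while the display is asleep is silently dropped, and the usual fix is to queue it and replay it on wake.

```python
# Toy model of a missed-notification race: if the display is asleep when a
# call comes in, the buggy path drops it; the fixed path queues it instead.
# Class and method names are invented for illustration.

class GlassesDisplay:
    def __init__(self):
        self.awake = False
        self.pending = []   # notifications queued while asleep (the "fix")
        self.shown = []     # notifications actually displayed

    def notify(self, message, buggy=True):
        if self.awake:
            self.shown.append(message)
        elif not buggy:
            self.pending.append(message)  # fixed behavior: queue for later
        # buggy behavior: the notification is simply lost

    def wake(self):
        self.awake = True
        self.shown.extend(self.pending)   # replay anything queued while asleep
        self.pending.clear()

# Buggy timing: the call arrives in the instant before the display wakes.
d = GlassesDisplay()
d.notify("Incoming WhatsApp call")
d.wake()
print(d.shown)   # [] -- the call was never surfaced

# With queuing, the same unlucky timing still surfaces the call.
d2 = GlassesDisplay()
d2.notify("Incoming WhatsApp call", buggy=False)
d2.wake()
print(d2.shown)  # ['Incoming WhatsApp call']
```

The bug only bites when the notification and the sleep transition line up within a tiny window, which is exactly why it can survive rehearsals and then appear on stage.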
Although the bug had never surfaced before, Bosworth said it has since been fixed. Meta knows its glasses can handle video calls; the problem was the live demo's timing, showing that even small software quirks can be amplified under pressure.

Live product demos are notoriously difficult. Bosworth noted that rehearsals don’t always match real-life conditions, especially with many devices running simultaneously. A system that works perfectly in practice can fail spectacularly on stage.
Meta’s example shows that tech companies must carefully plan demos, including server load, device count, and network routing, to avoid public mishaps.

One key error came from triggering all the Ray-Ban Meta glasses at once. Because the AI command wasn't isolated to the demo unit, every pair of glasses in the building hit Meta's server simultaneously, flooding it and causing the AI to misbehave.
It’s a rare situation, but it illustrates how scaling software for live events requires precise resource management. Every device counts, and unexpected interactions can break even well-tested systems.

Bosworth described the situation as a self-inflicted DDoS attack. The development server couldn’t handle the traffic from all the building’s glasses. Meta had planned for only a small number of devices, and the overload caused the demo failures.
This incident is a reminder that even controlled environments can be unpredictable. Planning for contingencies is crucial when demonstrating cutting-edge tech in front of live audiences.

In rehearsals, fewer devices were active, so the issue didn’t appear. Only during the live event, with every device in use, did the overload happen. Bosworth emphasized that unexpected conditions reveal weaknesses that rehearsals can’t always simulate.
Tech companies face a tough balance: rehearsed perfection versus real-world demonstration. Meta’s failure shows that even top engineers can’t anticipate every variable.

The glitches were embarrassing, but they also gave the demo an unintended kind of honesty. Instead of a carefully edited highlight reel, the audience saw the product as it actually behaved onstage.
That rawness wasn’t by design, yet it offered a clearer picture of both the glasses’ potential and their current limits.

The WhatsApp bug highlights how small software issues can ruin a demo. A single race condition in timing caused a failure at a critical moment. Bosworth admitted it was a “terrible place” for a bug to show up.
The bug didn’t appear in rehearsal since fewer glasses were active during testing. Bosworth said this race-condition issue was fixed after the event.

Bosworth stressed that the failures didn’t mean the product itself was flawed. The glasses work as intended; the problem was the demo setup. Meta wants audiences to understand that glitches on stage don’t equal poor technology.
It’s an important distinction. Public perception can confuse demo mishaps with product reliability, but behind-the-scenes fixes keep the tech solid.

The way Meta routed Live AI traffic contributed to the demo failure. Instead of isolating the demo glasses, the traffic flowed through the building’s access points, affecting all devices. The result: server overload and malfunctioning AI.
This shows the importance of careful network architecture for connected devices, especially in high-profile live events.
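One common way to prevent this class of failure, sketched below with invented device IDs and server names, is to route requests from an allowlist of demo units to the demo server and send everything else elsewhere. This is a hypothetical mitigation, not a description of Meta's actual setup:

```python
# Hypothetical isolation sketch: only allowlisted demo units may direct
# Live AI traffic at the demo server; all other devices are routed to
# production. Device IDs and server names are invented for illustration.

DEMO_ALLOWLIST = {"demo-unit-1", "demo-unit-2"}

def route_live_ai(device_id: str) -> str:
    """Pick a backend for a Live AI request based on the requesting device."""
    return "demo-server" if device_id in DEMO_ALLOWLIST else "production"

print(route_live_ai("demo-unit-1"))           # demo-server
print(route_live_ai("audience-glasses-042"))  # production
```

With routing like this, an audience full of glasses responding to the wake phrase never touches the under-provisioned demo backend.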

Meta’s experience highlights the challenge of scaling tech for live audiences. Hundreds of devices interacting with a single server can create unintended consequences. Even a minor miscalculation can cascade into a visible failure.
For developers, this is a lesson in testing at scale, considering how multiple devices, network load, and timing interact in complex systems.

Smart glasses pack in many advanced features that make them harder to perfect. Any misstep in software, routing, or timing can produce visible failures.
Meta’s live demo issues demonstrate the challenge of coordinating multiple systems in real time, especially under the pressure of a live audience.
Because they’re not yet as common as phones, even small bugs or setup mistakes stand out more during public demos, making each glitch feel bigger than it really is.

Meta has since addressed both the server overload and the WhatsApp race condition. Bosworth emphasized that the failures came down to how traffic was routed and how devices interacted at scale, not flaws in the glasses themselves, and the company says the lessons in live event scaling and device management will shape future demos.
Curious how AI is changing the way we wear glasses? Check out how Meta is making smart glasses even smarter with AI.

Meta’s smart glasses have the tech and features they promise. The demo failures were setup and timing issues, not flaws in the product itself.
Live demos are unpredictable, and what works in rehearsal can still fail on stage. Bosworth explained that this is exactly what happened with Meta’s smart glasses event.
Will Apple really outpace Meta in the smart glasses race? See why Apple’s glasses may beat Meta’s Ray-Ban style to market in 2027.
Do you think these glitches will slow adoption, or are they just part of testing new tech? Share your thoughts in the comments.
This slideshow was made with AI assistance and human editing.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.
