7 min read

Imagine taking off your new high-tech glasses and setting them on the nightstand, completely forgetting the camera is still rolling. According to recent reports, that exact scenario has played out, with incredibly private family moments being captured on camera.
These aren’t just home videos stored on your phone. An investigation has revealed that footage from Meta Ray-Ban Smart Glasses, including people undressing or using the bathroom, has been viewed by contract workers overseas who are hired to label data and train artificial intelligence.

When we think about artificial intelligence learning, we picture computers doing all the work automatically. But the truth is, companies like Meta often rely on thousands of real people to teach their systems what they are looking at.
These workers, many based in Kenya and employed by a subcontractor named Sama, manually review videos from Meta Ray-Ban Smart Glasses. They draw boxes around objects and tag scenes to help the AI recognize things like a dog or a sunset. Unfortunately, this means they also end up watching moments that were never meant for public viewing.

The job of a data annotator sounds technical and boring, but the reality for those reviewing footage from Meta Ray-Ban Smart Glasses is often shocking. Workers have described seeing extremely personal content that ranges from financial information, like credit cards, to intimate moments between couples.
One worker told Swedish newspapers that they saw a man leave his glasses on a bedside table, only to have his wife enter and change clothes moments later. The annotators say they feel trapped, knowing they have to watch these private moments or risk losing their jobs if they refuse.

Most of us are guilty of scrolling past the terms of service agreement and hitting accept without a second thought. That is where Meta has buried the detail that humans might be watching your recordings.
The company’s AI terms state that it may review your interactions, manually or automatically, to improve its services. While the warning is technically there, critics argue that nobody expects manual review to mean a stranger in another country watching footage of their living room or bedroom.

You might be wondering if you can just turn off the sharing feature and keep using the glasses for fun filters and hands-free photos. Unfortunately, it is not that simple if you want to use the latest artificial intelligence tools.
To use the Live AI features that answer questions about what you are seeing, your glasses have to send that video to Meta’s servers. You cannot pick and choose which parts of your life get analyzed. If you want the cool tech, you have to accept that your surroundings might end up in front of a contractor’s screen.
Fun fact: Meta Ray-Ban glasses sales tripled in 2025, with more than 7 million units sold worldwide.

When asked about these troubling reports, Meta defended its practices by explaining that it has safety measures in place. The company says that before any human contractor sees footage, it is run through filters designed to blur faces and block identifying information.
However, sources familiar with the process told investigators that these filters do not always work. Difficult lighting or odd angles can mean that faces slip through the cracks, leaving the very people the filters were meant to protect fully visible to reviewers.

Imagine sitting in a cubicle and having to watch videos of strangers in their most vulnerable states, day after day. For the contractors in Nairobi, this is just another shift, and they have no choice but to power through the footage.
Workers reported that questioning the content or refusing to label a video is simply not allowed. “If you start asking questions, you are gone,” one employee told the press. They are expected to annotate everything, whether it is a peaceful living room scene or a highly intimate encounter.

The recent investigation has caught the attention of government officials who are supposed to protect citizens’ data. The UK’s Information Commissioner’s Office has stated that the claims are concerning and that they are officially writing to Meta for answers.
The watchdog wants to know exactly how the company is meeting its legal obligations to be transparent with users. They emphasized that tech companies must clearly explain what data is collected and how it is used, something critics say Meta has failed to do adequately.

It did not take long for the legal system to react to these privacy bombshells. A class-action lawsuit has already been filed in federal court in San Francisco, accusing Meta of false advertising.
The people suing claim they bought the glasses relying on Meta’s marketing about privacy features. They argue that if they had known human contractors were reviewing footage of nudity, credit cards, and private acts, they never would have spent their money on the product.

The Meta glasses do have a small LED light on the frame that turns on when you are recording. This is supposed to alert people around you that they are on camera, creating a sense of safety and consent.
However, reports suggest that this light is easy to miss, and people often forget it is even there. The person taking off the glasses to go to the bathroom certainly is not checking the frames to see whether the light is still on, leaving them completely unaware that their private moments are being uploaded.

While this specific story is about smart glasses, the practice of using cheap overseas labor to review user data is an open secret in Silicon Valley. Other tech giants have faced similar scandals regarding their voice assistants and content moderation.
Years ago, Apple faced backlash when it was revealed that contractors were listening to private Siri recordings, including conversations between doctors and patients. This history shows that the problem of human-in-the-loop AI training is widespread, and Meta is just the latest company to be caught in the spotlight.

Beyond the awkwardness of having a stranger watch your home videos, this news raises bigger questions about where technology is heading. We are essentially paying companies to turn our daily lives into training material for machines.
Privacy advocates worry that if we accept this now, it sets a dangerous standard for the future. As glasses and wearables become more common, the line between public life and private life blurs even further, making everyone a potential source of data for someone else’s AI model.
Curious where this is all heading? Take a quick look at how Meta is expanding its AI ambitions.

So, what is the takeaway for someone who owns these glasses or is thinking about buying a pair? It ultimately comes down to how you feel about your data and where it ends up.
If you own a pair and want to use the AI features, you have to accept that humans might see your footage. If that thought makes you uncomfortable, your only real option is to stick to using them as regular glasses or a simple camera, and keep the smart features turned firmly off.
Want to see how these AI decisions are already paying off for the company? Check out how Meta is leaving Microsoft behind with AI-driven ad gains.
Would you still wear smart glasses after learning this? Let us know in the comments, and don’t forget to leave a like if you found this helpful.
This slideshow was made with AI assistance and human editing.
Don’t forget to follow us for more exclusive content on MSN.
Father, tech enthusiast, pilot and traveler. Trying to stay up to date with all of the latest and greatest tech trends that are shaping our daily lives.
