
Imagine trusting an app with deeply personal health details, only to learn they were secretly shared without your consent. That is what a California jury found happened with Meta and a widely used period-tracking app people believed was safe.
The decision shook millions who thought their most sensitive information was protected. It is a reminder that even well-known companies can break trust when private data is collected, stored, and then exposed in ways the user never agreed to or expected.

Flo Health launched in 2015 with a promise to help people track menstrual cycles. Over the years it expanded, adding tools for monitoring symptoms, moods, fertility goals, and pregnancy planning, and became one of the most-downloaded reproductive health apps worldwide.
Its clean design and detailed tracking features encouraged people to enter very personal information. Many saw Flo as a private health companion, never imagining the possibility that such intimate details could be accessed, stored, and then shared beyond the app itself.

Court records zeroed in on the period between November 2016 and February 2019. During those years, sensitive user actions inside the Flo app were allegedly collected and shared without explicit permission, sparking major legal and ethical concerns across the industry.
This information included when the app was opened and which features were used. According to allegations, these activities were recorded and passed on to outside companies, raising questions about how digital tools handle the personal details entrusted to them on a daily basis.

In 2021, a group of Flo users filed a class action lawsuit against several technology companies. They claimed their private health information had been collected for targeted advertising, despite assurances that such sensitive data would remain confidential and fully protected.
The case originally targeted Flo Health, Google, Meta, and analytics firm Flurry. Settlements removed the other companies, leaving Meta as the sole defendant when the trial began, ensuring public attention focused squarely on its role in the controversy.

On August 1, 2025, jurors found that Meta had intentionally intercepted in-app communications without the knowledge of Flo users.
They ruled Meta violated California’s Invasion of Privacy Act (CIPA § 632), the state’s ‘wiretap law’ protecting private conversations from electronic interception.
Under California’s wiretap law, each violation can carry a $5,000 penalty. With millions of potential instances involved, the final damages could reach staggering amounts, sending an unmistakable warning to companies that handle sensitive personal information.

Flo’s privacy policy told users that deeply personal details like menstrual cycles, pregnancy notes, and symptoms would stay private. It also claimed that any necessary sharing of data would be extremely limited and related only to running the app effectively.
The lawsuit argued these promises were ignored. Instead, the app allegedly sent highly intimate information to outside companies, giving them access to details that users had never imagined leaving the secure environment of their private health tracker.

Before the trial reached its final stages, Flo Health, Google, and Flurry all chose to settle with the plaintiffs. These settlements were resolved privately, with no public release of the financial terms or specific conditions agreed upon by the companies involved.
Meta, however, did not settle. As the only defendant left, it faced the full weight of the jury’s decision, making the outcome a landmark moment in the ongoing debate over personal data and digital privacy in the health technology sector.

Following the verdict, Meta publicly rejected the jury’s findings, stating it strongly disagreed with the decision. The company insisted that the claims against it were false and that it had never sought to obtain or use sensitive personal health information from its users.
Meta further explained that its developer terms forbid apps from sending such data. It argued that any sharing of reproductive health details was against its stated policies, maintaining its position that it acted within privacy and compliance rules.

For millions of people, Flo was a trusted health partner. It was a place to log private thoughts about their bodies, track symptoms, and note significant life events, all in a space they believed was secure and inaccessible to outsiders.
Learning that this information might have been viewed or shared without permission left many feeling shocked and betrayed. The breach of trust carried emotional weight, as these details were far more personal than typical online activity or browsing history.

By November 2024, Flo had surpassed 380 million downloads globally and maintained over 70 million monthly active users. Many opened the app daily, providing detailed updates that created a large and valuable database of reproductive health information across different ages and locations.
That scale meant any improper use of the data could affect millions at once. The sheer volume of information made the case not just a personal privacy matter but also a significant event in the tech industry’s handling of user trust.

Long before this verdict, the Federal Trade Commission had already taken an interest in Flo Health’s privacy practices. In 2021, it reached a settlement requiring the company to follow stricter rules and undergo independent reviews of how it managed user data.
The agreement also ordered Flo to stop making misleading statements about its data protection policies. This move signaled that the government was watching closely, especially when companies handle sensitive information in apps used by millions of people.

One Flo user, Erica Frasco, became a key figure in the lawsuit. She had used the app since 2017, answering highly personal questions about her health, emotions, and relationships, trusting it to keep her most private details away from outside parties.
Her decision to join the legal fight brought important attention to the risks of sharing sensitive health data. It showed how one person’s action can lead to broader awareness and potentially influence change across an entire industry.

This case forces many to think about how safe their digital information really is. If one of the most downloaded health apps faced accusations like these, the problem may extend further across apps and online services used every day.
People are now reconsidering what kinds of personal details they share online. The situation has started discussions on how technology companies should handle data that is intimate, unique, and potentially harmful if exposed or misused without consent.
The timing of this verdict is significant. Public discussions around reproductive health data have intensified across the United States in recent years, adding urgency to concerns about how such information might be stored, shared, or used by various entities.
Some fear that data from health apps could one day be used in legal matters, highlighting the importance of keeping this information secure. For many, the case feels like a wake-up call to be more careful with personal details.

Health tracking apps promise valuable insights, helping users monitor their bodies and make informed decisions. They offer convenience and clarity, turning complex health patterns into understandable charts and helpful reminders that can fit seamlessly into daily life routines.
Yet the same technology creates a detailed record of personal life. When that information is accessed or shared without consent, it becomes a liability, reminding users that the convenience of digital tools can sometimes come with hidden risks.

Meta has announced plans to explore all legal options, including the possibility of an appeal. That means the legal process could continue for months or even years before any final resolution is reached in this high-profile case.
Regardless of what happens next, the verdict has already sent a strong warning to the tech industry about handling private health information.
Do you believe cases like this will change how companies manage sensitive data, or will nothing shift? Share your thoughts in the comments.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.