7 min read

You likely scroll through Facebook or TikTok daily without a second thought. European regulators are now taking a very close look at how these platforms operate behind the scenes.
On Oct. 24, the European Commission announced preliminary findings that TikTok and Meta may have breached transparency obligations under the Digital Services Act, a move that could force operational changes for both platforms.

The Digital Services Act (DSA) is EU legislation that imposes stricter duties on very large online platforms, including giants like Meta and TikTok. It sets strict standards for handling illegal content, for the data that independent researchers can access for studies, and for user-facing reporting and appeals tools, all aimed at improving safety and accountability online.

A key problem is that researchers are being blocked from properly studying these platforms. The Commission said procedures for researchers to request access to public data were “burdensome,” and that excessive steps and demands can prevent robust, independent study of platform harms.
These barriers make it very hard to study serious risks, such as how often young users see harmful content. Without robust, independent data analysis, we cannot fully understand the apps’ impact on mental health.

Allowing independent research is absolutely crucial for public safety. It provides an essential check on the immense power these platforms wield over public discourse and daily life. This outside scrutiny helps ensure companies are being truthful about their operations.
Such studies can reveal if users are being exposed to dangerous misinformation or illegal material. This research ultimately helps create a safer and more transparent online environment for everyone, especially vulnerable younger users.

Meta faces separate criticism over how users report illegal content on Facebook and Instagram. Regulators say the platforms’ Notice and Action tools are neither user-friendly nor easily accessible for the average person, and that the process involves too many confusing and unnecessary steps.
This complexity makes it harder for people to quickly flag serious problems like terrorist propaganda. An inefficient reporting system can allow clearly harmful content to stay active and circulate online for much longer.

The EU also accuses Meta of using dark patterns in its reporting flow. Dark patterns are deceptive design tricks that can subtly confuse or manipulate your choices online. They might make canceling a report easier than actually submitting one.
These intentionally confusing designs can discourage people from completing important reports. This results in less illegal content being formally flagged for review by the platform’s moderators.

Have you ever had a post removed and wanted to challenge that decision? The Commission found Meta’s appeal system is also fundamentally flawed. It apparently does not let users properly explain why they disagree with a content moderation call.
The system does not easily allow for providing supporting evidence or a detailed written explanation. This major limitation makes the entire appeals mechanism much less effective for regular people.
Unsurprisingly, Meta and TikTok disagree with the European Commission’s preliminary findings. A Meta spokesperson stated they have already made changes to comply and are confident they are following the law. They plan to continue negotiations with regulators.
TikTok highlighted its commitment to transparency and the hundreds of research teams it has worked with. The company also pointed out a potential legal tension between new transparency rules and existing privacy laws.

TikTok raised a significant concern about being forced to follow two conflicting sets of European rules. They argue the DSA’s data transparency demands can directly clash with the EU’s strict General Data Protection Regulation (GDPR). GDPR is a powerful law designed to protect individual user privacy.
The company is now asking regulators for clear guidance on how to share data with researchers while still robustly safeguarding user privacy. They feel caught between two important but contradictory legal obligations.

What happens if these preliminary findings become the final ruling? The European Commission could issue a formal non-compliance decision against the companies. This would not be a simple slap on the wrist for the tech giants.
It could trigger fines of up to 6% of each company’s total global annual revenue. For companies of Meta’s and TikTok’s scale, that percentage translates into potential penalties worth billions of dollars.
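To make the scale of that 6% cap concrete, here is a minimal sketch of the calculation. The revenue figure below is a hypothetical placeholder for illustration, not an actual company financial:

```python
# The DSA caps fines at 6% of a company's total worldwide annual turnover.
DSA_MAX_FINE_RATE = 0.06

def max_dsa_fine(global_annual_revenue: float) -> float:
    """Return the theoretical maximum DSA fine for a given annual revenue."""
    return global_annual_revenue * DSA_MAX_FINE_RATE

# A hypothetical platform earning $100 billion a year could face
# a fine of up to $6 billion.
print(max_dsa_fine(100e9))  # 6000000000.0
```

The cap is a ceiling, not a fixed penalty; any actual fine would depend on the Commission's final decision.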

This is not the first time these companies have faced major EU penalties under new tech laws. Back in April, Meta was fined 200 million euros for its data collection practices under a different law, the Digital Markets Act.
Earlier this year, Ireland’s Data Protection Commissioner fined TikTok about €530 million for GDPR-related failures tied to remote access and safeguards around EU user data, a separate data-protection action from the Commission’s DSA probe.
The investigation is not over yet. TikTok and Meta now get a formal chance to review all the evidence compiled against them. They can each submit a detailed written response to argue their case and defend their current practices.
The companies can also proactively propose specific measures to fix the problems identified by the Commission. A European board of digital services experts will be consulted before any final decision is made.

An EU executive vice-president summarized the case’s core principle perfectly. She stated that modern democracies fundamentally depend on trust, which platforms must actively earn. The law makes empowering users and opening systems to scrutiny a legal duty, not an optional choice.
This regulatory action is part of a much broader push to ensure powerful tech companies are held accountable for their operations. The goal is to make them answerable to their users and to society as a whole.

The EU is applying this regulatory pressure consistently across the entire tech industry. They have already issued similar preliminary findings against other major platforms like X and the shopping app Temu. No giant platform is being given a free pass.
This creates a powerful new standard for transparency that every large tech company operating in Europe must now meet. It signals the end of an era where platforms could operate as mysterious “black boxes.”

You might wonder why a European investigation matters to you in the United States. The changes these companies make to comply with strict EU law often end up affecting their global user experience and features. A simpler reporting tool created for Europe could be rolled out worldwide.
Furthermore, it sets a powerful example for regulators in other countries, including those in the U.S. It demonstrates that holding massive digital giants to account is both possible and necessary.

This situation marks a significant shift in the power dynamic between tech giants and governments. Platforms are now being forced to open up their inner workings to independent scrutiny like never before. The age of “trust us, we know best” is rapidly fading away.
The outcome of this case will be a landmark moment for digital regulation worldwide. Its ripple effects will likely be felt by social media users everywhere, actively shaping a more transparent and accountable digital future for all.
This slideshow was made with AI assistance and human editing.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.
