
Musk’s X takes legal action against New York’s online hate speech rules

New York’s new law targets platform transparency

In December 2024, New York passed the Stop Hiding Hate Act, requiring large social media platforms to publish moderation policies and define hate speech, extremism, and disinformation. Platforms must also submit biannual reports showing how they handle flagged content.

The law doesn’t force companies to remove speech but demands transparency about their decisions. State lawmakers said this was essential to ensure platforms take responsibility for harmful content online without infringing on free speech protections guaranteed under the First Amendment.

What X is challenging in court

X Corp is suing New York over the law’s requirement that platforms disclose moderation rules and submit detailed reports. The company argues this violates the First Amendment by compelling speech and interfering with editorial discretion.

In a federal complaint filed in June 2025, X claims the law pressures platforms into altering moderation practices and makes them vulnerable to public and political influence. The lawsuit targets the transparency mandate and the steep civil penalties for non-compliance.

The complaint’s First Amendment basis

X’s legal filing says the Stop Hiding Hate Act forces companies to publicly justify how they manage online speech, which it views as protected editorial judgment.

The lawsuit argues that deciding what content to allow or restrict is a form of expression, and compelling disclosure about it interferes with constitutional rights. X claims this law doesn’t regulate unlawful conduct but targets speech and internal decision-making, setting a dangerous precedent for state interference in content moderation.

The role of New York’s attorney general

Letitia James, New York's attorney general, is named as the defendant in X Corp's lawsuit. Her office enforces the law and ensures social media companies meet its transparency standards.

While she hasn't publicly commented in depth on the case, her office is expected to defend the law by arguing it promotes public accountability and doesn't interfere with lawful expression. If the case progresses, James is likely to argue that the law survives constitutional scrutiny.

Enforcement penalties raise concerns for X

One of X's most significant objections is the financial risk of non-compliance. Under the law, companies can be fined up to $15,000 per violation per day. Penalties can apply to missed reports, incomplete data, or failure to disclose moderation policies.

X argues these steep penalties effectively coerce platforms into self-censorship and compromise how they manage content internally. The company seeks a court injunction to block enforcement before penalties accumulate in 2025.

New York lawmakers defend the law

State Senator Brad Hoylman-Sigal and Assemblymember Grace Lee, who sponsored the law, have pushed back against X’s lawsuit. They argue the act doesn’t regulate speech but ensures transparency in how platforms enforce their rules.

In statements after the suit was filed, both lawmakers said Elon Musk’s resistance proves the need for this kind of regulation. They maintain that consumers and the public deserve to know how digital platforms treat hateful and harmful content.

X’s history with content moderation

Since Elon Musk purchased Twitter, now rebranded as X, in 2022, the platform has significantly changed its approach to moderation. Musk dissolved advisory boards, reinstated banned accounts, and reduced the number of staff managing trust and safety.

Watchdog groups and researchers have reported a noticeable rise in hate speech and extremist content since those changes. These shifts have triggered scrutiny from governments and regulators worldwide, leading to legal battles over content transparency and public accountability.

Comparison to California’s law

X previously challenged a similar law in California that required platforms to report on how they manage hate speech and misinformation. In that case, a federal judge blocked enforcement of key parts, ruling they likely violated free speech rights.

California later settled with X in February 2025, dropping the enforcement provisions. X is now using that ruling to bolster its case against New York, noting that both laws are nearly identical in language and intent.

Support from free speech advocates

Some First Amendment scholars and advocacy groups agree with parts of X’s argument. They say even well-intentioned transparency laws can risk compelling speech or dictating editorial policy.

These advocates argue that the government can’t tell private platforms how to explain or categorize content decisions. While they don’t necessarily oppose transparency, they warn against laws that may chill free expression or force companies to act as state agents when moderating user content.

Opposition from civil rights groups

Groups like the Anti-Defamation League support New York’s transparency law, arguing that it holds platforms accountable without censoring speech. They contend that hate speech and disinformation spread faster when moderation rules are unclear or poorly enforced.

These groups say platforms like X are accountable to the public and that transparency is key to understanding whether policies protect users, especially vulnerable communities. They believe X is invoking free speech to obscure its inaction.

Public interest in platform policies

Transparency laws like those in New York are rooted in the belief that the public should know how social platforms manage harmful content. Critics of X say that without mandated reporting, users, researchers, and lawmakers are left guessing about the company’s policies.

Public interest groups argue that transparency promotes accountability, deters bias in moderation, and informs debate about how digital speech is handled. They stress that such laws don’t limit speech but encourage responsible platform governance.

How the law defines key content

The Stop Hiding Hate Act requires platforms to define five categories in their public terms: hate speech, disinformation, harassment, extremist content, and foreign political influence. However, it doesn’t provide standard legal definitions, leaving that up to each platform.

X argues this ambiguity further complicates compliance, while lawmakers argue it allows flexibility. The law asks companies only to define and disclose their standards; it doesn't dictate how they apply or interpret them.

The timeline of court proceedings

X filed its lawsuit in the Southern District of New York in mid-June 2025. The court is expected to review motions for a preliminary injunction first, which could temporarily block enforcement. After that, the case will proceed to hearings where both sides present constitutional arguments.

If the case isn’t resolved quickly, the timeline could stretch into late 2025 or beyond, depending on appeals. Meanwhile, platforms may be left in limbo as legal uncertainties continue.

National attention on the lawsuit

Because this case involves a high-profile company like X and contested questions about free speech, it's receiving national attention. Legal experts, lawmakers, and tech policy advocates are closely watching how the courts balance platform rights with public accountability.

The ruling could shape how the government approaches online speech regulation moving forward. If the law is upheld, it might signal broader judicial support for requiring digital platforms to operate more openly about managing content.

What happens if X wins the case

If X succeeds in court, enforcement of New York’s transparency law could be paused or permanently blocked. A ruling in X’s favor might set a precedent that limits state efforts to require similar disclosures.

It could also strengthen free speech arguments from other platforms facing similar laws in the U.S. or abroad. However, if the court rules against X, it may validate transparency mandates as constitutional, potentially inspiring other states to pass similar regulations.
