8 min read

Lawmakers in Washington are pushing for big changes to how the internet works for young people. The Kids Online Safety Act, or KOSA, could bring major new rules to the websites and apps millions of kids use every day.
Some say it’s about time, pointing to growing concerns over how online platforms affect mental health. Others say the law could go too far and cause new problems. KOSA has sparked a national debate, with families, tech companies, and activists all weighing in.

KOSA is designed to make the internet safer for users under 17. It would require online platforms to prevent and reduce risks like bullying, depression, and addiction.
This means websites and apps would have to be more careful about what content is shown to young users. Companies that don’t follow the rules could be held legally responsible. Supporters say this is a big step toward protecting kids from harm.
If KOSA becomes law, platforms like TikTok, Instagram, YouTube, and Snapchat will need to do more than just block harmful content; they'll have to rethink how their features work. That could mean limiting autoplay on YouTube or reducing nonstop notifications on Snapchat.
These changes are meant to stop kids from spending unhealthy amounts of time online. KOSA would also push for better tools that help families control privacy settings, limit screen time, and monitor who can contact a child.

Not every tech company is on board with KOSA. Google, which owns YouTube, and Meta, which owns Facebook and Instagram, are against it. They say the bill could limit freedom of speech and harm users' rights.
These companies argue that it’s difficult to define “harmful” content in a fair way. What helps one teen might upset another. They also say new rules could force them to take down helpful posts, just to avoid lawsuits.

Civil rights groups like the ACLU and the Electronic Frontier Foundation are raising red flags. They believe KOSA could be used to silence important conversations, especially those involving mental health, identity, or human rights.
Even if the law says it won’t block speech based on opinion, critics say the pressure to avoid legal trouble could lead platforms to delete posts about sensitive topics. That could include things like resources for survivors or honest talk about mental illness.
At the heart of KOSA is something called a “duty of care.” It’s a legal idea that means companies must take reasonable steps to prevent harm. But that sounds simpler than it is. The law lists a wide range of possible harms, from anxiety to substance use, which are hard to define.
Critics say this could lead to confusion about what platforms like Facebook and Instagram are supposed to block. To avoid legal trouble, these companies might play it safe and start removing too much content, even if it’s not harmful.

For big companies like Apple, following KOSA might not be a huge problem. They have large legal teams and can afford to build new safety features. Smaller platforms, however, could struggle.
Independent websites, support forums, or apps run by small teams may not have the money or tools to meet KOSA’s standards. If they get something wrong, they could face big legal risks. As a result, these sites might shut down discussions, change their services, or disappear altogether.

Online spaces can be lifelines for teens going through tough times. Places where people share experiences with mental health, recovery, or identity can offer real hope. KOSA’s rules might accidentally shut down these support groups.
Because the law aims to stop “harmful” content, platforms may avoid anything related to depression, eating disorders, or addiction, even if the posts are supportive. The fear of being blamed for someone’s pain might make companies delete entire forums.

Many apps use algorithms to show users what they think they want. These systems decide what videos, posts, or ads to show next, often based on tiny actions like what you pause to watch. For teens, this means they might be fed more and more of the same type of content.
If that content is dark, upsetting, or promotes unhealthy ideas, it can create a harmful cycle. KOSA aims to limit that by requiring more transparency and control. But no one knows exactly how to make these algorithms safer, and that’s a big challenge.

One of KOSA’s goals is to limit “compulsive usage.” That’s when someone uses an app so much it starts affecting their sleep, school, or social life. The bill wants platforms to cut back on features that keep kids endlessly scrolling.
But the problem is, there’s no agreed-upon medical definition for “compulsive” use of the internet. Everyone’s habits are different. Some teens might scroll a lot without any problems, while others struggle.

Even though KOSA focuses on minors, the effects may not stop there. When platforms change their design or censor certain content, those changes usually apply to everyone. That means adults could see fewer posts or less access to sensitive topics.
A filter meant to protect a 13-year-old might also stop a 30-year-old from finding mental health resources or reading about important social issues, since companies like Meta, Google, and ByteDance aren't likely to build completely separate versions of their platforms for adults and minors.

Under KOSA, state attorneys general would have the power to enforce parts of the law. That means what's considered "harmful" in one state might be allowed in another.
This could create a patchwork of rules across the country, making it hard for platforms to know what to do. A post seen as helpful in one place could be flagged somewhere else. Critics say this could be used to push certain beliefs or silence opposing views, depending on who’s in charge.

KOSA isn't the first law aimed at protecting kids online. Past efforts like the Communications Decency Act and the Child Online Protection Act were also created with good intentions. But key parts of both were struck down by courts for being too vague or for violating free speech rights.
They showed how hard it is to write a law that keeps kids safe while respecting everyone’s rights. Critics say KOSA might face the same problems, especially because it tries to control a wide range of speech. Writing a law that truly works has never been easy.

While KOSA focuses on risks, the internet also offers big benefits. Kids can find friendship, explore new interests, and learn about topics that matter to them. Online communities provide support, identity, and a space to be heard.
Some experts worry that KOSA might ignore this side of the story. If safety rules go too far, they could take away the very spaces where young people grow, connect, and thrive. It’s important to protect kids, but also protect their access to helpful, healthy conversations.

Supporters of KOSA often point to studies showing links between social media and mental health problems. But the truth is, the science isn’t settled yet. Some research shows a connection, but not a clear cause. Social media might affect some teens negatively while helping others.
The effects seem to depend on things like how the platform is used, the kind of content, and the person using it. Experts say more research is needed before making sweeping laws. Acting without full understanding could lead to fixes that don’t help, or worse, cause new harm.
KOSA passed the Senate with strong support, but it still needs approval from the House. Lawmakers will continue debating how to balance safety with freedom online. Some changes to the bill have already been made to respond to criticism.
But the big questions remain: How do we protect kids without silencing them? How do we design better platforms without hurting the internet as a whole? The answers aren't simple. No matter what happens next, the conversation about online safety and responsibility is only getting louder.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.
