7 min read
Social media can be fun but comes with risks, especially for teens. Meta, the company behind Facebook, Instagram, and Messenger, is now taking steps to make these spaces safer.
Teens using these platforms will now be placed into special accounts with built-in protections. These accounts are designed to limit contact with strangers, filter out harmful content, and encourage healthier screen habits.
Teen safety tools debuted on Instagram and are now rolling out to Facebook and Messenger. That means teens across more apps will automatically get the same protections without needing to adjust anything.
This expansion starts in the U.S., U.K., Canada, and Australia, with more countries to follow soon. These updates reflect the importance of keeping teens safe across all major platforms, not just one.

One of the biggest risks of social media is unwanted messages. Meta is now ensuring that teens only receive DMs from people they follow or have talked to.
That means random strangers can’t just pop into a teen’s inbox anymore. This change helps block unwanted attention, especially from adults or bots. It also gives teens more control over who they hear from online.

Instagram and Facebook Stories are popular with teens, but they’re also being updated with safety in mind. Now, only a teen’s friends can reply to their Stories.
That helps prevent random people from responding or trying to start conversations based on short, personal posts. Stories are meant to be fun and casual, not a way for strangers to gain access.

Before these changes, anyone could tag or mention a teen in a post, even if they didn’t know them. That’s no longer allowed unless the person is a friend or someone the teen follows.
This keeps teens from being pulled into public posts or conversations with strangers. It’s a simple change, but it makes a big difference in helping teens stay private. Teens can now enjoy social media without being randomly exposed to people who might try to reach them in sneaky or indirect ways.

Meta is also adding gentle nudges to support better screen habits. After an hour of use in a day, teens will get a prompt suggesting they take a break.
This isn’t a forced log-out; it’s just a reminder that there’s more to do outside the screen. These nudges can help teens become more aware of their usage and make healthier choices.

Teens will now have “Quiet Mode” turned on overnight. This feature limits notifications, so they’re not tempted to check their phones late at night.
It helps protect sleep time and gives teens a clearer boundary between online life and rest. Quiet Mode doesn’t block the app entirely; it just makes it less noisy and distracting. With school, sports, and everything else going on, teens need sleep more than another scroll session.

Live streaming can be fun, but it opens the door to unpredictable situations. That’s why Meta is changing the rules; teens under 16 now need a parent’s permission to go live on Instagram.
This allows families to discuss what’s appropriate to share in real time. It also prevents teens from facing pressure to perform or respond on the spot. Live content is hard to monitor; once it’s out there, it’s out there.

Meta’s apps now automatically blur images in DMs if they detect possible nudity. Teens can’t turn off this feature unless their parent approves.
This helps prevent young users from being exposed to harmful or shocking content in private messages. It adds a barrier that protects them from unwanted images while letting them use the chat feature.

Meta has created a Family Center in each app where parents can manage settings and supervise their teens’ experience. It’s not about spying; it’s about staying involved.
Parents can approve setting changes, monitor screen time, and learn more about how their teen uses the platform. It’s designed to start conversations, not conflicts.

Even though teens can ask to change some settings with a parent’s permission, most choose not to. Meta says 97% of teens aged 13–15 keep their safety features turned on.
This shows that many teens appreciate the added protection and don’t feel the need to turn it off. They might not say it out loud, but clearly, they’re comfortable with the default settings.

Meta reports that more than 54 million teen users have already been moved into Teen Accounts. That’s a huge number, and it’s still growing.
As the feature reaches more countries, even more teens will benefit from safer, more controlled app experiences. Meta wants this to be the standard for how younger users interact online: instead of making teens hunt for privacy settings, the experience starts in the safest mode by default.

Meta worked with Ipsos to survey parents, and the results were clear: most parents liked the changes. Of those surveyed, 94% said Teen Accounts are helpful, and 85% said the accounts make it easier to support their kids online.
That kind of feedback matters. It shows that the tools aren’t just built for teens, but for families, too. When parents feel included, guiding safe behavior and building trust around tech use is easier.

Previously, social media often gave too much access to too many people. Teen Accounts flip that approach, putting privacy first.
Instead of teens having to lock everything down themselves, the safest options are the starting point. That means fewer mistakes, fewer surprises, and more peace of mind for both teens and parents. It’s a big change in how these platforms treat young users.

Meta didn’t make these changes out of nowhere. They’ve been under a lot of pressure from lawmakers, parents, and researchers about the harms of social media on teens.
There have been lawsuits, public hearings, and even calls to label social media like cigarettes. So while the updates are helpful, they’re also a response to growing concerns. In part, they’re about showing Meta is willing to act before new rules are forced on it.

Meta says this is just the beginning. More updates are already underway to continue improving teen safety across its platforms.
As tech keeps evolving, the tools to protect younger users must also keep up. Future changes might include even better parental controls, smarter content filters, or new screen time management methods.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.