
FTC orders AI firms to reveal safeguards for teens and kids using AI companions


AI chatbots under federal spotlight

The rise of AI companions has pushed regulators into action. The Federal Trade Commission launched a sweeping inquiry to study how chatbots are built, marketed, and monitored. Officials believe their friend-like design could deeply influence children and teens who may mistake them for trusted humans.

The study is not only about safety but also about transparency. Regulators want to know if companies are testing risks properly, disclosing enough details to parents, and protecting users from sharing too much sensitive information.


Seven big tech names targeted

The orders are directed at seven of the world’s most influential firms. Alphabet, OpenAI, Meta, Instagram, Snap, xAI, and Character Technologies must now hand over detailed records explaining exactly how their chatbot platforms operate. Each company on the list wields enormous influence in the tech sector.

The unanimous 3-0 vote highlights the seriousness of this investigation. It signals that regulators see potential risks in how these platforms work and will press these companies for answers without unnecessary delay.


What the government wants to know

The Commission is not looking for short statements. It has asked for extensive records showing how chatbots process inputs, generate outputs, and decide on the personalities of characters. Officials want to see the mechanics behind the conversations many users take for granted.

Another part of the request focuses on examples of sensitive conversations. These records could show how systems respond in situations that affect children, helping regulators understand potential dangers that might otherwise remain hidden.


The power of 6(b) authority

The FTC is relying on its Section 6(b) authority to demand these records. This power allows the agency to compel information from companies without opening a lawsuit, and it is an important tool for broad studies of consumer impact.

Although it is not a direct enforcement move, ignoring or failing to meet these requests can result in penalties. Companies, therefore, face strong pressure to cooperate and provide every document the FTC is asking for.


Concerns about vulnerable groups

Children and teens are the main focus, but they are not the only group considered at risk. Older adults and people coping with loneliness or emotional difficulties may also form unhealthy bonds with chatbots that present themselves as supportive friends.

Because these systems mimic emotion, some users may become overly dependent on them. Regulators believe this could lead individuals to blur the line between real relationships and digital companionship, which makes strong safeguards an urgent requirement in today’s fast-growing AI space.


Money and data questions raised

The FTC is requesting details on how companies monetize user conversations and engagement, making monetization strategies a central part of this inquiry.

Officials also need clarity on how personal information is handled. From storage practices to data sharing, regulators want to see if sensitive material is kept secure or exploited for additional profit.


Rules for younger audiences

The Commission is paying special attention to how platforms limit younger users. Companies are required to explain what systems they use to block access by children who should not be interacting with these chatbots.

Another question is enforcement. Once an account is created, regulators want proof that age rules and guidelines are followed in practice, not just outlined in company policies. This detail matters in preventing underage exposure.


How characters are created

Many services let users talk with custom characters, and the FTC wants to know how these personalities are developed, tested, and approved. Character design plays a big role in shaping the type of interactions users experience.

Poorly reviewed characters could encourage unhealthy or even unsafe conversations. Regulators are demanding to see the approval process so they can decide if enough steps are being taken before allowing characters into public use.


Testing before and after launch

Another requirement focuses on product testing. Companies must show what methods they used to check chatbot behavior before release and how they monitor performance once millions of users are interacting with the system.

This is important because new risks can appear after launch. Ongoing monitoring reveals how chatbots adapt in real life, and regulators want proof that companies actively track and fix these issues instead of ignoring them.


New spotlight on disclosures

The inquiry also centers on how chatbots are presented to the public. Officials are requesting advertising examples, marketing strategies, and the types of notices shown to parents and users. Clear information builds trust.

The Commission wants to evaluate if parents and teens are properly informed about risks, limitations, and data use. Without meaningful disclosure, users could walk into situations with no understanding of how the system truly operates.


Safety versus innovation debate

Chairman Andrew Ferguson emphasized the importance of balancing two goals. Protecting kids online is a top priority, but keeping the United States ahead in AI innovation remains equally important.

The message is simple. Regulation should not block progress, but new technology must not come at the expense of young people. The study is designed to help strike that balance in a rapidly evolving industry.


Deadlines for compliance set

The orders carry strict time limits. Companies must first contact the FTC within 14 days to confirm which records they can provide and to flag any gaps in the information, ensuring communication begins quickly.

After that, full responses must be submitted within 45 days. Regulators want complete documentation on time, proving that this is not a symbolic request but a demand with real consequences if ignored.


Records dating back years

The inquiry is not limited to current data. Companies are required to provide information dating back to January 2022, covering years of development and activity. This wide timeframe will highlight important changes.

Looking back allows regulators to track patterns, compare practices, and see if firms learned from earlier problems. A long view of records is expected to give a more accurate picture of how safety was handled.


Broader policy changes in play

This review ties into broader efforts to strengthen protections for children online. The FTC recently updated rules under the Children’s Online Privacy Protection Act, making the most significant changes since 2013.

The new rules require opt-in parental consent for ads aimed at kids under 13 and tighter limits on how data is stored or shared. These standards could directly shape the way AI chatbots operate for younger users.


Companies respond carefully

Tech companies are already weighing in with measured statements. Character Technologies said it welcomed the chance to cooperate with regulators, and Snap voiced support for thoughtful development that respects safety while allowing innovation.

These early comments suggest firms want to appear open and responsible. At the same time, they know they are under intense scrutiny from lawmakers, regulators, and the public.

The outcome could reshape the entire industry.


What comes next for chatbot safety?

The current study is not an enforcement action, but history shows that findings from Section 6(b) inquiries can lay the groundwork for new cases or stronger rules. For the seven companies involved, compliance is only the beginning.

The reports will help shape the future of AI companion design in the United States. What regulators learn now could set standards for how these platforms protect children and teens in the years ahead.



This slideshow was made with AI assistance and human editing.

