8 min read

Many teens say they can’t make decisions without first asking ChatGPT. From school problems to friend drama, they rely on the chatbot for guidance. Some treat it like a trusted voice, not just a tool on their screen.
Sam Altman, CEO of OpenAI, the company behind ChatGPT, has said this kind of emotional dependence unsettles him. He’s concerned about young people handing control of their lives to a machine, especially when they start treating it as smarter or more understanding than the real people around them.

Some teens have begun to feel like ChatGPT understands them more than anyone else. They share personal thoughts with it, saying the AI knows all about their emotions and relationships. This connection can feel deep and personal, even though it’s digital.
Altman described this over-reliance as “bad and dangerous,” warning that some teens say, “I can’t make any decision in my life without telling ChatGPT everything… It knows me, it knows my friends… I’m gonna do whatever it says.” He added that this habit of deferring life decisions to a machine is deeply troubling.

College students often configure ChatGPT as a tool they can control and automate. Some use it to manage daily life, from assignments to emotional support, saving prompts and uploading files to get tailored responses every time.
Altman pointed out how students build complex workflows around the chatbot. It’s no longer just about answers, but using ChatGPT as a digital assistant that’s always on call. This level of dependence, he said, may lead to letting AI drive big life choices.

In a major survey, kids aged 13 to 14 were more likely than older teens to trust AI advice. Nearly a third said they trusted it quite a bit or completely. That level of trust at such a young age surprises experts.
These younger users seem more open to forming strong bonds with chatbots. As they grow up using AI tools regularly, they may begin to view technology not just as useful but as emotionally dependable, which changes how they learn to trust others, too.

Sam Altman said it feels wrong when people let AI guide their daily life decisions. He admitted that it’s tough to watch so many young users put blind trust in something they don’t really understand beyond the surface.
He said it’s not about the quality of advice but about how AI is replacing the natural decision-making process. Even good guidance from a chatbot becomes dangerous if people stop thinking for themselves or questioning what they’re being told.

Altman shared that older adults typically treat ChatGPT like a fact-finder. They use it like a search engine, asking quick questions and moving on. For them, it’s a practical tool rather than something they feel close to emotionally.
This contrast shows how younger generations connect more deeply with technology. Instead of asking for facts, they turn to ChatGPT for insight, support, and guidance. That change in purpose can shape the way they see themselves and the world around them.

A study showed that 52 percent of teens interact with AI at least a few times each month. That means millions of teens are growing up with chatbots playing a regular part in their routines, even during personal moments.
It’s no longer something rare or experimental. These interactions are becoming a steady part of how teens think, plan, and respond to the world. Altman said that with this level of interaction, emotional attachment and overuse are becoming serious concerns.

Sam Altman said he doesn’t always feel comfortable using some AI features. He’s cautious about putting personal information into tools like ChatGPT, even though he helped build them and knows how they work better than most people do.
He shared this concern on a podcast, saying he doesn’t know exactly who might access that data later. His hesitation shows that even those at the top of AI development recognize the risks of treating the technology too casually or personally.

Instead of speaking with friends or adults, some teens talk to ChatGPT about their emotions. They ask for help with stress, sadness, or loneliness. It feels easier, private, and always available, unlike a human listener who may not understand.
This habit may seem harmless, but it could affect emotional growth. Experts say relying on AI for support may prevent teens from learning how to share feelings, face awkward talks, or build deep bonds in real life. Altman finds this trend alarming.

Some teens ask ChatGPT to guide them through breakups, school fights, or family struggles. They may feel comforted by the quick advice and steady tone, but experts say that doesn’t replace human understanding or emotional growth.
Altman pointed out that good advice still isn’t enough. The issue is not quality but connection. AI can’t truly relate to human feelings, and using it to solve personal problems can slowly weaken natural decision-making and emotional development over time.

The more teens use ChatGPT, the more they begin to trust it. After enough chats, it stops feeling like software and starts feeling like a friend who knows their habits and moods, which changes how they see the tool.
Over time, this trust can grow without users realizing it. They begin to take advice more seriously, even when it shouldn’t hold so much weight. That kind of automatic trust can blur the line between helpful guidance and harmful dependence.

Teenage years are full of choices that shape identity. When AI plays a major role in those decisions, it influences who a person becomes. ChatGPT can subtly guide opinions, values, and beliefs without young users even noticing.
Altman warned that using AI too closely while growing up may change how people form independent thoughts. Instead of asking themselves what they believe or feel, they might start relying on a chatbot to form those thoughts for them.

One of Altman’s strongest warnings is about people letting go of their own thinking. He fears a future where important choices are made by machines people trust too easily or understand too little, even in high-stakes situations.
He said it’s not that ChatGPT gives bad advice, but that people stop questioning it. If people choose to follow AI without thinking, they may lose the skills needed to make careful decisions based on real-world understanding and experience.

To get better responses, some teens save and reuse complicated prompts. They keep them in notes or documents and paste them into ChatGPT regularly, a habit that makes the AI feel like part of their daily structure.
Altman explained that this behavior turns ChatGPT into something more than a chatbot. It becomes a digital system for managing life, which can push people into a pattern of needing it for things they once handled on their own.

As reliance on AI grows, some schools have started teaching students how to use it responsibly. These lessons focus on using AI as a helper while still making space for human judgment and independent thinking.
Teachers want students to understand that AI doesn’t replace learning or emotional skills. It’s a tool that can assist, but shouldn’t lead. By giving students balance early, educators hope to prevent emotional dependence before it becomes a long-term habit.
That effort to find balance feels even more important now, as AI scientists themselves admit they can’t fully grasp their own systems.

Sam Altman says people need to create clear limits around how they use ChatGPT. Even though the technology is powerful and helpful, he believes it must be used with strong awareness and caution.
He doesn’t think people should stop using AI completely. Instead, he encourages healthy habits. Setting boundaries can help users stay in control, make smart decisions, and avoid falling into patterns where they forget how to think clearly on their own.
That’s just one part of a bigger picture, especially given Sam Altman’s plans for ChatGPT to remember your whole life.
Have you ever felt too dependent on AI tools? Share your experience in the comments, and let’s talk about it.
This slideshow was made with AI assistance and human editing.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.