
Harvard Kennedy School professor Alex Green isn't buying the nonstop hype around AI in classrooms. He told Business Insider this "AI evangelism" is actually putting students at risk, especially when it comes to their career prospects.
Green says overreliance on AI is stripping away vital skills and weakening the student–teacher connection that’s at the heart of real learning.

One student told Green they spent weeks writing a long research paper, only to get a mediocre grade and vague comments. The professor had used ChatGPT to grade it.
To Green, that was proof of how AI can undercut education, leaving students frustrated and unsure whether their work is even valued.

Green believes that heavy AI use is eroding core abilities such as reasoning and clear communication, along with the knowledge those abilities rest on. He warns that students who lean on chatbots to write or think for them are skipping the struggle that actually builds those muscles.
Down the line, that could cause problems: without those skills, future employers may hesitate to trust their abilities.

According to Green, employers already know the risks. Anecdotal reports suggest some are exploring measures such as proctoring tools and screen monitoring to ensure applicants aren't relying on AI during assessments.
That means students who relied too heavily on tools in school may struggle in the real world when those shortcuts aren’t available.

As a communications professor, Green sees part of his job as preparing students for careers. That means teaching them to write clearly, think critically, and synthesize ideas on their own.
“My job, in part, is to help prepare them to go get jobs,” Green told Business Insider. If he encouraged blind AI use, he says, he’d be doing his students a real disservice by robbing them of skills employers still value.

Green makes it clear he’s not against technology. He admits to using AI himself for research and even allows it in class at certain points.
But he insists it should be handled carefully, not as a replacement for teaching. In his words, “heavily relying on it is a waste of a school’s resources.”

If schools want to bring AI into classrooms, Green argues, they can’t just hand out the tools and hope for the best. Teachers need proper training to understand both the benefits and the risks.
Without that, faculty could misuse the tools, grade with bots, or outsource feedback. He believes guardrails are necessary so AI strengthens learning rather than weakens it.

Green tells his students that he's "a total nerd" about his subject and has dedicated years to it. He asks: why hand all of that over to a machine?
To him, the value of a teacher goes far beyond just giving answers. It’s the years of expertise, the ability to explain in different ways, and the one-on-one connection that help students grow. Quick AI responses might be convenient, but Green argues they can’t replace the teacher.

Green warns against what he calls the "Bible salesman version of AI": people selling the technology like it's a miracle cure for education. He says this pitch makes AI sound like it can solve every classroom problem with ease.
He says students could end up with "junk" learning, missing the deep skills they'll need to thrive outside of school. AI can be helpful, but it's no substitute for the effort and real teaching those skills require.

Green argues that too many AI-based education projects treat learning as something that should always be easy. But he says real growth comes from struggle.
Wrestling with tough concepts and working through challenges is how students build lasting knowledge, not by clicking a button for instant answers.

Green points to examples where tech-focused education didn’t deliver. One well-known case was AltSchool, a Silicon Valley project launched in 2013 with major funding.
Despite the hype, it shut down within a few years. For him, that shows the danger of chasing flashy AI solutions without thinking through long-term effects.

It's not just individual schools; entire systems are jumping in, too. California State University recently pledged to become the first major AI-powered university system.
Even the federal government under Trump is pushing for more AI in K-12 classrooms. Green sees this wave of enthusiasm as risky if the groundwork and training aren’t there.

Green isn’t alone in warning about moving too fast. South Korea rolled back plans to use AI textbooks after backlash from parents and teachers.
Education isn’t just about handing over information; it’s about shaping skills, habits, and confidence. Without the right planning and safeguards, schools risk creating more problems than they solve.

Despite his concerns, Green isn’t against AI altogether. He sees real value when it’s used in thoughtful, limited ways.
In his own classroom, he introduces it after weeks of traditional teaching, showing students how to spot fake images or deal with chatbots in political communications. The key, he says, is balance.

Green suggests schools put clear limits on how teachers use AI, in particular barring it from grading and giving feedback.
Without such rules, he fears education could erode into surface-level shortcuts, leaving students with impressive tools but weak skills.

Green’s message is simple: AI isn’t the enemy, but treating it like a cure-all is. Students still need to learn how to think, write, and reason on their own.
If schools ignore that, the careers and futures of young people could be at stake.
Dan Mitchell has been in the computer industry for more than 25 years, getting started with computers at age 7 on an Apple II.