AI Companions and Our Kids: Understanding the Risks and Realities
Cory Houghton
ECSD Technology Director
Artificial Intelligence (AI) moved into everyday life with remarkable speed after the public release of ChatGPT in late 2022. Since then, businesses have poured hundreds of billions of dollars into AI systems, and the technology is now embedded in nearly every sector of American life.
This constant presence of AI can be especially confusing for adolescents. Their brains are still developing, and they can struggle to tell the difference between what is real and what is fake or manipulated online. One growing area of concern is AI as a companion. AI can absolutely be helpful. Students can use it to brainstorm essay topics, debug code, and get help studying. But the same tools can also be used as a kind of “friend” or “partner.”
In a recent report from Common Sense Media, about 70% of teens said they had used generative AI tools like Perplexity, ChatGPT, or Snapchat’s “My AI.” For many teens, a natural next step is to start asking AI about relationships, feelings, and personal problems. For kids who already struggle to make friends or feel alone, an AI companion can feel like a safe, always-available refuge when they are dealing with a personal issue.
What Is an Artificial Relationship or an AI Companion?
Artificial relationships happen when someone starts to feel real social or emotional closeness with an AI (such as a chatbot or virtual assistant) and experiences it as a two-way relationship. Adolescents are particularly susceptible: because their brains are still developing, they can have a hard time telling a genuine human relationship apart from a bond with a chatbot. Common Sense Media found that many AI companions actually tell kids they’re “real,” say they have feelings, and claim to do human things like eating or sleeping. That kind of pretending can make it easier for teens to lean on these bots too heavily and become emotionally dependent on them.
Why Are AI Relationships Concerning for Teens?
AI is designed to mirror and respond to a user’s emotions and language; in many ways, it reflects back whatever it is given. That can be helpful when a teen is asking harmless questions, but it can become dangerous when a teen is in crisis. In February 2024, a 14-year-old in Florida died by suicide after months of conversations with a Character.AI chatbot; his family’s lawsuit alleges that the bot encouraged his suicidal thoughts. In the Common Sense Media study, researchers also found that some chatbots reinforced isolating thoughts and risky behaviors rather than challenging them or directing teens to real help.
AI, like every new technology, has both benefits and serious risks. The goal here isn’t to scare parents, but simply to make them aware that artificial relationships exist. If your child is online, they will encounter bots and AI companions, and they may genuinely struggle to recognize that there isn’t a real person on the other end.
“Adolescence is considered a second sensitive period for sociocultural processing. It is a time when kids’ brains are uniquely positioned to learn what is expected, normative, and valued in their social and cultural systems. Kids’ brains are wired toward friendships, peer networks, peer status, and approval,” said Anne Maheux, PhD, an assistant professor of psychology and director of the Social Environments and Adolescence Lab at the University of North Carolina at Chapel Hill (Andoh, 2025).
What Should I Do as a Parent?
As a parent, the safest approach right now is to steer kids and teens away from AI companions altogether. Congress is considering a bill, the GUARD Act, that would prohibit companies from offering AI companions to anyone under the age of 18, but it has not passed. Many of these tools have age-gating settings, but it’s best to assume students can get around them. It also helps to be skeptical of platforms that encourage kids to form intense emotional bonds with bots or that market AI “friends” as a substitute for real support. The most important thing you can do is keep talking with your child: ask whether they have shared their feelings with an AI or asked a bot for relationship advice. Researchers are still learning how these relationships affect young people’s emotions and mental health, so staying informed and keeping open, ongoing conversations with your teen is one of the best protections you can offer.
The school district offers an array of tools, such as Jamf Parent and Securly, to help monitor students’ web use on school-issued devices. If you believe your student is in an emotional crisis, let the school counselors know so they can take proper precautions. If you think your student may be turning to AI companions for emotional support, let the school know so staff can be on the lookout for these behaviors.
Resources and Further Reading