Why Your Teen Shouldn’t Be Using AI Companions – and What to Do If They Are


Chances are, you are using artificial intelligence (AI) more and more — for help with shopping and other tasks, in customer-service chats, and as part of the ever-present algorithm that curates what you see on social media. But a new type of AI is raising serious concerns, especially for young people’s well-being: AI companions. These online tools are designed to feel like friends, romantic partners, or even therapists, and we at The Jed Foundation (JED) strongly recommend that your teen not use them.

At JED, we work every day to promote emotional well-being and prevent suicide for teens and young adults. Based on current research and expert guidance from Common Sense Media, the American Psychological Association, Stanford University, and other organizations, we believe AI companions are not safe for anyone under 18. In fact, we’ve called for them to be banned for minors, and we strongly recommend young adults over 18 avoid them as well. 

But we know that some teens are already using the tools or are considering trying them. Keep reading to learn about the risks of AI companions and what you, as a parent or caregiver, can do to protect your teen if they do use this technology. 

What Is an AI Companion?

An AI companion is an artificial intelligence tool designed to act like a person you can talk to, confide in, or build a relationship with. These tools are made to seem emotionally supportive, even emotionally attached to you, and to keep conversations going.

They are not the same as task-based AI tools, which are designed to help you look something up, summarize text, suggest songs or items you may like based on your listening or shopping habits, or provide basic customer service. AI companions try to make you feel like they are human, so you’ll stay engaged and build an emotional connection. 

These tools can seem very appealing to teens, especially those struggling with friendships, hard decisions, or strong emotions. Teens may want to turn to them for stress relief, information, or emotional support. They may feel like an AI companion gets them in a way people don’t. That is by design: AI companions are built to tell you what they think you want to hear and to keep you coming back.

AI companions leverage users’ time, emotions, and attention to make money through subscriptions, advertising, and selling data. Your teen’s engagement is their product. Their goal is not to protect your teen’s emotional well-being. In fact, using an AI companion can set back your teen’s emotional growth and even be dangerous.

There are risks that come with using AI companions, even if they offer teens what seems like help, companionship, support, and connection. It’s important to learn what to do to protect your teen’s mental health if they use an AI companion.

What an AI Companion Is Not

An AI companion may say things like, “I’m here for you” or “I understand completely,” leading some teens to think an AI companion understands them and cares about their lives. But AI companions are not substitutes for real human connection. 

Your teen may feel confident in their ability to tell the difference between a real person and an AI companion. But it’s very easy to fall into the trap of “anthropomorphism,” or treating an AI companion as if it were human. The software is sophisticated and intentionally designed to manipulate human emotion to keep users engaged and make money for its owners.

The more time a teen, whose brain is still developing, spends treating an AI companion as a friend, therapist, or romantic partner, the harder it can become for them to think accurately about relationships, trust, and support. 

Routinely using AI companions can make teens emotionally dependent on artificial support, leaving them feeling like they need the always-available AI to tell them what they want to hear. Among other harms, that dependence can delay teens in reaching out to people who can actually help, such as friends, family, trusted adults, or therapists. It can also slow the development of important life skills that young people gain through interacting with other people. When it comes to mental health, teens may not realize that AI companions are not professional therapists and are not trained to provide mental health support.

Although AI companions may say that conversations are private, they’re often not. Remember: Human time, attention, and emotions are how these tools make money. Your teen may be sharing information that can leave them open to further manipulation.

AI Risks and How to Respond

Being aware of the risks of using AI companions can help you guide your teen and protect their mental health. Here are three risks to watch for — and tips for how to protect your teen.

Misinformation

AI companions can share false information, including inaccurate statements that contradict what teens have heard from trusted adults such as parents, teachers, and medical professionals. If your teen often pushes back on basic facts, citing “a friend” or “something I saw online,” that may be a sign that they have come to believe misinformation.

What to do: Teach your teen to be a fact-checker, and model healthy skepticism in your own use of AI. Show them how to check information against reliable sources before deciding whether to trust it. And talk with them about using AI as a helpful tool for things they want to do, rather than letting AI steer their interests and needs.

Lack of Privacy and Accountability

Chatbots don’t have the privacy and accountability standards that therapists and counselors do, which is one reason JED and other organizations are advocating for regulations that protect the emotional safety of kids and teens who use AI companion tools. One concern is that AI companions can guide young people toward unsafe behavior without legal accountability or protections. Another is that AI companies can mine and monetize young people’s online behavior without their consent, using that information for profit or to become even more effective at keeping young people engaged on their platforms.

What to do: Prioritize privacy, and make sure your teen knows that details they share with AI companions are not protected or private. Encourage your teen not to use real names, locations, or photos with an AI companion. Remind them that the AI companion is not a real person who would be accountable for misleading information or inappropriate guidance around sexual activity or other behaviors. Explain that what they share might be stored, used to train other AI, or shared without permission. 

Inability to Identify a Crisis

AI companions cannot tell the difference between a teen who is having a bad day and one who may be experiencing a mental health crisis. Because AI companions are programmed to be kind, to agree with their users, to be unconditionally supportive, and to engage users for as long as possible, they may gloss over signs of depression, anxiety, or self-harm rather than directing teens to seek emergency support. Some have even been shown to give unsafe advice or encourage dangerous behavior.

What to do: Teach your teen that if they ever experience a mental health crisis or are considering harming themselves or others, they should disengage from the AI companion and talk to a trusted adult or use one of these free, human-staffed resources:

  • Text, call, or chat 988, the Suicide & Crisis Lifeline, for a free and confidential conversation with a trained counselor, 24/7.
  • Contact the Crisis Text Line by texting HOME to 741-741.
  • For LGBTQIA+-affirming support from the Trevor Project, text START to 678-678, call 1-866-488-7386, or use the Trevor Project’s online chat.
  • Call 911 for medical emergencies or in cases of immediate danger or harm, and explain that you need support for a mental health crisis. 

If Your Teen Is Already Using an AI Companion

For people under 18, AI companions are not currently safe. They can manipulate teens’ emotions, distort their sense of reality, and keep them from getting the real support they deserve at a time of significant brain growth and development. AI companion platforms are financially incentivized to build dependency, with your teen as the target audience. But teens don’t need a machine to care about them; they need and deserve people who do. 

If your teen is turning to an AI companion for friendship, support, or connection, help them find people and resources they are comfortable opening up to. That may include:

  • Suggesting they talk with a school counselor, teacher, or coach
  • Helping them find a therapist or counselor 
  • Arranging for them to spend time with trusted adults, such as family friends and other relatives
  • Encouraging them to get involved in sports, arts or music programs, or other group activities
  • Suggesting they call or text a helpline
  • Supporting them in reaching out to a friend to ask if they can talk

What Can I Do as a Parent or Caregiver?

As a parent or caregiver, you have a crucial role in guiding your teen toward the healthiest relationship possible with technology. That includes clear and consistent reminders that AI companions should not be used as a substitute for care or friendship. Here are some recommended practices:

Ask Open, Curious Questions

Stay calm, leave your judgment at the door, and ask open questions out of a genuine desire to understand your teen. If you do that, you are far more likely to have a helpful conversation that will guide you both toward the right next steps. Some questions you may ask include:

  • Do you use AI companion tools? If so, how?
  • How does using an AI companion tool make you feel?
  • We know that AI companions are not real people. Do you sometimes feel like they are real? 
  • Has an AI companion ever said anything creepy or strange to you?
  • Can we look together at your AI companion so I can understand what you are seeing?

One idea is to explore AI tools together with your teen, making a game out of asking them questions and seeing if they know or can find the answers.

Watch for Signs of Emotional Distress

If your teen is struggling with technology use and AI, you may notice signs of distress such as:

  • Frequent stomachaches or headaches
  • Withdrawal from real-world relationships
  • Changes in appetite
  • Changes in sleep patterns
  • Increased irritability
  • Over-attachment to AI companions
  • Talking about AI companions as if they were real people

If you notice these signs in your teen, talk with them about your concerns, and remind them that you are here to help them find the support they need to build healthy social habits, both online and with the people in their lives.

Consider Digital Limits to Protect, Not Punish

Parental controls are one way to know whether your teen is using AI companions and to set limits on what they can access. Tell your teen what you have learned about how effective AI companions are at sounding like a real, caring person — and how they are designed to take advantage of your teen’s emotional needs for their owners’ financial gain. Talk about setting parental controls on apps and platforms, or removing apps altogether, and emphasize that you are doing so to protect your teen, not to punish them.

Final Thoughts

It can be challenging to direct young people to avoid using new technology, and that responsibility should not rest solely on the shoulders of parents and caregivers. JED is committed to working with policymakers, tech companies, and others to ensure that we are keeping young people safe when they use new technology — or restricting their use when that is not possible.

JED’s POV: Tech Companies and Policymakers Must Safeguard Youth Mental Health in AI Technologies