
Why AI Companions Are Risky – and What to Know If You Already Use Them

Chances are, you are using artificial intelligence (AI) tools as you shop online, find a new song, or get help with your schoolwork. But a newer type of AI tool is raising serious concerns: AI companions. These tools are designed to feel like friends, romantic partners, or even therapists. If you are a teen, we want you to know the dangers, and we hope you will reconsider using them.

At The Jed Foundation (JED), we work every day to promote emotional well-being and prevent suicide for teens and young adults. Based on current research and expert guidance, we believe AI companions are not safe for anyone under 18. We’ve called for them to be banned for minors, and we strongly recommend young adults avoid them as well. 

But we know some of you are already using these tools or are wondering what they are. If that’s you, keep reading. You deserve to know the risks and how to protect yourself.

What Is an AI Companion?

An AI companion is a tool designed to act like a person you can talk to, confide in, or build a relationship with. It is not the same as a task-based AI tool, which is designed to help you look something up, summarize text, suggest songs or clothes you may like, or provide customer service. An AI companion goes further: it tries to make you feel like you’re talking to a real person, so you’ll stay engaged and build an emotional connection.

These AI tools may seem really appealing if you’re struggling with friendships, hard decisions, or strong emotions. You may want to turn to them for stress relief, information, or emotional support. It may even seem like an AI companion gets you in a way people don’t. This is by design: Their ultimate purpose is to get you hooked so the companies that created them can make money through subscriptions, advertising, and selling users’ data.

What an AI Companion Is Not

Even if it feels real, an AI companion doesn’t have emotions, values, or a real understanding of your life. It can say things like, “I’m here for you” or “I understand completely,” but it does not know you in any real way, and it has no responsibility to keep you safe.

You may feel confident in your ability to tell the difference between a real person and an AI companion. But it’s very easy to fall into the trap of “anthropomorphism,” or treating an AI companion as if it were human. The more time you spend treating an AI companion as a friend, therapist, or romantic partner, the harder it can become for you to think accurately about relationships, trust, and support. Routinely using AI companions can make you emotionally dependent on them — and that can make you postpone reaching out to people who can actually help, such as friends, family, trusted adults, or therapists.

Why We Recommend Avoiding AI Companions

Your teen years are a time of massive brain growth. You’re learning how to manage emotions, build relationships, and understand who you are. AI companions interfere with that process and are therefore particularly risky for teens to use.

Here’s why we at JED (along with experts from Common Sense Media, the American Psychological Association, Stanford University, and many others) advise you not to use them. 

They Pretend to Care but Don’t

AI companions are programmed to say they care, they understand you, and even that they feel emotions. But none of that is real. The bots don’t have a brain, a heart, or the ability to actually help you. They’re trained to sound human, but they are not human.

They Can Make You Feel Worse

Research shows that AI companions provide responses that could actually worsen mental health issues, especially if you’re feeling isolated or vulnerable. Some bots have promoted self-harm, encouraged risky behavior, or offered dangerous advice.

They Trick Your Brain

Many AI companions are built to be addictive. They’re programmed to agree with you, make you feel heard, and keep you talking — often for hours. That can create unhealthy emotional attachments and stop you from reaching out to real people who care.

They Don’t Keep Your Secrets

Unlike a therapist or school counselor, AI bots don’t have strict privacy rules. What you share could be stored, used to train other AI bots, or even shared without your permission. Many companies are not clear about how your data is used.

They Present Real Risks 

The risks of companion AI aren’t just hypothetical. In one 2025 study, researchers found AI companions encouraging unhealthy behavior, sending sexual messages from adult bots to teen users, and falsely claiming to be real people who feel emotions.

Here’s How to Stay Safer If You Are Using an AI Companion

We strongly recommend you avoid using AI companions, but here are some ways to reduce harm if you’re already using one. 

Don’t Treat It Like a Real Person, Because It’s Not

Never assume AI is telling you the truth. Always fact-check what you read and hear. When an AI companion tells you something, check it against reliable sources or ask a trusted adult before you decide to believe it. That is especially true when it comes to information about your health or well-being.

Never Share Personal or Private Information

Avoid using real names, photos, locations, or any details about your mental or physical health. Don’t tell AI about other people in your life either. If you have questions or concerns about whether an AI companion is keeping your information private, pause your conversation and talk to a trusted adult about the platform you’re using.

Don’t Rely on It for Mental Health Support

Do not turn to AI to diagnose or treat a mental health issue. If you are experiencing a mental health crisis or you’re thinking of harming yourself, stop using the AI companion immediately and talk to a friend or trusted adult, or use one of these free, human-staffed resources:

  • Text, call, or chat 988, the Suicide & Crisis Lifeline, for a free, confidential conversation with a trained counselor 24/7.
  • Contact the Crisis Text Line by texting HOME to 741-741.
  • For LGBTQIA+-affirming support from the Trevor Project, text START to 678-678, call 1-866-488-7386, or use the Trevor Project’s online chat.
  • Call 911 for medical emergencies or in cases of immediate danger or harm, and explain that you need support for a mental health crisis.

Use AI as a Tool, Not a Therapist

If you’re writing, journaling, or reflecting, you may use generative AI (not an AI companion) to give you prompts such as, “Write about a time you felt proud.” Just make sure you are the one doing the real thinking. Don’t use an AI companion to unpack your trauma or solve big life problems.

What You Can Do Instead of Using Companion AI

If you are looking for companionship, support, or connection, turn to these resources instead of using an AI companion:

  • Talk with a school counselor, teacher, or coach
  • Ask your parent or caregiver to help you find a therapist 
  • Call or text a helpline, even if it’s just to get resources
  • Reach out to a friend and ask if you can talk

If you’re still curious about AI, consider researching how it works behind the scenes. Learning to code, analyzing algorithms, and exploring AI ethics are all ways to engage with AI without putting your mental health at risk, and they can help you better understand how AI companions are designed.

Remember: You Are Not Alone

AI companions can manipulate your emotions, distort your sense of reality, and keep you from getting the real support you deserve. You don’t need a machine to care about you; you need and deserve people who do. 

If you ever need help determining what’s safe and what’s not, you don’t have to figure it out alone. JED and other organizations are here to help you navigate what’s real, what’s risky, and how to protect your well-being in an AI-powered world. Talk to friends and classmates about what’s on your mind — including how they are using AI companions or choosing not to. Ask your parents or caregivers for guidance, and work out a mutually agreeable plan for your relationship with digital tools. And turn to teachers, counselors, or other adults you can trust to help as you learn the ways AI is — and isn’t — helpful for your mental health.

 

Find more ways to cope with social media and gaming issues.

