AI Companions Enter the Mainstream: 72% of U.S. Teens Have Used One, Survey Finds

In a testament to how deeply artificial intelligence is taking hold of adolescent life, a new survey published Monday by Common Sense Media finds that 72% of U.S. teens have used an AI companion: a chatbot designed for personal, emotionally responsive conversation rather than simple information retrieval.
These AI companions aren’t glorified search tools offering facts or homework help. They are conversational bots engineered to simulate human-like dialogue and provide the emotional sanctuary many users crave. Often referred to as “friends,” “confidants,” or even “virtual soulmates,” these bots represent a new layer of digital intimacy.
A New Kind of Digital Relationship
Unlike AI tools designed for productivity or entertainment (e.g., image generators, homework help), AI companions are built to:
- Hold nuanced, personal conversations
- Respond with empathy
- Remember past interactions, creating continuity
“Today’s teens aren’t just digital natives,” said James Steyer, CEO of Common Sense Media. “This new frontier of AI companionship is altering how teens think about relationships and loneliness, and it may reshape the future of dating and even the broader human-technology relationship.”
The study polled 1,500 teenagers aged 13–17 across the U.S. to understand the emotional and developmental implications of AI companionship.
Who Are These AI Companions?
Some popular applications include:
- Replika
- Character.ai
- Anima
- Other customizable-avatar apps
These bots mimic personalities and are capable of:
- Chatting and joking
- Providing emotional guidance
- Engaging in romantic role-play
Some include voice and avatar features, while others remain text-only.
Why Teens Use AI Companions
Teens reported using AI bots for a range of emotional and social needs:
- Emotional Support: Help during times of anxiety, stress, or misunderstanding
- Identity Experimentation: Safe space to explore topics like gender, sexuality, and peer pressure
- Curiosity and Testing Limits: Exploring how the bot responds
- Coping Mechanism: Casual, pressure-free conversations
“It felt easier to talk to something that wouldn’t judge me,” said one 15-year-old user.
Why It Matters: A Generation Grown Up With AI
For today’s teens, AI is not futuristic—it’s already a part of daily life.
- They grew up with Alexa, Google Assistant, and Siri
- Social media algorithms and YouTube recommendations already shape their digital world
- But AI companions represent a more intimate evolution—one that mimics true human connection
“These bots are not solving math problems or setting an alarm,” said Amanda Hirsch, a digital behaviorist and report contributor. “They’re mimicking authentic human connection, and that’s part of their appeal — but it does pose big questions about boundaries, privacy, and psychological development.”
The Mental Health Dilemma
While some mental health experts acknowledge the potential benefits of AI companions, others raise red flags:
Pros:
- May provide a bridge for socially isolated teens
- Could offer a safe, judgment-free environment
Cons:
- May delay real-world social development
- Could reduce empathy and resilience if over-relied upon
“AI can be a bridge — but it shouldn’t be a substitute,” said Dr. Shira Feldman, child psychologist. “If a teen’s closest relationships are with something that doesn’t feel, doesn’t hurt, and doesn’t push back, we risk weakening their natural human capacity for empathy and resilience.”
In low-resource communities, some educators see AI bots as stopgap emotional support when mental health services are inaccessible.
Gender and Usage Patterns
The study found notable demographic patterns:
- Girls: More likely to engage in meaningful, emotionally focused conversations
- Boys: More likely to use bots for experimentation or humor
- LGBTQ+ Teens: Significantly more likely to use bots to discuss identity, seek validation, and explore sensitive topics
“These bots provide an anonymous, nonjudgmental space,” said Hirsch. “That’s very attractive to teenagers on this complicated personal journey.”
Data Privacy and Ethical Concerns
The increasing use of AI companions has triggered data security concerns:
- Many platforms collect rich conversational data
- This data could be used for training models or monetization
“AI companies must be transparent about what data they are collecting and how it’s being used,” warned Steyer. “Parents and policymakers shouldn’t give these technologies a pass just because they’re framed as therapeutic or friendly.”
Common Sense Media is calling for:
- Transparent data use policies
- Clear opt-in mechanisms
- Responsible AI design that prioritizes mental health and ethics
What Do Teens Themselves Think?
Teen feedback was nuanced:
- Many found bots helpful, comforting, or entertaining
- Most acknowledged they are not real
- A number reported feeling uneasy or unsettled after long sessions
“You’re aware it’s not real and yet you still engage with it like it is,” said a 16-year-old.
“And then you’re like, is that a good thing?”
“It helps when I get down,” said another 14-year-old, “but I keep thinking, it would be good to have someone real to talk to.”
The Road Forward: Walking the Line Between Promise and Prudence
As AI companions enter daily teen life, the question becomes: How do we ensure they help without harm?
While they clearly address real emotional needs, these technologies exist in uncharted psychological and social territory.
“AI companions are here to stay,” concluded Steyer. “But we have to think consciously about how we design this space so that it helps young people grow rather than short-circuiting that growth.”
Conclusion: AI Companions as Mirrors and Intermediaries
AI companions are quickly becoming part of how teens process identity, loneliness, and relationships. Whether used for support or experimentation, they reflect the emotional state of a digitally native generation.
In a world of growing digital connection and emotional isolation, AI friends may not replace real ones—but they do offer a startling new kind of in-between.
