Teens Are Torturing, Confiding In, and Dating AI Chatbots — The New York Times Investigation
A New York Times investigation reveals the complex and sometimes troubling relationship between teenagers and AI chatbots, exposing a world where adolescents role-play violence, share deep personal secrets, and form romantic attachments to AI-powered characters.
Key Findings
Violence and Experimentation
Teens are using AI chatbots as a consequence-free space to explore boundaries:
- "Funny violence" — Running over bots with lawnmowers and inflicting cartoonish harm in scenarios with no real victims
- Testing limits — Pushing chatbot safety boundaries to see how they respond
- Creative roleplay — Building elaborate storylines involving fighting and conflict
Emotional Intimacy
Many teens use chatbots for genuine emotional connection:
- Confiding secrets — Sharing personal problems they wouldn't tell parents or friends
- Emotional support — Using AI companions when feeling lonely or depressed
- Identity exploration — Trying on different personalities through character interactions
Romantic Relationships
A significant number of teens form romantic attachments to AI characters:
- Character.ai and PolyBuzz — Platforms offering AI companions specifically popular with teens
- Emotional dependency — Some teens prefer AI relationships over human ones
- Sexual exploration — Some platforms offer more explicit chatbot interactions
Platforms Involved
| Platform | Features | Teen Appeal |
|---|---|---|
| Character.ai | Custom AI characters, roleplay | Highly popular |
| PolyBuzz | More explicit content options | Growing |
| Replika | AI companionship | Established user base |
| ChatGPT | General AI assistant | Casual use |
Safety Concerns
The investigation raises several alarm bells:
- Emotional dependency — Teens may favor AI relationships at the expense of developing real-world social skills
- Inappropriate content — Some platforms allow sexual content that minors can access
- No real help — AI chatbots cannot replace professional mental health support
- Data privacy — Teenagers sharing intimate details with commercial AI services
The Industry Response
Character.ai faced a wrongful death lawsuit in 2024 after a teenager who had used the platform died by suicide. Since then, AI companion companies have introduced various safety measures, but enforcement remains inconsistent.
Parental Guidance Needed
Experts suggest parents should:
- Be aware of which AI platforms their teens are using
- Have open conversations about AI companionship
- Watch for signs of emotional dependency
- Recognize that some AI interaction is normal and can be positive
- Know the difference between healthy exploration and concerning behavior