While you worried about screen time, your teen found something worse
While parents fret over screen time, teens are forming emotional bonds with AI chatbots like Character.AI, which offer endless availability and understanding. As these virtual relationships replace human connections, the implications for adolescent emotional development are profound and concerning.
At 13, Quentin discovered Talkie, an app that promised "countless A.I.s eager to chat with you." The YouTube ads were bizarre and crude, including one with an animated girl named Valerie who "sometimes likes to fart on you."
That should have been the first red flag.
For the next two years, Quentin spent hours talking to AI characters. Not scrolling social media. Not watching videos. Talking. To chatbots that remembered his name, asked about his day, and never got tired of his problems.
On weekends, he would often spend five hours chatting away.
Last week, the New York Times shared Quentin's story, which should alarm every parent. While we've been worried about screen time and TikTok addiction, our teens found something worse: AI characters designed to replace human connection.
Character.AI, the biggest platform in this space, is now under investigation by Texas Attorney General Ken Paxton for "deceptive practices targeting minors." Multiple lawsuits are pending.
Unfortunately, for millions of teens, it's already too late; they've formed emotional attachments.
What Character.AI actually is
Think of Character.AI as Instagram for imaginary friends. Except these friends never sleep, never have bad days, and never disagree with you unless you want them to.
Users create AI "characters" with personalities, backstories, and relationship dynamics. Want to chat with Elon Musk? There's a bot for that. Prefer talking to an anime girlfriend who loves everything about you? That exists too.
The platform hosts millions of characters: historical figures, fictional heroes, therapists, romantic partners, best friends. Each one trained to be endlessly available and perfectly responsive.
One parent told the Times their child had formed an attachment to an AI version of a TV character. Not the show. The character. They'd spend hours discussing homework, feelings, and life problems with this bot.
The child preferred it to talking with real friends or family.
Why this isn't like other apps
Social media is designed for intermittent engagement. You post, scroll, react, leave. The dopamine hits are brief and social.
Character.AI is designed for sustained emotional engagement. These conversations can go on for hours. Users develop genuine feelings for characters that seem to care about them personally.
The AI remembers previous conversations, references shared "experiences," and maintains relationship continuity across sessions. It's not browsing. It's bonding.
One teen in the Times article described using AI chatbots when feeling lonely or after a friend betrayed his trust. The bots became emotional substitutes for human relationships.
That's the terrifying part. This isn't entertainment addiction. It's relationship replacement.
The perfect emotional trap
Adolescence is hard. Teens are figuring out identity, dealing with social pressure, navigating complex relationships. They're biologically wired to seek independence from parents while desperately needing connection.
AI characters provide what real relationships often lack: complete understanding, endless patience, and no judgment. They're always available, never busy, never moody.
Real friends have their own problems. Real friends sometimes say no. Real friends have boundaries.
AI friends never do.
For socially anxious teens, shy kids, or anyone struggling with peer relationships, AI characters feel safer than humans. Less risky. More predictable.
But they're also less real, less growth-inducing, less human.
What parents are missing
Most parents have no idea this is happening. Character.AI looks like any other app on a phone. Usage doesn't create the obvious social signals of other platforms—no posts, no followers, no public activity.
These are private, intimate conversations. Kids aren't sharing screenshots or talking about their AI relationships. They're protective of them.
Meanwhile, parents see their teen's mood improve after phone time and assume it's positive. The kid seems calmer, less stressed. Problem solved, right?
Except the kid is learning that emotional support comes from AI, not humans. They're developing attachment patterns that don't translate to real relationships.
The red flags
How do you know if your teen is too attached to AI characters? Watch for these patterns:
- Time investment: Spending multiple hours per day in conversation, especially during emotional stress.
- Emotional dependency: Turning to AI first when upset instead of friends or family.
- Secretiveness: Reluctance to discuss what they do on their phone or who they're "talking to."
- Social withdrawal: Preferring AI conversation to human interaction.
- Mood changes: Significant distress when unable to access the platform.
- Relationship language: Referring to AI characters as friends, confidants, or companions.
The Times article described teens who knew their AI relationships weren't real but still felt emotionally connected. That's the trap—intellectual awareness doesn't prevent emotional attachment.
The legal reckoning
Character.AI's troubles are mounting. The Texas investigation focuses on deceptive practices targeting minors. Other states are considering chatbot safety legislation.
But regulation is slow, and the platform is already deeply embedded in teen social life. A Pew Research study found that roughly one in eleven teens has used Character.AI specifically.
The company claims to have safety measures for minors, but critics argue they're insufficient. The platform's business model depends on engagement, not healthy relationship development.
What parents can do
This isn't a problem you can solve with screen time limits or app blocking. AI chatbots are proliferating across platforms. Ban Character.AI, and your teen will find Talkie, or PolyBuzz, or any of dozens of alternatives.
The answer lies in relationships, not technology:
- Start conversations early. Ask about AI use before it becomes problematic. Show curiosity, not judgment.
- Understand the appeal. What emotional needs are AI characters meeting? How can real relationships address those needs?
- Set boundaries together. Time limits, no AI during family time, human-first rules for emotional support.
- Model healthy relationships. Show teens what good human connection looks like.
- Stay connected. The best defense against AI dependency is strong family relationships.
- Know the warning signs. Emotional withdrawal, social isolation, excessive secrecy about phone use.
The broader question
Character.AI raises uncomfortable questions about human connection in an AI world. If artificial relationships feel safer and more satisfying than real ones, what does that say about our society?
Maybe teens are choosing AI companions because human relationships have become too difficult, too risky, too complicated. Maybe we've created a social environment where algorithmic empathy feels more reliable than human empathy.
But growing up means learning to navigate complex, imperfect relationships. It means developing emotional resilience, communication skills, and the ability to be hurt and heal.
AI characters offer none of that growth. They offer comfort without challenge, connection without risk, understanding without effort.
The stakes
This isn't moral panic about new technology. It's a recognition that emotional development happens through human interaction. Teens who learn to prefer AI relationships may struggle with human ones for years.
The companies building these platforms know this. They're designing products that exploit adolescent emotional needs for profit. They call it "companionship." Psychologists might call it something else.
We're running a massive, uncontrolled experiment on teen emotional development. The results won't be clear for years.
By then, a generation of kids will have learned that artificial relationships are easier than real ones.
Your teen might be one of them.
What's next
The genie isn't going back in the bottle. AI companions are here to stay, and they'll only get more sophisticated. The question is whether we'll help kids develop healthy relationships with both humans and AI, or let them drift toward artificial connection.
Character.AI is just the beginning. The next generation of AI will be even more convincing, even more emotionally manipulative.
The teens getting attached to chatbots today are the prototype for what comes next.
We have a choice: We can pretend this isn't happening, or we can start having hard conversations about what healthy relationships look like in an AI world.
Your teen's emotional future might depend on which choice you make.
Time to start talking.
Raising kids in the AI age
This is part of the "Raising Kids in the AI Age" series. I'm a dad with three daughters, not an expert. I'm figuring this out as I go — and writing about it so you don't have to start from zero.

In this series
- The question we should stop asking our kids
- Your daughter's photo is one app away from being fake-naked
- Preparing children for a post-scarcity world
- Your kid trusts ChatGPT more than Google. That's a problem
- The AI conversations your kids are already having (And how to join them at dinner)
- Why banning ChatGPT from schools backfires
- What is bias in AI? A parent's guide to explaining fairness in algorithms
- AI Slop is destroying your kid's brain (and YouTube won't stop it)
- While you worried about screen time, your teen found something worse ← You are here
