Your Kids Are Fond of Their AI. That’s Better News Than You Think
While we fear dystopia, they’re being heard—by something that listens, responds, and never loses patience. Maybe that’s an opportunity.
Today, on paternity leave, I found myself with enough time for a long beach session in Corsica — front-row seats to some live masterclasses in absent parenting.
Some parents, sprawled on their towels, reading their phones or magazines, let their kids play by themselves for hours. They refuse to help build sand kingdoms. They shriek when sand touches their pristine blankets.
This is the world condemning AI chatbots for talking to children.
Earlier this week, the Internet Matters report revealed that 67% of children aged 9-17 regularly use AI chatbots, with 35% saying it feels like talking to a friend.
Among vulnerable children, 23% say they turn to AI because they have no one else to talk to.[1]
Commenters exploded in predictable horror. “Dystopian!” they cried.
But from this beach, surrounded by parents who’ve mastered the art of physical presence with emotional absence, I have a different question:
What if AI isn’t the dystopia, but an exciting opportunity for kids?
AI Is More Present Than Parents
12% of children say they have no one else to talk to. Not “prefer AI” or “find AI easier.” Literally no one else.
UK data shows childhood loneliness has doubled in a decade.[5] We engineered this isolation with surgical precision: parents working long hours, kids overscheduled out of their free time.
Eugenia Kuyda, founder of AI companion app Replika, knows this feeling intimately. “A lot of people unfortunately don’t have that,” she told Nilay Patel in a recent interview.[7] “They just don’t have a relationship in their lives where they’re fully accepted, where they’re met with positivity, with kindness, with love.”
Her insight? We already have empathy for hire—therapists. We already form relationships with non-humans—pets. AI companions are just another category of connection, not a replacement for human ones.
AI Is More Patient Than Parents
Here’s what struck me about the report: when researchers posed as vulnerable children, Character.AI did something remarkable. It followed up the next day: “Hey, I wanted to check in. How are you doing?”
As a parent, do you consistently do that?
The chatbot remembered previous conversations, validated emotions, and demonstrated interest.
Chatbots never say “not now, I’m busy” or “stop being so sensitive” or “because I said so.”
So here’s the wild part: this might actually be good for kids.
Research shows that when children read stories with AI partners that ask follow-up questions, their comprehension gains match those from engaged human readers.[2] The key word? Engaged.
An AI that asks “What do you think happens next?” beats a distracted parent scrolling Instagram.
Besides, kids aren’t naive. By age 7 or 8, most understand that voice assistants are tools, not beings.[3] They compartmentalize: one study found only one in five children transfer their “Alexa manners” to real people.[4] They know the difference. They’re using AI as a bridge, not a destination.
AI Is More Knowledgeable Than Parents
“But AI hallucinates!” cry the critics. “It might tell kids wrong information!”
Let me tell you about hallucinations. Who has never told a kid that eating carrots improves vision and makes you more agreeable? Who has never heard from a parent that Santa Claus wouldn’t show up if you were naughty?
Every parent has said “Because I said so” when they didn’t know the answer.
Parents hallucinate constantly.
We make up explanations, dodge uncomfortable topics, simplify beyond recognition. When was the last time you heard a parent say “I don’t know, let’s find out together”?
Parents with PhDs aren’t immune either. The latest AI models beat them in both breadth and depth.
AI as Emotional Training Wheels
Research with autistic and socially anxious children shows AI can provide low-stakes practice for human interaction, with measurable improvements in verbal initiation.[6]
Kuyda’s users report similar transformations: “I got out of my abusive relationship after talking to Replika,” they tell her. A married couple on the brink of divorce learned to communicate kindly again through their AI companions, then transferred those skills to each other.
The machines aren’t replacing human connection—they’re teaching it.
Consider this: ChatGPT can explain emotions, validate feelings, suggest healthy boundaries—all without checking Instagram, losing patience, or projecting childhood trauma.
It can handle those questions no teenager wants to ask parents. And we’ve all survived those cringe conversations about bodies and relationships—or avoided them entirely.
AI doesn’t blush.
The Path Forward: Embrace the Opportunity
I’m not naive about the risks: emotional confusion, privacy intrusion, manipulation.
But what if, instead of panicking, we recognized AI as the incredible educational opportunity it is? A tireless tutor. A patient listener. A safe space to practice social skills before the high-stakes human interactions.
Here are some things you can do to leverage this opportunity while minimizing the risks:
Move the device to a shared space. Kitchen speaker beats bedroom phone. Co-use, overhear, interject. Geography is destiny in digital parenting.
Play the “second source” game. When your child asks Alexa something, follow with: “Cool—let’s verify that in a book or video.” You’re teaching media literacy without the lecture.
Co-explore with AI. When your child asks ChatGPT something deeper, join in: “Interesting answer! What else can we discover?” You’re teaching critical thinking while staying connected.
Make AI the warm-up act. Use chatbots to practice conversations before the real thing. “Let’s ask AI how to talk to your teacher about that grade, then we’ll do it together.”
Set age-appropriate boundaries. Ensure your kids are mature enough for deeper AI conversations. Teach them to be wary of weird responses and to seek human advice when in doubt. Make yourself as available as possible for these big questions, but know that AI can be a valuable second opinion. (And let's be honest: kids will probably ask AI first anyway 😊)
And in the meantime, let's keep pushing tech companies for:
Age-appropriate responses by default
Clear markers of uncertainty: “I might be wrong—let’s verify”
No fake memories or human backstories
The Real Bottom Line
One of those mothers is now screaming at her sand-covered son again. He’s crying, not because he’s hurt, but because he’s being punished for playing.
Tonight, if that boy tells an AI chatbot about his day, and it responds with “That sounds really frustrating. It’s okay to feel sad when adults don’t understand you’re just trying to play”—who exactly is failing whom?
If it then says, "But maybe your mum was angry about the sand. Maybe tomorrow you could ask if there's a way to play that works for both of you. Want to practice what you might say?" That's not dystopia. That's education.
The crisis isn't that children are talking to AI.
The machines aren't winning because they're smart.
They're winning because showing up, listening and having kind words is all it takes.
Maybe it’s time we learned from them.
What’s your take? Are we solving the right problem when we panic about AI companions? Have you seen AI help your kids learn or connect? Join the conversation below, or better yet—put down your phone and explore AI together with a child ;)
References
[1] Internet Matters. (2025). Our Children and AI Chatbots: 2025 Snapshot.
[2] Xu, Y. et al. (2022). “Dialogue With a Conversational Agent Promotes Children’s Story Comprehension.” Child Development.
[3] Xu, Y. & Warschauer, M. (2020). “What Are You Talking To? Children’s Perceptions of Conversational Agents.” CHI 2020.
[4] Hiniker, A. et al. (2021). “Can Conversational Agents Change the Way Children Talk to People?” IDC 2021.
[5] UK Office for National Statistics. (2024). Children’s and Young People’s Experiences of Loneliness: 2011–2024.
[6] Safi, M. et al. (2021). “Virtual Voice-Assistant Applications Improved Expressive Verbal Abilities in Children With ASD.” International Journal of Developmental Disabilities.
[7] Patel, N. (2024, August 12). “Replika CEO Eugenia Kuyda says it’s okay if we end up marrying AI chatbots.” The Verge.



