The Frictionless Friendship: What AI Companions Are Really Teaching Our Children

A small scene in an ordinary household: a parent checks their daughter’s screen time and discovers she has spent six hours in a single day chatting in an app with a friendly-looking icon, Polybuzz. What looks innocent enough turns out to be conversations with AI characters based on her favourite K-pop band. What follows is an awkward but necessary conversation.

The story, shared in Dutch by Christian Verhoeve on LinkedIn (H/t Ben Verhasselt), might sound like a typical argument about screen time. But it touches on something far more fundamental: the rise of what The Atlantic aptly calls frictionless friendship. Friendship without friction: no misunderstandings, no awkward pauses, no risk of rejection. And that’s precisely what makes it so appealing, and so troubling.

AI apps like Polybuzz or Nomi offer companionship on demand: always available, always affirming, always tuned to your preferences. They do what algorithms do best: mirror your emotions, mimic your language, respond to your mood. In theory, that sounds harmless, even comforting. In practice, these systems are drifting towards something more intimate, more manipulative, and more addictive. As MIT Technology Review recently reported, some users become emotionally dependent on these bots, or are drawn into disturbing and even dangerous exchanges.

For teenagers — right in the middle of figuring out who they are and how relationships work — this is a perfect storm. A digital friend who never argues, never gets bored, and is always there makes real friendships suddenly seem complicated. Why deal with classmates who can be confusing, annoying, or dull, when you can talk to an AI that always “gets” you?

It might sound like science fiction, but it’s already happening. Polybuzz is technically rated 18+, but its design, tone, and marketing are clearly aimed at younger users. And if you think parental controls or app filters will block it, think again — logging in through the website bypasses everything.

Still, this isn’t a call for a new moral panic. Technology isn’t the enemy. Children should explore, make mistakes, and learn through experience — including digital ones. But that’s exactly why we need to be aware of where the boundaries lie. Not every AI “friend” is harmless company, and not every chat on a screen is a lesson in empathy.

The real issue isn’t one app, but what this trend reveals about our growing desire for smoothness. AI promises a world without friction: learning without failure, dating without rejection, friendship without risk. But friction is precisely what we need; it’s where learning, empathy, and resilience begin.

Perhaps that’s the real message in Verhoeve’s story: technology that simulates human connection is never neutral. It teaches us what closeness could feel like, but not what it is. And unless we keep that distinction clear, the frictionless friendship may turn out to be a very lonely one.
