Amy Molloy
13 February 2026, 3:00 AM

One minute, you’re asking AI to spellcheck your Word document — the next minute, you’re asking a faceless robot to soothe your soul, validate your feelings, and talk you through a 2am spiral.
According to data from OpenAI, the company behind ChatGPT, users send more than 2.5 billion prompts to the platform every day.
For many people, using AI is a slippery slope, from a first crush to co-dependency. Now experts are warning that, although AI has its place, as with all relationships, it’s important to have boundaries.
A recent study from MIT Media Lab reported that “excessive reliance on AI-driven solutions” may contribute to “cognitive atrophy” and the shrinking of critical thinking abilities — essentially, the more we outsource our thinking, the less we practise it ourselves.
A new study published in the Harvard Business Review warned that AI-generated ‘workslop’ is destroying productivity. The phrase is being used to describe AI-generated content that “masquerades as good work but lacks substance.”
And then there are the downsides of “AI companions”: the rise of people using artificial intelligence as their confidante, therapist and pseudo-partner.
Research by Common Sense Media, a US-based non-profit, has found that approximately three in four US teens have used AI companion apps, such as Character.ai or Replika.ai. Their data is striking: one in three teens has used AI companions for social interaction and relationships, including role-playing, romantic interactions, emotional support, friendship, or conversation practice.
“AI companions are emerging at a time when kids and teens have never felt more alone,” said Common Sense Media Founder and CEO James P. Steyer.
“This isn’t just about a new technology — it’s about a generation that’s replacing human connection with machines, outsourcing empathy to algorithms, and sharing intimate details with companies that don’t have kids’ best interests at heart.”
However, it’s not all bad.
For every warning, you’ll hear an anecdote of an AI user who feels it has genuinely supported their life: offering words of comfort in the midst of a mental health spiral, advice for eating healthier, or a non-judgemental space to process big emotions.
For most people, it comes down to balance.
As a writer, I use AI as an editorial assistant: an intern who drafts my social media captions and tells me what’s trending but isn’t trusted with content I care about.
After a brief stint using AI as a therapist, I’ve returned to human-generated empathy.
I’m happy for AI to spellcheck my articles but not hold my heart.