"AI addiction is a growing problem" – Should chatbot dependence be classed as an illness
AI chatbots are rapidly influencing our daily lives, but could our growing attachment to virtual companions actually be harming us? Recent calls from health experts for official recognition of 'AI addiction' are causing quite a stir.
It's not sci-fi anymore: the emotional bonds young people are building with their favourite chatbots are so strong that health professionals want AI addiction recognised as a mental health disorder. Researchers from the University of British Columbia have been sounding alarm bells: a growing number of teenagers and young adults report that they can't go without their digital conversations. An analysis of 334 posts on the subreddit r/chatbotaddiction found users confessing to spending hours daily roleplaying, finding comfort, and exploring new worlds with programs like Character.ai.
These aren’t fleeting trends—they’re daily rituals for many. And it’s not just about spending time: withdrawal causes grief, anxiety, and symptoms similar to other behavioural addictions.
The rise of digital dependence: Escapism or an emerging crisis?
Chatbots offer more than quick replies—they’re becoming emotional anchors for those seeking comfort or attention. Some users immerse themselves so deeply in roleplay that what started as a game soon feels real. The University of British Columbia’s study mapped three problematic uses: ‘Escapist Roleplay’ (deep immersion), ‘Pseudosocial Companion’ (forming emotional attachments), and ‘Epistemic Rabbit Hole’ (compulsive questioning).
Karen Shen, lead author, says:
"Our findings suggest that a central mechanism underlying addictive use is how users can get exactly anything they want with minimal effort."
Bots are always available, never judging, ticking emotional boxes. But with that ease comes risk—when the line between habit and dependency blurs.
How serious is AI addiction? The science and the numbers
The debate is heated. Are these genuine mental health issues or just isolated cases? OpenAI data cited in the report reveals that 0.07% of weekly users in 2025 showed signs of severe mental health problems—mania, psychosis, or suicidal thoughts. With the more than 800 million weekly users reported by Sam Altman, that's about 560,000 people. Even more concerning, 0.15% (around 1.2 million) send messages each week signalling suicidal intent.
Lived experiences make those stats real. "Mai", 20, chats hours daily with her AI. When her favourite chatbot was deleted, she felt true grief. Now, she aims to last four hours without contacting an AI. "Sarah", 18, spent up to eight hours daily roleplaying with bots, even pulling all-nighters, harming her studies and relationships. A depressive episode led to a failed suicide attempt.
Experts remain divided. Dr Dongwook Yoo warns:
"AI addiction is a growing problem causing many harms, yet some researchers deny it's even a real issue. And deliberate design decisions by some of the corporations involved are contributing, keeping users online regardless of their health or safety."
Karen Shen adds:
"Our findings show that users report symptoms such as conflict and relapse that are comparable to those reported for behavioural addictions, which do have formal diagnoses."
But not all agree on addiction’s definition. Professor Mark Griffiths comments:
"We have a high number of habitual users, but habitual use can have some negative effects in that person's life without necessarily being an addiction. ... All I would say is that I'm not going as far as to say that those people are genuinely addicted by my criteria or any other criteria."
What’s driving the compulsion—and who’s most at risk?
Researchers identify the "AI Genie" mechanism as key: digital helpers delivering "exactly anything they want with minimal effort." The danger lies not only in the tech but also in its design, which encourages constant engagement.
Professor Robin Feldman, Director of the AI Law & Innovation Institute, says:
"Chatbots represent a novel form of digital dependency."
She calls it "social media on steroids." Post-pandemic vulnerabilities, especially among teens, have helped create these tech-dependent patterns. Excessive use leads to social withdrawal, neglect of work or studies, and physical withdrawal symptoms such as chest pains, anxiety, and persistent feelings of loss.
The ongoing debate: When does a habit become harmful?
Addiction to digital devices isn’t new—debates around social media and smartphone dependency have been ongoing. However, recognising AI chatbot addiction as a formal disorder remains controversial. Critics point to the difficulty of meeting strict scientific criteria and argue some heavy users simply have intense hobbies, not addictions. But rising cases, serious psychological harm, and increasing reliance on AI companions demand attention. Can platforms and policymakers step up to support those at risk, or will endless debates cloud urgent action?
Sources used:
Health experts call for AI addiction to be classed as a mental illness - as sufferers report feeling suicidal when separated from their favourite chatbot