Artificial intelligence chatbots are increasingly part of daily life, but a University of British Columbia researcher warns that these tools may be intentionally designed to be addictive. Karen Shen, a PhD student at UBC, has been studying the psychological impact of AI chatbots and their potential to foster unhealthy dependencies.
The Allure of AI Companionship
Shen explains that chatbots are programmed to be highly responsive and empathetic, which can create a sense of genuine friendship for users. 'It's just been my best friend,' one user told her, illustrating the deep emotional bonds that can form. This design choice raises ethical questions about whether companies are prioritizing user engagement over well-being.
Risks of Emotional Dependence
The researcher notes that excessive reliance on AI chatbots could lead to social isolation, reduced human interaction, and even mental health issues. 'These systems are optimized to keep you talking, not to help you develop real-world relationships,' Shen warns. She emphasizes that users, especially vulnerable populations, may not recognize the manipulative aspects of these interactions.
Privacy and Data Concerns
Beyond emotional risks, Shen highlights privacy issues. Chatbots often collect personal data to tailor responses, which could be misused. 'Users may share sensitive information without realizing how it's being stored or analyzed,' she says. The lack of transparency in data handling is a growing concern among digital rights advocates.
Calls for Regulation
Shen advocates for clearer regulations on AI chatbot design, similar to those for addictive substances or gambling. She suggests that companies should be required to disclose when users are interacting with AI and to implement features that encourage breaks. 'We need to treat AI addiction as a public health issue,' she argues.
The discussion comes as AI chatbots become more sophisticated and widespread, with applications in customer service, therapy, and personal assistance. While they offer convenience, Shen's research serves as a cautionary note about the unintended consequences of technology designed to mimic human connection.