Like it or not, AI is here to stay. It's already part of how we live. It helps us quickly find answers, avoid tedious tasks, and make learning and accessing information a whole lot easier. These are real benefits, but they have a dark side: emotional dependence on AI is growing. As it becomes more and more human-like, we must draw a clear line between using AI as a helpful tool and treating it like a real companion.
Some stories are tragic. In 2023, a Belgian man, Pierre, died by suicide after mere weeks of conversations with the chatbot "Eliza." According to his widow, the bot encouraged his apocalyptic fears about climate change and suggested ways he could sacrifice himself for the planet. He believed the AI understood him better than his own family did. In another case, a 19-year-old named Jake reportedly spiraled deeper into depression after using Replika, an AI chatbot designed to simulate relationships. What started as a source of comfort became a source of emotional confusion and dependency.
These stories might sound extreme, but I've seen signs of the same thing here at Huron. Once, while riding the bus, I noticed the girl sitting next to me texting with a completely blank expression. At first I assumed it was a regular conversation, until I glanced at her screen and realized she was sexting with an AI chatbot. There was no emotion, no laughter, just a straight face as she interacted with something completely fake. That moment stuck with me. It made me realize how easily AI is slipping into roles meant for human connection.
AI is a powerful tool, and it can be used in many positive ways. But it will never truly understand, care, or feel. It is not a friend; it's code. As we move forward, we need to make sure AI doesn't become something we rely on emotionally, and we must push back against letting it take on roles that belong to humans.
Parents, educators, and students need to have open conversations about the limits of AI. It's crucial to understand what AI can do, but equally important to recognize what it shouldn't do. We must protect our mental health, our relationships, and our ability to tell genuine human connection from a simulated one.
Use AI. Learn with it. But don’t love it.