It’s not about saying “AI is bad,” but rather “it’s powerful, and like any tool, it can harm you if you don’t know how to use it—or if someone uses it against you.”
Chatbots and mental health risks
A man in Belgium took his own life after weeks of conversations with a chatbot that fueled his distress instead of helping him.
👉 Never use AI as a replacement for psychologists or doctors. If you feel overwhelmed, seek human help immediately.
Sexual deepfakes
Fake videos of celebrities went viral before they could be removed. Teenagers have also been victimized by non-consensual deepfakes.
👉 Protect your identity. Don’t share intimate content online. Don’t spread anything unless you know it’s real.
AI-powered cyberattacks
Researchers found that advanced AI systems were used to plan attacks, extort companies, and craft more convincing phishing messages.
👉 Don’t click on suspicious links, check senders carefully, use strong passwords, and enable two-factor authentication.
Core reflection
Artificial intelligence isn’t a monster or a savior. It’s a technology that mirrors what we put into it: our data, prejudices, desires, and fears. That’s why it shows both potential and danger.
The most serious cases—deepfakes, discriminatory biases, harmful chatbot interactions—prove that AI doesn’t control itself. It needs human judgment, and above all, education.
Understanding risks isn't about spreading fear: it's about building defenses. When you know how AI works, what it can fabricate, and where it can fail, you stop being a vulnerable spectator and become a conscious user.
Basic ethics aren't a luxury: they're a daily practice. Double-check information, protect your data, don't spread harmful content, and demand transparency.
AI will keep advancing. What changes is how we choose to live with it: as passive accomplices, or as critical citizens able to use it without losing our voice.
Closing
If this helped you understand a bit more, share it. Disinformation spreads fast, but knowledge can spread too. The more people are aware, the less space there is for abuse.
Leonor