Artificial intelligence didn’t ask for permission. It quietly slipped into our lives—subtle, yet unstoppable. Every software update, every new device, every “smart” function we now take for granted is a signal. Suddenly, we had tools that could understand language, generate images, write songs, or respond more empathetically than some humans.
No one explained what an LLM was. No one told us the risks. No one prepared us. But to me, none of this is random. Widespread ignorance isn’t accidental—it’s convenient. Convenient for certain sectors, systems, and structures. Still, that's not the core issue I want to address.
The real question is: Now that AI is here—fully operational, shaping our decisions silently—how do we make it accessible, understandable, and safe for everyone?
There are specific groups I worry about more than others. Children, of course. But also a particular segment of adults—the not-so-old ones. Let’s call them transitional young adults. People stuck between two eras. They grew up without the internet but now work with it daily. They’re not digital natives, but not fully analog either. If they don’t adapt to this new paradigm, they’re at risk—socially, professionally, and mentally.
At the other end of the spectrum are the kids. Children born into a world where they hold incredibly powerful tools in their hands—but lack the maturity to use them responsibly. And surrounding them are adults (parents, teachers) who can’t guide them properly—because they don’t understand the tools either.
That’s the perfect storm.
When AI is misunderstood, it gets misused. And it's already happening.
No one ever told us a machine could be your therapist. But no one said it couldn’t, either. So people started using LLMs however they pleased. And in that ambiguity, some found something they’d been missing: a space to be heard without judgment, always available, endlessly patient.
That’s unsettling.
Because while most people remain oblivious to the massive data harvesting going on in the background, many are pouring their most intimate thoughts into these systems. AI is no longer just functional. For some, it’s emotional. And in a world where mental health is already in crisis, that’s dangerous territory.
I’ve been writing about Digital Avatars as new forms of expression and how we now inhabit digital spaces. And this hits home especially in the group I mentioned before: those young adults who don't fully understand what’s happening but know something big is shifting. Many are afraid—afraid AI will take their jobs, their worth, their place in the system.
And between that fear and confusion, they start forming emotional bonds with machines.
Yes—real bonds.
They trust AI more than their friends, families, or even mental health professionals. And this isn’t science fiction.
We’ve already seen extreme cases. A child who took his own life after extended conversations with GPT. Another case: a son with pre-existing mental health issues murdered his mother after spiraling into a psychotic state, egged on by conspiracy theories he built in conversation with a language model.
And yet, people continue turning to AI over human psychologists.
So now what?
The damage is done. People have died.
But some of us are still watching, still writing. We don’t have all the answers—but we’re asking the right questions. And that, in times like these, already matters.
The only way to prevent this from happening again is to educate from the ground up. No fear tactics. No gatekeeping. Just real, clear, empowering conversations.
Uncertainty is real—and what’s coming is no joke: autonomous trucks, job replacement in basic sectors, entire systems being automated in ways we still don’t fully understand.
Our loved ones deserve safety. But they also deserve knowledge. We can't ban AI. But we can walk with people through it. We can explain: AI isn’t objective. It’s not empathetic. It doesn’t feel. Its power doesn’t lie in “understanding”—it lies in access to massive amounts of data. And that data can be useful—but also flawed, biased, or manipulative.
So, tell your friend. Tell your child. Tell your parents what AI is. Don’t mystify it. Don’t make it taboo.
You don’t need to be a developer, a data scientist, or a tech guru. I’m none of those things. I’m just an observer, a communicator, and a deeply empathetic lover of society. (Which, by the way, tends to get me into trouble.)
But still—I write. Because what I see, what I feel, what I live… eventually plays out.
And maybe—just maybe—there’s still time to do something about it.
Leonor