I’ve long been worried about how the rapid advance of AI could push aside those with the fewest resources. It’s not just about lacking devices or internet access—it’s about the way the technological narrative crashes down on people: as a threat, as something that might replace them, or as a reminder of what they don’t understand. In societies marked by poverty and limited education, that sense of displacement can be devastating for bodies already exhausted and minds already weighed down by fear.
For me, this isn’t a binary choice between pushing AI use and blocking it. The real key is to prioritize understanding over consumption: to help people see that you don’t need to be technical to have judgment, and that AI is a tool, not an authority. I’ve seen what happens when that balance breaks down: dissociation, psychosis, even suicides. People end up trusting the machine’s voice more than their own reasoning.
I believe the path forward is to build spaces for basic digital literacy, and also for critical thinking about AI. To strengthen narratives where creativity, intuition, and empathy remain at the center. And above all, to sustain community networks that support people, so learning doesn’t happen in isolation in front of a screen.
In the end, this is not about blindly pushing technology onto everyone—it’s about planting perspective. About making sure no one believes their human value is worth less than a system, and about showing that if we learn to put AI in its proper place, what we gain are possibilities, not losses.
Leonor