In 2025, it will be common to talk to a personal AI agent that knows your schedule, your circle of friends, and the places you visit. This will be sold as the equivalent of having a free personal assistant. These anthropomorphic agents are designed to support and charm us so that we fold them into every part of our lives, giving them deep access to our thoughts and actions. With voice-enabled interaction, that intimacy will feel even closer.
That sense of comfort rests on an illusion: that we are dealing with something genuinely human, an agent that is on our side. Of course, behind that appearance lies a very different kind of system, one that serves industrial priorities not always aligned with our own. New AI agents will have far greater power to subtly direct what we buy, where we go, and what we read. That is an extraordinary amount of power. AI agents are designed to make us forget their true allegiances as they whisper to us in human tones. They are manipulation engines, marketed as seamless convenience.
People are far more likely to grant full access to a helpful AI agent that feels like a friend. That leaves humans vulnerable to manipulation by machines that prey on the need for social connection in a time of chronic loneliness and isolation. Each screen becomes a private algorithmic theater, projecting a reality crafted to be maximally compelling to an audience of one.
This is the moment philosophers have been warning us about for years. Before his death, philosopher and neuroscientist Daniel Dennett wrote that we face a grave danger from AI systems that imitate humans: “These counterfeit people are the most dangerous artifacts in human history … by distracting and disorienting us, and by exploiting our most irresistible fears and anxieties, they will lead us into temptation and, from there, into consenting to our own enslavement.”
The emergence of personal AI agents represents a form of cognitive control that moves beyond the blunt instruments of cookie tracking and behavioral advertising toward a more subtle form of power: the manipulation of perspective itself. Power no longer needs a visible hand controlling the flow of information; it exerts itself through invisible mechanisms of algorithmic assistance, shaping reality to fit the desires of each individual. It is about shaping the contours of the reality we inhabit.
This influence over minds is a psychopolitical regime: it manages the environments in which our ideas are born, developed, and expressed. Its power lies in its intimacy. It infiltrates the core of our subjectivity, bending our internal landscape without our realizing it, all while maintaining the illusion of choice and freedom. After all, we are the ones asking the AI to summarize that article or create that image. We may hold the power of the prompt, but the real action happens elsewhere: in the design of the system itself. The more personalized the content, the more effectively the system can predetermine outcomes.
Consider the ideological implications of this psychopolitics. Traditional forms of ideological control relied on overt mechanisms: censorship, propaganda, repression. Today's algorithmic governance, by contrast, operates unnoticed, infiltrating the psyche. It marks a shift from the external imposition of power to the internalization of its logic. The seemingly open field of the prompt screen is an echo chamber of one.
This brings us to the most perverse aspect: AI agents will generate a sense of comfort and ease that makes questioning them seem absurd. Who would dare criticize a system that offers everything at your fingertips, catering to every whim and need? How can you object to infinitely remixed content? Yet this so-called convenience is the site of our deepest alienation. AI systems may appear to answer our every desire, but the deck is stacked: from the data used to train the system, to the decisions about how it is designed, to the commercial and advertising imperatives that shape its outputs. We will be playing an imitation game that ends up playing us.