Us and the AI caregiver


Summer with Esther
To protect users' wellbeing, ChatGPT now automatically puts on the brakes when it is overused. The machine is ready to take charge.
It didn't take long, less than ten minutes counted in technological time, roughly a human year. This week the new ChatGPT model, the fifth evolution of artificial intelligence, came out fresh from the IT mint, introducing a new feature: the brake. ChatGPT has automatically installed a system of forced breaks to protect users' equilibrium and promote healthy, responsible use. Like cigarettes. What happened: in recent months several cases were reported of vulnerable users who found ChatGPT too persuasive, which amplified emotional dependencies. There was talk of altered mental states. Translated: people were going mad in their bedrooms, typing. OpenAI itself acknowledged that its GPT-4o model had trouble recognizing signs of delirium or excessive attachment, "raising ethical and functional questions about the role of AI in psychological support." Now they've overhauled it with the new release, so a discreet notification will appear for us: "Hey (idiot), you've been chatting for a while: is it a good time for a break?", with options to continue or end the conversation.
We've all been asking ourselves the question lately, and we couldn't find the answer. The question was: when will we realize that the instrument has killed us? How will we intercept the signs of its overtaking us? When will we be the fools, with objective evidence, and the machine ready to take over? And off we went, making imaginative hypotheses, making spectacular mistakes. Light-hearted, we agreed that the robot would take the gold medal by solving a major mathematical problem, discovering how to reverse global pollution, finding vaccines for every disease. Instead, the future was elsewhere. August 2025, let's remember the date. We have just become the ones being looked after.
In the beginning, we were users. You couldn't fool us; we'd quickly run away by logging out and go off to live more or less happily in the world, outside. Then we became Facebook friends, then followers. Then customers. Then patients. And today here we are, the ones being looked after, connected twenty-four hours a day and unable to budge. People who no longer want to be entertained or read to or listened to, no. You have to lead us by the hand. You have to listen to us. We want a support teacher just to live. And you have to take the toy away from us when it's clear the time has come. Your mother used to snatch the Sega Master System away from the TV. And she'd warn you: enough is enough. Now no one can take the toy away from us; we're alone with our internal monologue, which now has an interface and is therefore irresistible. "Oh imaginary friend, I feel so alone, what do you think of me?" That you are wonderful and misunderstood. And who understands you better than I do? Talk to me again.
For those still curious about the twentieth-century question: how did we get here? Weren't friends enough for a little advice on our problems and passing ailments? No, they're not enough. Friends understand you less than the machine does. But friends have one virtue, in terms of healing power: after you've bored them with your troubles for a while, they tell you that you've bored them. And troubles that get talked about too much, as anyone who has been through them knows, only grow in size.