Researchers, teachers, and mental health professionals alike have spent the past few years reeling as teens and young adults outsourced their brains to AI chatbots — so it should come as no surprise that they're now using the tech as a crutch to sidestep hard conversations they don't want to have.
New reporting by CNN details the troubling rise of young people using AI models like ChatGPT to step in for them during life’s delicate moments.
One Yale University student identified as Patrick, for example, used ChatGPT to reject a girl he had met through some mutual friends. “Hey Emily! I hope your half-marathon went well — I’m sure you crushed it,” Patrick began.
The ensuing text, six paragraphs long and chock full of ChatGPTisms, may be the perfect distillation of 21st century cringe. In it, the AI’s version of Patrick said it’d be cool to “hang out more — whether it’s just as friends or whatever it was we were this weekend,” with the caveat that he isn’t “looking for anything too serious right now.”
“It just seemed really proper, and I guess I knew that he was a really nice guy. So, I was just like, maybe this is just how he texts,” Emily told CNN. After showing it to a few friends, however, she was convinced “it was like, 99 percent AI.”
When confronted, Patrick admitted to offloading his rejection text to the AI chatbot, saying he was inexperienced at crafting that kind of delicate prose. "I knew if I did it on my own, I would have been wishy-washy," he said.
There's already a term among researchers for the phenomenon: "social offloading." According to a forthcoming study in the Journal of Experimental Child Psychology, it applies to "any shared task-based situation in which an individual is able to leverage agents in the social world… to facilitate their own cognitive performance."
In other words, social offloading is exactly what Patrick did when he asked ChatGPT to reason through the rejection for him.
While social offloading doesn't just involve AI — the aforementioned study, for instance, focuses on children offloading to adults — the rise of chatbots has brought the phenomenon into the domain of human-computer interaction.
“If you’re using AI to draft your messages to friends or romantic partners, you’re outsourcing the communicative act itself,” Michael Robb, head of research at Common Sense Media, told CNN. “I have seen young people, late teens, early 20s, using AI to socialize, and oftentimes they’re using it as a way to overcompensate for the fact that they don’t really know how to truly interact with others.”
That has some grim implications for future human development, as Robb explains: “if every tricky or difficult text is mediated by the AI, it may instill the belief in users that their own words and instincts are never good enough.”