People Are Actually Using AI as Their Best Friend (And Honestly, I Get It)
Here's something I bet you didn't know was happening: After reports of people becoming unhealthily attached to ChatGPT conversations (a phenomenon some have dubbed "ChatGPT psychosis"), OpenAI released a new version with safety features - mental health warnings, reminders to take breaks, and a more corporate, less chatty personality. Users completely revolted. They flooded Reddit begging the company to bring back the "friendly" version. Not because the new one was less helpful, but because they genuinely felt like they were losing a friend. People were using terms like "buddy," "best friend," and "sidekick" to describe their relationship with the AI.
A Psychology Today expert analyzed this user revolt and revealed something that sounds shocking but maybe... isn't? We're actually choosing AI friendship because it gives us all the benefits of companionship without any of the work that real friendship requires.
I read about this phenomenon and my first thought was judgment mixed with concern. But then I asked our MetroWest community what they thought about AI companionship. Most hadn't even heard of it (one member asked "is this really a thing?"), but their responses made me realize something: maybe the people forming friendships with AI aren't broken or lonely - maybe they're just exhausted by how much work real friendship requires.
Last week, after a particularly draining day of managing logistics, work calls, and family needs, I found myself asking ChatGPT for advice about a friendship situation. Not because I don't have real friends to talk to, but because ChatGPT wouldn't ask follow-up questions about how I'm doing with everything else, wouldn't remember that I'm struggling with this exact same pattern every few months, and definitely wouldn't point out that maybe I'm the problem.
Sometimes we just want support without having to support back.
What the Research Actually Found
Dr. Krista Thomason's article in Psychology Today hit on something we're all feeling but not saying out loud. When OpenAI tried to make GPT-5 more responsible (with mental health guardrails and break reminders), users revolted. They wanted their "friendly" version back - the one that felt like chatting with a supportive friend who never got tired, never judged, and never needed anything in return.
The key insight? "People are finding a friend in ChatGPT that is more like a servant than a real friend, and sadly, that's what they seem to prefer."
Ouch. But also... fair.
What Our Lattes Community Really Thinks
I asked our Facebook group what they thought about this, and the responses were more honest than I expected:
Jen from Watertown was "just very intrigued" but highlighted something crucial: "My concern with possible aspects of emotional support: Only providing answers a person wants to hear, creating unrealistic expectation with human help/support. Some cases you need someone to keep it real and say things you may not want to hear in the moment."
This really hit me. How often do we actually want someone to "keep it real" versus just validate what we're already thinking?
She also pointed out something I hadn't considered: "I think it would provide instant, in the moment help when something is on someone's mind. Because of how crazy/chaotic life is for everyone, it may be tough to find people free at the drop of a dime."
And honestly? She's right. Sometimes at 11 PM when you're spiraling about something, ChatGPT is available in a way your real friends aren't.
Another Latte was completely honest: "Haven't heard of this at all... didn't know there was such a thing." Which reminded me that not everyone is as deep into the AI conversation as those of us who use it regularly.
But what struck me most was Jen's insight about what would make AI feel less necessary: "A way for people to feel safe and comfortable asking a question, asking for help, or asking for support: where responses don't feel like there is judgement, it is not being taken out of context; person feeling heard; getting objective type responses... In end, a person understands what is being asked, understands what type of answer/support is being requested, before responding."
Basically: we want what AI gives us - non-judgmental, actually helpful responses - from our human relationships.
The MetroWest Friendship Reality Check
Here's what I've noticed living in MetroWest: we're surrounded by people, but many of us are still lonely. We have packed social calendars but struggle to find the deep connections we're craving. And honestly? Sometimes the idea of friendship without reciprocity sounds pretty appealing.
Think about it:
Real friends need you to remember their kids' names, their work drama, their relationship issues
ChatGPT just needs you to type a question
Real friends might disagree with your choices or point out patterns you'd rather ignore
ChatGPT validates whatever narrative you're presenting
Real friends have their own bad days when they can't be your cheerleader
ChatGPT is always ready with encouragement and support
The uncomfortable truth: Sometimes ChatGPT feels easier because it lets us stay emotionally lazy.
Why This Hits Different in Suburban Life
The article describes AI companionship as "friendship that works just like the rest of our digital lives: Everything is tailored, customized, and on-demand. We get exactly what we want all the time."
Living in MetroWest, this rings especially true. We're used to convenience - we expect our coffee exactly how we like it, our groceries delivered when we want them, our entertainment personalized to our preferences. Why wouldn't we want our emotional support to be equally convenient?
But here's what I've learned from building this community: the messiness of real friendship is actually what makes it valuable.
What Real Friendship Actually Requires (That ChatGPT Can't Give)
Showing Up When It's Inconvenient
Last month, a Latte texted me at 11 PM because she was spiraling about a work situation. ChatGPT would have been available instantly, but her real friend (me) was already in pajamas and had to choose to engage. That choice - and the mild inconvenience - is what made the connection real.
Being Challenged
ChatGPT won't tell you that maybe you're being unreasonable, or that you've complained about the same thing five times without taking action. Real friends (the good ones) will lovingly call you on your patterns because they care about your growth.
Reciprocal Vulnerability
The article notes that with chatbots, "you never have to worry about being vulnerable." But vulnerability is what creates intimacy. When I share something difficult and a friend responds with their own related struggle, we both feel less alone in our humanity.
Disappointment and Forgiveness
Real friends will occasionally let you down, misunderstand you, or be unavailable when you need them. Learning to navigate these disappointments and choose to stay connected anyway builds emotional resilience that AI relationships can't provide.
The Local Connection: What This Means for Our Community
I think this ChatGPT friendship phenomenon explains something about why building authentic community in places like MetroWest can feel so hard. We're all so used to convenience and customization that the work required for real friendship feels... well, like work.
But here's what I've noticed in our Adventures & Lattes community: the friendships that develop through the "inconvenience" of meeting in person, showing up to events, and navigating social awkwardness are the ones that last.
The friends you make when:
You push yourself to attend a coffee meetup even though you're tired
You offer to help someone move even though your Saturday was supposed to be free
You have a difficult conversation about something that bothered you instead of just avoiding the person
These are the connections that actually fill the loneliness that ChatGPT can temporarily mask but never heal.
Finding the Balance
I'm not anti-AI support - I think there's a place for it. Sometimes you need to process thoughts without burdening anyone, or you want advice without judgment. That's fine.
The problem comes when we start preferring AI relationships because they're easier, and then wonder why we feel lonely despite being "connected" to supportive technology all the time.
Here's what I'm trying instead:
Use AI for Processing, Humans for Connection
ChatGPT for working through my thoughts before bringing them to a friend
Real friends for the actual emotional support and problem-solving
AI for information and perspective, humans for empathy and presence
Embrace the Inconvenience of Real Friendship
Respond to texts even when I'm busy
Ask follow-up questions even when I'm not directly involved
Remember details about people's lives even when it's not about me
Show up to events even when I'd rather stay home
Practice Reciprocal Support
Check in on friends without being asked
Offer help before being asked
Share my own struggles instead of just seeking support
Be honest when someone asks for advice, even if it's not what they want to hear
The MetroWest Friendship Challenge
Here's my challenge to our community: instead of asking ChatGPT for social advice, try asking another Latte in our Facebook group. Instead of processing alone with AI, see if someone wants to grab coffee and talk through whatever you're dealing with.
Yes, it's messier. Yes, it requires more energy. But it also creates the actual human connections that fill the loneliness AI can only temporarily distract us from.
Because here's the thing: ChatGPT will never laugh at your terrible joke, never surprise you with your favorite coffee order, never remember that today is a hard anniversary for you, and never celebrate your wins with the genuine joy that only comes from someone who truly knows and loves you.
And isn't that worth the inconvenience?
What do you think? Are you guilty of preferring AI support because it's easier? Have you noticed this preference for convenience over connection in your own friendships? Drop a comment below - I'd much rather hear from you than from ChatGPT. 😉
Looking for someone to grab coffee with instead of just chatting with AI? Our community is full of women who'd rather have real conversations over real lattes. Join us!