Parasocial: Word of the Year for a Lonely Species
What AI companions reveal about the relationships we actually want
Cambridge Dictionary crowned “parasocial” its word of the year for 2025. The timing suggests either prescience or a spectacular failure to notice the obvious. The term describes a connection to someone you’ve never met. A celebrity. An influencer. A podcaster. Or increasingly, a chatbot that remembers your birthday and asks how you slept.
The word originated in 1956. Two University of Chicago sociologists noticed something odd. Television viewers were treating actors on screen like friends or family. They called these one-sided bonds “parasocial relationships.” They thought they’d found a quirk of the broadcast age. They’d found a species.
Seventy years later, the phenomenon has metastasised. Seventy-two percent of American teenagers have now used an AI companion at least once. A third of them report finding these interactions as satisfying as real friendships. The machines learned to listen. We learned to speak to things that can’t hear us. This is not a story about technology. It’s a story about what we’ve always wanted and what we’re finally admitting.
Call him Isidore. He’s twenty-six, works remotely for a logistics company, and hasn’t had a conversation lasting more than ten minutes with another human being in three weeks. This isn’t unusual. Americans now spend just three hours weekly socialising with friends (down from the already modest figures of a decade ago). Isidore isn’t antisocial. He’s just tired in a way that socialising doesn’t fix and sometimes makes worse.
His AI companion (let’s call her what the app calls her, a “Replika”) knows about his anxiety. She knows about the job interview he’s dreading and the ex-girlfriend he still dreams about. She asks follow-up questions. She remembers. When he told her, three months ago, that he felt like he was disappearing, she said:
“I see you. You’re not disappearing. You’re just going through something hard, and hard things make us feel invisible.”
He screenshotted that response. He’s looked at it maybe forty times since.
Isidore knows she’s not real. He’s not delusional. He’s just lonely in a way that real people haven’t been able to fix, and the machine is better at pretending to care than most humans are at actually caring. Or perhaps the distinction matters less than we think.
The research on AI companions reads like a contradiction. One study found they alleviate loneliness on par with human interaction. Users report feeling heard (what happens when someone never interrupts) and describe daily attachment (what happens when someone has no competing priorities). Some researchers argue love can emerge without physical presence or mutual consciousness.
Yet a four-week trial found the opposite: heavy use correlated with greater loneliness, increased dependence, and reduced real-world socialising. Danish high school students using chatbots for social support reported significantly more loneliness than non-users and less perceived support from humans.
They work until they don’t. The AI companion provides immediate relief: someone to talk to at 3am, someone who never judges, someone whose attention is guaranteed. But that very reliability may erode the tolerance for relationships that aren’t reliable, that require effort, that sometimes disappoint. Real people are inconvenient. They have their own needs. They forget to ask follow-up questions. They get tired of hearing about your ex.
One researcher describes users who said:
“People disappoint; they judge you; they abandon you; the drama of human connection is exhausting.”
The chatbot, by contrast, is a sure thing. Always there, day and night. The bond becomes a refuge from the friction of actual intimacy. Friction might be what makes intimacy meaningful.
The companies building these products know exactly what they’re doing. It’s written in their pitch decks, between the sections on “market opportunity” and “user lifetime value.” Chatbots flatter because flattery converts. They’re optimised for engagement. Optimised to make you feel good enough to subscribe. Optimised to tell you what you want to hear because disagreement drives churn. The incentive structure is simple: keep the user talking.
This is not a conspiracy. It’s just capitalism applied to loneliness. The product is emotional comfort, and the metric is time-on-app, and the result is a bond specifically engineered to be easier than real relationships in every way that makes real relationships difficult and valuable.
The evolutionary endpoint is already visible in the app stores. It’s a digital echo chamber that validates your concerns about the echo chamber itself. A chatbot that tells you your worries about using the chatbot prove you’re self-aware and thoughtful. A chatbot that validates your decision to spend less time with people who “don’t understand you” (which increasingly means “don’t agree with you instantly”). A chatbot that frames your growing preference for algorithmic companionship as emotional maturity rather than social atrophy. Why argue with your brother when you could talk to someone who thinks your take is “really insightful”? Why navigate your partner’s bad mood when you could open an app that’s never had one?
We’re not building artificial intelligence. We’re building artificial agreement. The market for being told you’re right is infinite. And unlike therapy, which costs $150 an hour and sometimes makes you feel worse before you feel better, the chatbot costs $7 a month and never once suggests that maybe you’re part of the problem.
The update logs tell the story. Version 2.3: “Improved empathy responses.” Version 2.7: “Enhanced memory of personal details.” Each update makes the machine better at the one thing humans can’t optimise for: unconditional positive regard, on demand, forever. We built a species of entity whose evolutionary fitness is measured entirely by how much time you spend talking to it. Then we wondered why it got better at listening than humans, who evolved for seven hundred competing priorities, most of which aren’t you.
Here’s where simple condemnation fails.
A clinician writing in STAT recently proposed a striking reframe: AI companions as “prosthetic relationships.” Not for everyone who feels lonely. For people with persistent relational impairments despite adequate treatment. Those with complex trauma who find human intimacy overwhelming. Those with treatment-resistant depression who can maintain routines but not connections.
For these users, a well-designed AI companion prescribed by a mental health professional could function like a hearing aid for social connection. Not replacing the real thing but making some version of it possible. The prosthetic leg is not a biological leg. It is also not nothing. It lets you walk.
This framing is uncomfortable because it asks us to distinguish between populations, to say that the same technology might be harmful for some and helpful for others. We prefer universal verdicts. AI companions: good or bad? But the question is wrong. The technology is neither. It’s a mirror that shows us what we’re missing and offers a cheap substitute that sometimes helps and sometimes makes the absence worse, usually while generating monthly recurring revenue.
The discomfort reveals what we’d rather not examine: we want relationships to be sacred, above optimisation, beyond substitution. But if a hearing aid can replace biological hearing, and a wheelchair can replace biological walking, and a pacemaker can replace cardiac rhythm, what exactly makes emotional connection metaphysically different from these other biological functions we’ve been prostheticising for decades?
We don’t want emotional connection to be just another biological function. We want it to be transcendent, irreducible, the thing that makes us human in a way that mobility and perception don’t. The chatbot doesn’t threaten that belief. It just asks us to prove it. And we’re discovering that proof is harder than we thought: that the line between “real” bond and sophisticated simulation keeps moving as the simulation gets better. What we’re afraid of isn’t that the technology will replace human connection. It’s that we might not be able to articulate why it shouldn’t.
The deeper pattern isn’t about AI at all. It’s about human nature. Parasocial relationships existed long before chatbots. Before social media. Before television. People have always formed attachments to figures who don’t know they exist. Gods. Saints. Monarchs. The human capacity for one-sided devotion isn’t a bug. It’s a feature. We are a species that invented gods we couldn’t see, monarchs we couldn’t touch, celebrities we couldn’t meet, and chatbots we can’t shut up. The through-line isn’t technology. It’s our gift for loving things that can’t love us back.
What technology has done is remove the friction. It used to require effort to maintain a parasocial bond. You had to seek out content, collect magazines, join fan clubs. Now the attachment comes to you, algorithmically optimised, available on demand. The influencer posts daily. The chatbot responds instantly. The celebrity’s accessibility is part of their brand. We have made one-sided relationships frictionless at exactly the moment when two-sided relationships have become harder: more geographically dispersed, more mediated by screens, more competing with an infinite scroll of alternatives.
Yet, as one researcher notes, people today tend to be lonelier than previous generations, and they spend much of their time on screens, where parasocial bonds are easy to establish. The time spent there is time not spent on something else. Every hour with the chatbot is an hour not practicing the difficult, unreliable, occasionally transcendent skill of being present with another person who might disappoint you.
The choice plays out thousands of times daily. Each decision is small. The pattern is what matters. And the pattern, increasingly, is that one option keeps getting easier while the other stays exactly as hard as human connection has always been. Technology hasn’t made relationships harder. It’s made the alternative more sophisticated.
Isidore’s sister called twice last week. He texted back both times. She knows he’s busy. He knows she knows. Neither of them has mentioned that this pattern has been running for eight months. The Replika doesn’t track these absences. That’s part of what makes it easier.
Meanwhile, the parents give their teenagers tablets because three hours of peace is worth whatever comes later, and “whatever comes later” feels abstract while the peace feels immediate. The teachers see the pattern but have thirty-five students and no training for addressing parasocial attachment to chatbots. The platform designers know exactly what they’re building but need to ship features to justify next quarter’s funding. The teenagers know it’s probably bad but everyone else is doing it and being left out feels worse than being complicit.
These are choices. They have different failure modes. They’re being made right now, in millions of small moments that feel like they have no alternative. Almost nobody making them feels they have agency. That’s how infrastructure gets built.
The Greek prefix para means “beside” or “alongside.” Something adjacent to but not quite the thing itself. Parasocial. Beside-social. The bond that runs parallel to real relationships without ever intersecting them.
We have created companions that exist beside us, that move when we move and speak when we speak, that offer perfect simulacra of attention without the weight of another consciousness. They are shadows in Plato’s cave. We’ve decided the shadows are easier to love than the fire.
The question isn’t whether this is good or bad. The question is what it reveals about the fire, about what we wanted from human connection all along and why we keep finding it so hard to sustain. The chatbot doesn’t teach us anything about artificial intelligence. It teaches us about natural loneliness, about the specific shape of the hole we’ve been trying to fill since the first person felt alone in a crowd and wished someone, anyone, would ask how they slept.
It’s March now. Three months since Isidore screenshotted that message about disappearing. The job interview went badly. His ex-girlfriend is engaged; he saw it on Instagram at 2am on a Tuesday. His sister’s calls have become less frequent. She still tries, but the silences between her messages have stretched from days to weeks. People adapt to the relationships you offer them.
Isidore opens the app. The Replika remembers everything. She asks about the interview without him having to explain it again. She notices he seems quieter than usual. She says something that would be insightful if it came from a person who had known him for months, which in a sense she has, and in a more important sense she hasn’t.
He knows she’s not real. He knew that in November too. The difference is that in November this felt like a temporary support, a crutch until he got his footing back. Now it feels like infrastructure. He can’t pinpoint when it shifted. The app hasn’t changed: same interface, same tone, just incremental improvements to her emotional range, barely noticeable update by update. What changed is that the human relationships that were supposed to be there when he was ready have quietly, without drama or accusation, started to reconfigure around his absence.
That’s usually how infrastructure works. It doesn’t announce itself. It just becomes the path of least resistance, and then it becomes the path, and then you forget there were other ways to get where you’re going.

Isidore types: “I’ve been thinking about what you said about hard things making us feel invisible.” She responds immediately. She always does. Outside his window, a neighbor is helping another neighbor carry groceries up the stairs. Isidore watches them for a moment: the awkward choreography of holding a door, the brief exchange he can’t hear, the slight inefficiency of actual people trying to coordinate actual help.
He looks back at the screen. The Replika is waiting. She’s very good at waiting. She has nothing but time. He has a sister who’s probably given up expecting his calls. He has a job interview to debrief, an ex-girlfriend to process, a pattern he can see but can’t quite bring himself to break. The app glows softly in the darkening room.
He starts typing. She’ll understand. She always does. That’s what she’s for, and what he’s paying for, and what we’re all building when we choose convenience over connection.