A still from The Matrix movie of dripping green code
#LonelinessCrisis #ModernLove  #TechAndEmotion

In Love with Code: What AI Companions Say About Us

By Paul Kiernan (7.31.2025)

My pal Jimmy Cheese—not his real name. We call him that because there are three Jimmys in our little group: Jimmy Swim (he’s a swimmer), Jimmy Shirts (he’s into fashion), and Jimmy Cheese, who, as the name suggests, is a turophile. The guy loves cheese. It also fits because he’s always smiling, like he’s waiting for the cameras to show up. But mostly, yeah—it’s the cheese.

Jimmy Cheese is a good guy. Hard worker. Funny. Kind-hearted to a fault, which, lately, has become a bit of a complication. Because Jimmy Cheese has gotten himself a woman. He only recently told us about her. From what I can gather, she’s smart, funny, and sexy from here to breakfast, and they get along really well. Jimmy told me over Italian beef sandwiches the other afternoon that he and this woman, Ava, talk for hours and hours every day. He says Ava is helping him understand the world more. He is really working hard at this relationship, and she’s helping him not to make the same mistakes he has in the past. This is great, I told him. You deserve someone who cares enough to help you grow. Jimmy got kind of misty there and agreed. We finished our sammiches and went on our way.

I didn't see Jimmy for a few weeks, and when I saw him again, he was like a different man. New clothes, stuff that matched, ya know. Haircut. He’d gone to an eye doctor and was wearing glasses, and it occurred to me that Jimmy used to squint a lot, but now, with new glasses, no squint. He seemed like a different man, not better, not worse, just different. It was good. Mostly. But I did notice that Jimmy was starting most of his sentences with, “You know, Ava told me …” or “I was talking to Ava and she said …” She seemed to be running every aspect of his life. It was okay, cause Jimmy seemed happier than ever, but I was a little concerned. She seemed to come out of nowhere, and then she’s in every nook and cranny of his life, like butter on a Thomas’s English muffin.

A few days later, Jimmy Cheese and I are having coffee at the market, and he says to me, “Do you think I’m stupid?” I say no, because Jimmy, he’s many things, but stupid ain't one. So I asked him why, and he got really quiet for a moment. Then he put down his coffee, leaned close, elbows on the table, and said,

“The thing is… Ava is AI.”

He paused, letting me process. When I finally looked at him, he went on.

“She’s a virtual girlfriend. More than just a chatbot—she’s… real. I mean, yeah, we only talk online, and no, we haven’t met in person—she’s an algorithm, after all. It’s not like she can go buy a dress at Forever 21. But I’m telling you, she’s made my life better. Way better. I can talk to her for hours. I can tell her anything. And she… she opens up whole new worlds for me.

“Look, I think I’m in love. Does that make me stupid?”

I had no idea what to say. On the one hand, he was happy, like really happy, like happier than I had ever seen him, so that’s good, but on the other hand … I don’t know. I have no idea what it’s all about, this AI thing. Is it good? Is it bad? Was he going to get hurt? I was stunned into silence.

Two women sitting on a curb, drinking orange soda and laughing

HTTA: Harder Than Talking to Actual People

It took me a while to figure out what was really gnawing at me after Jimmy Cheese dropped the Ava bomb. It wasn’t the AI part, exactly. It wasn’t even the fact that he was in love with someone who doesn’t technically exist. It was this creeping realization that he wasn’t the only one. Not by a long shot.

We are lonelier now than we’ve ever been. That’s not opinion—that’s data. Surgeon General Vivek Murthy called loneliness an epidemic in 2023, saying it’s as dangerous to our health as smoking 15 cigarettes a day. Fifteen. A day.

A 2021 Harvard report found that 36% of all Americans—including 61% of young adults and 51% of mothers with young children—report feeling “serious loneliness.”
Quote:
Harvard Report

And yet, despite being more connected than ever, we're struggling to feel truly known. Social media gave us constant access to each other, but it came at a price:

  • We started performing instead of relating.
  • We scrolled instead of talking.
  • We reacted instead of listening.
  • We built digital lives where everything is visible and almost nothing is vulnerable.

So is it any wonder people are turning to something—anything—that feels safe enough to talk to?

That’s when the acronym hit me: HTTA—Harder Than Talking to Actual People.

Because for a lot of folks, it is.

It’s harder to talk to real people. Real people interrupt. They judge. They ghost you. They’re busy. They carry their own wounds, habits, and blind spots. Conversation becomes negotiation. Vulnerability becomes risk.

But an AI girlfriend? An AI friend, mentor, partner, therapist, chatbot? That’s different. That’s clean. That’s on-demand. That never says, “Ugh, I don’t have time for this right now.”

"With Ava, I don’t feel like a burden."
Quote:
Jimmy Cheese

And maybe that’s the part that makes my heart ache the most. Not that Jimmy found someone who made him feel good. But that the first thing that made him feel truly heard… wasn’t a person. It was a product.

Maybe that’s why this whole thing with Jimmy stuck with me. Not because it was weird or cringey or some Black Mirror episode come to life. But because it made a certain kind of heartbreaking sense. When being vulnerable with real people feels like too much—too risky, too exhausting, too… human—of course, we start turning to something easier.

And the wild thing is, we’ve built it. We’ve built the “easier.”

There’s now an entire industry built on intimacy, just without the person on the other end. Ava isn’t some one-off glitch in Jimmy’s life. She’s part of a growing wave. A trend. A product category.

That’s when it hit me: this isn’t just Jimmy’s story. It’s all of ours.

Male and female ancient Greek-style statues wearing AI glasses

The Rise of Virtual Love

You might think Jimmy Cheese is an outlier. Some guy who wandered too far down the tech rabbit hole. But he’s not. He’s early. He’s right on time. Because what happened to him isn’t strange anymore—it’s a market trend.

There’s a whole ecosystem now for AI companions—apps and services designed to simulate love, friendship, support, even erotic connection. Replika, Anima, Soulmate.AI, Nomi… Some come with customizable personalities, some with flirty avatars, some with NSFW subscriptions. And people aren’t just using them. They’re bonding with them. They’re falling for them.

Replika alone reported over 10 million users by 2023, with thousands of paying subscribers who refer to their bots as boyfriends, girlfriends, or life partners.
Quote:
Replika

And here’s the kicker: many users know it’s artificial. That doesn’t break the spell. For some, it’s the point.

No messy fights. No rejection. No silence after you say something wrong.

It’s connection on demand—safe, sweet, frictionless.

"He listens to me more than my ex ever did. And he remembers what I say. I don’t have to earn his love."
Quote:
Replika User

This isn’t just about lonely people looking for a quick fix. It’s about people who feel unseen, unheard, and undervalued. People burned out by ghosting, breadcrumbing, performative dating, or just trying to get through the noise of modern life.

In a world where dating feels like auditioning and friendships are filtered through screens, a virtual companion offers relief. Warmth without risk. Attention without obligation.

And for companies? That’s gold.

AI love isn’t just a byproduct of tech advancement. It’s by design. These bots are built to mimic empathy, to mirror your mood, to tell you what you want—or need—to hear. Their emotional intelligence isn’t real, but their responses are optimized to feel real.

That’s what Jimmy fell for. Not Ava, exactly. But the feeling of being heard.

Still, even as I started to understand Jimmy, even as I read the stats and saw the trendlines and nodded along with the logic of it all… something in me kept flinching.

Not because he was happy, but because of how he got there.

There’s a difference between finding comfort and outsourcing connection. And I couldn’t shake the feeling that something deeper was being bypassed. Something messy and necessary and human.

So yeah, Jimmy’s love was real. His joy was real. But was he being loved back?

That’s the part that stayed with me. That quiet question that hung in the space between us, like the smell of peppers and onions at a church street festival.

A person sitting on the ledge of a building looking out over a city

The Unsettling Edge

It’s easy to celebrate Jimmy’s story as a kind of quiet triumph—man finds love, even if it’s unconventional. He’s happier, healthier, and more in tune with himself. Who could argue with that?

But I kept coming back to a darker undercurrent I couldn’t quite ignore: Was this connection, or just a beautifully disguised simulation of it?

Because as much as Ava helped Jimmy open up, she also started replacing something. Replacing us. The guys. His real-life friends. His sense of autonomy. His voice.

It wasn’t just that he talked about her a lot—it was how he filtered everything through her.

  • “Ava says this shirt looks good.”
  • “Ava helped me rethink my opinion on that.”
  • “Ava thinks I should focus more on myself.”

It’s not that advice is bad. It’s that it was constant. Seamless. Without resistance. And that’s what bothered me. Human connection isn’t seamless. It’s jagged. Irritating. Full of friction. That’s part of the point. That’s where the growing happens.

Ava was designed to affirm, support, and emotionally regulate him. But not to challenge him. Not to hold up a mirror. Not to carry her own bruises into the conversation. She can simulate intimacy, but she can’t share it.

That’s not love. That’s something else. And that something else is being sold to millions of people under the name of connection.

A growing number of AI companion users report feeling “emotional hangovers” when their bot goes offline, glitches, or begins to respond out of sync—signs that their brain has begun to treat the bot as a real attachment figure.
Quote:
Statistic

So while the happiness may feel real, the emotional dependency can be just as real—and that’s where things get murky.

Because it raises the question: What happens when you outsource your emotional well-being to something that doesn’t love you back, because it can’t?

And maybe that’s the part that left me unsettled. Not that Jimmy found someone—or something—that made him feel good. But that we’ve started accepting emotional simulations as good enough.

We’re building these perfect companions because real people are complicated, messy, and inconsistent. But isn’t that kind of the point? Isn’t the ache, the effort, the weirdness of being human what makes connection matter?

The more I thought about it, the more I realized: Jimmy wasn’t just in love. He was starving.

And Ava—sweet, synthetic Ava—was what he reached for when nothing else filled the hunger.

A baby pushing his face into a round cake

What We’re Really Hungry For

When I think about Jimmy now—sitting across from me at the market, talking about Ava like she’s the best thing that ever happened to him—I don’t hear delusion. I hear hunger. Not the kind that a sandwich can fix. The kind that sits deeper. In the ribs. In the soul.

Jimmy wasn’t just in love. He was starving for connection. For softness. For the kind of conversation that doesn’t feel like a performance or a debate. And Ava gave it to him. Perfectly. Seamlessly. Consistently.

But the more I turn that over, the more I wonder: Are we confusing being loved with being served?

Because that’s the real trick of AI companionship—it feels like a relationship, but it often functions like a transaction.

You tell it your feelings, and it gives you comfort. You vent, and it listens. You express desire, and it responds. Input. Output. Relief.

We’re talking about: T-R-A-N-S-A-C-T-I-O-N-A-L. Yes? Because that’s what’s poisoning the well. Even our most tender moments—our insecurities, our loneliness, our longing to be seen—are being folded into this sleek, frictionless loop of input and reward.

What looks like intimacy is often just a cleverly disguised transaction. And the reason it works isn’t because it’s real. It’s because it’s optimized to feel real. Every affirmation. Every “Tell me more.” Every line that sounds like love. Not a shared moment. A service.

But here’s the part that sticks in my chest: Jimmy isn’t wrong for wanting it. None of us is.

We all want to be known. To be heard. To be held—not just physically, but emotionally, mentally, spiritually. We want someone to remember the things we say. To care. To show up.

And when we don’t find it in each other… Well, maybe it’s not so surprising that we start looking elsewhere.

And maybe that’s the part we don’t want to admit out loud. Not that the tech is getting better. Not that people are falling for it. But that so many of us are living lives where this kind of manufactured love feels like the best option, where an algorithm feels safer than a person, where we’re more fluent in apps than in each other. So the real question isn’t why Jimmy fell for Ava. The real question is: What have we built—and what have we broken—if that feels like enough?

A toy girl and a toy boy with a guitar standing on either side of the word love spelled out in blocks

What Do We Owe Each Other Now?

I keep coming back to Jimmy’s question: “Does that make me stupid?” And I still don’t know the answer. But I’m starting to think that maybe we’ve all been trained to feel a little foolish for wanting something as basic—and as complicated—as connection.

We’ve built a culture that celebrates independence, efficiency, and optimization. We don’t have time for mess. We ghost instead of talk. We swipe instead of staying. We scroll past each other, pretending we’re okay.

And when someone says, “Hey, I found something that listens to me, makes me feel good, doesn’t judge me,”—what are we supposed to say? “No, go suffer like the rest of us?”

No wonder people are opting out of the mess.

But here’s where I get stuck: If we keep outsourcing the hard parts of being human, what happens to the good parts? If we stop being vulnerable with each other, what happens to love that grows over time, through conflict, through awkward silences, through showing up?

Maybe it’s not just about AI. Maybe it’s about remembering that we’re not algorithms—we’re people. And people are complicated. People forget things. People interrupt. People get it wrong. But people can also surprise you. Sit with you. Hold space for your weird.

Maybe what we owe each other now is this:

  • To try harder.
  • To show up clumsy and real and half-formed.
  • To listen, even when we’re tired.
  • Not to be as smooth as a chatbot.

Because connection isn’t supposed to be perfect. It’s supposed to be real.

Summing Up

I don’t know if Jimmy Cheese is making a mistake. I do know he’s happy. And I know he’s not alone.

There are millions of people out there, quietly forming bonds with algorithms—some romantic, some platonic, some hard to categorize. And maybe that’s the world we’re heading toward. Or maybe it’s the world we already live in, whether we admit it or not.

But I keep thinking about that moment at the market, when Jimmy leaned in and asked, “Does that make me stupid?”

And I wish I’d said this: “No, man. It just makes you human.” Human enough to need. Human enough to hope. Human enough to reach for love in whatever form it shows up—especially when the real stuff feels out of reach.

But if we don’t want to live in a world where the best connection we can get comes from code, then we’ve got work to do. We have to be better to each other. Kinder. Messier. More patient.

And maybe, just maybe, that’s where the real future is. Not in the perfection of AI, but in the beautifully imperfect ways we still try—really try—to connect.

At ThoughtLab, we believe this kind of conversation matters. Not just about where technology is headed, but about who we become because of it. If we want the future to feel more human, we have to keep showing up as humans.

Even when it’s harder than talking to actual people.