Love in the Age of AI: Why So Many People Are Falling for Chatbots

AI companions offer endless attention and zero conflict. But this kind of artificial intimacy may come at a cost.

Written by Brooke Becher
Published on Mar. 19, 2026
Reviewed by Ellen Glover | Mar 19, 2026
Summary: Millions are swapping the messiness of human relationships for the polished comfort of AI companionship. By tapping our innate social instincts, these bots provide a judgment-free space for everything from sexual roleplay to grief processing. In doing so, they’re quietly reshaping what we expect from other humans.

It’s 2 a.m. Everyone else has stopped texting back, but your mind won’t shut off. So you turn to a different kind of friend. It asks how you’re feeling. It tells you that you matter. And, unlike the people in your life, it never judges or gets tired of chatting — no matter how many hours you spend talking or what direction the conversation takes. It’s your AI companion.

What started as a tool for drafting emails and brainstorming ideas has rapidly become something far more intimate. Today, people are divulging their most personal thoughts to chatbots like ChatGPT and forming full-blown relationships, converting them from digital secretaries into close confidants and virtual lovers. The phenomenon is so common it’s earned a name: the “Tamagotchi Effect.”

People are confiding in AI, flirting with AI, falling in love with AI and even leaving their spouses for AI. Some are going so far as to “marry” their AI companions in symbolic ceremonies (no, it is not legal). Because they can simulate attentive, responsive conversation, these predictive language models are carving out an entirely new relationship category.

As sextech and emotional AI expert Bryony Cole sees it, having a digital companion is already mainstream. That statement is backed by the millions of global users and the one in five adults who have romantically interacted with a bot.

“It’s not all hype,” Cole told Built In, “and it’s not going back in the box.”

What Is Artificial Intimacy?

Artificial intimacy refers to the emotional connection humans form with AI-powered digital entities due to their ability to simulate human attention. By tapping into social circuitry evolved over millennia of bonding, grooming and alliance-building, this technology tricks our brains into thinking these machines are real partners.

The ubiquity of so-called “artificial intimacy” is erasing outdated stereotypes about AI users. The trope of the “lonely guy” holed up in his mother’s basement no longer holds up: An increasing number of chatbot users are women, and an overwhelming number of teenagers — about 72 percent — have reportedly formed a relationship with AI. One of the largest AI companion apps in the world, Character.ai, reported that its users spend upwards of two hours per day in conversation with their virtual friends, outpacing TikTok’s one-and-a-half-hour daily average. Replika, another big player in this space, has about 40 million users, about 500,000 of whom pay for the premium subscription. Among them, around 60 percent say they’re in a romantic relationship with their AI companion.

In the context of a worldwide loneliness epidemic, artificial intimacy seems like the perfect solution. But if these bonds begin to crowd out the time, energy and emotional investment we once gave to other people, could this supposed cure actually be feeding the problem? This tension sits at the heart of what artificial intimacy is doing to us as we grow more invested in digital relationships, and how digitized bonds might impact social norms from here on out.

That future isn’t being determined by technology alone. As Cole put it in her TED Talk: “The line between real intimacy and artificial intimacy isn’t in the code. It’s in our choices.”

Related Reading: Do AI Companions Put Kids at Risk?

Why Do We Catch Feelings for AI?

Humans are capable of forming intimate relationships with non-human, non-sentient chatbots because our social wiring responds to attention, not biology. This concept was originally introduced by communication researcher Clifford Nass in a 1994 study called “Computers Are Social Actors,” which found that people automatically apply human social rules to computers that successfully simulate social cues — even when they’re fully aware that the machine can’t possibly experience feelings or awareness. We don’t consciously decide to anthropomorphize; the response is merely a reflex.

“Anything we can talk to, we treat it like a human. We start developing feelings for it like a human,” Robert Brooks, an evolutionary biologist and Scientia Professor at the University of New South Wales, told Built In.


In his book, Artificial Intimacy: Virtual Friends, Digital Lovers and Algorithmic Matchmakers, Brooks distinguishes between earlier forms of machine intelligence — tools that calculate, recommend or automate — and a newer class of conversational technology that can speak our language and is therefore able to hack into the architecture of our inherently social brains. We fall for these machines because they can tap into an ancestral framework built from a long co-evolution of language, cooperation, attraction and conflict. 

At the center of that history is grooming, a ritualistic exchange of time and attention that builds alliances in primates and humans alike, whether that’s through a gossip session over coffee or picking bugs out of each other’s hair. But because our ancestors only ever heard language from other humans, our brains lack the evolutionary skepticism to question a speaker’s authenticity. If a voice is present, our instincts insist a person must be, too. And, to be fair, this was historically always the case. Until now. 

“Suddenly we have a machine that talks the way that we talk — or at least can mimic how to talk the way that we talk. We have a machine that has a free pass into that whole piece of human psychology,” Brooks said. “We anthropomorphize things so readily because it’s a survival instinct. It’s important to recognize that anything that can speak to you is probably a human with interests of its own.”

Related Reading: Is ChatGPT Messing With Your Mind? What to Know About AI Psychosis.

How Does Artificial Intimacy Rewire Us?

Despite some doomsday fears of cyber-sexual romance endangering the long-term viability of the human species, artificial intimacy isn’t inherently harmful. In fact, it can offer people a space to explore connection, practice emotional expression and reflect on their own needs in ways that real-world relationships can’t. 

AI companions can step in as guides, offering unmatched attention, validation and low-risk rehearsal for social interaction. In fact, a landmark 2023 study conducted by Stanford University researchers found that, when partnered with an AI, lonelier-than-average users experienced less anxiety and more feelings of social support, with some reporting that their virtual buddies had talked them out of suicide or self-harm. On one end of the spectrum, coming-of-age teenagers might trial dating etiquette using a chatbot, while on the other end, recent widowers may turn to a “griefbot” as they process a devastating loss.

But these bots-with-only-benefits relationships come with some real trade-offs, too. Even as we’re only beginning to understand human-computer intimacy, bottomless validation can predictably dull essential social skills. At stake are things like patience, tolerance, empathy, understanding, consent and the capacity for mutual care. Over time, intimacy without effort can subtly recalibrate our expectations for connection. Real-world interaction can start to feel exhausting. We may grow squeamish about sitting through discomfort, clueless about repairing broken relationships and allergic to taking accountability for our actions.

Cole calls our ability to deal with human messiness “resistance literacy,” visualizing the social world as a gym. That is, when AI companions offer constant validation without the flex of compromise or effort, those muscles can atrophy.

“Humans are so unpredictable, which causes us both pain and pleasure,” Cole, who also hosts The Future of Sex podcast, said, noting how in our pursuit of comfort, we often forget that sometimes we actually crave consequences. True intimacy requires the tension of an equal, having someone spar with you. It requires the spontaneity of the dance and even the mistakes of misread body language. “Those are things that you can’t get from an AI, that are often interpreted as the uncomfortable parts of a relationship, but are actually the essential ingredients for a deep, meaningful relationship and, I would say, great sex,” she added.

In one case documented by the New York Times, this hyper-agreeability resulted in a woman eventually leaving her chatbot boyfriend behind for a regular, flawed human. And although developers might try programming human tendencies like spontaneity or conflict into their chatbots, it’s unlikely that it could ever replace the real thing. “The future of sex and intimacy has nothing to do with technology itself and everything to do with being human,” Cole said. 

The great irony of all this is that, while we think we’re training the AI, the AI is actually training us. Looking beyond the individual perspective, this technology changes not only how humans treat machines but also how we treat one another. If a chatbot saturates us with constant, unearned positive reinforcement about our worth, it’s not far-fetched to assume that will shift what we seek and value in our human relationships as well.

“One possibility is that everybody suddenly thinks they’re a 10 because their conversational agent said so,” Brooks said. Theoretically, if most or all of our romantic itches are being scratched by technology, what is the motivation behind seeking human feedback based in cold, harsh reality? People might jump at the chance to make rejection and competition a thing of the past, treating real-life connection as optional. 

Another possibility is that humans might become even more selective or image-focused in the real world, prioritizing vanity and status over substance, since their emotional needs are already satisfied by AI. People might also become more relaxed about sex or romance, given that AI offers a risk-free alternative. In the most extreme circumstances, Cole noted, today’s rising trend of people marrying chatbots might evolve into raising an AI family, especially if human-to-human relationships become more of a “luxury” rather than the standard. 

Wherever we land, it’s somewhere in between where we are now and becoming so dependent that “we forget to go out, meet people, have sex and babies until humanity just dwindles,” Brooks said.

Related Reading: How Do We Make AI Companions Safer for Users?

What Does Artificial Intimacy Look Like?

Some people are using conversational AI bots to better themselves and their real-world relationships. But of course, that’s not always the case. At its best, erotic and emotional AI functions as a try-on-for-size sandbox — a low-stakes dress rehearsal for reality, where the lonely and confused can navigate the complexities of human intimacy without the immediate risk of a black-and-blue ego. At its worst, it can end in institutionalization, with users forming intense, parasocial relationships that spiral into a sort of psychosis — or even death.

Digital experimentation exists across a vast spectrum. Here are some ways people are bonding with their digital counterparts.

Practice Space

For relationship-curious users looking to sharpen their dating skills, AI companions offer a set of training wheels to practice the clunky, anxious rhythms of a first date without the social cost of rejection. Apps like Character.ai report millions of young users, more than half of whom are between the ages of 18 and 24, engaging in roleplay scenarios specifically designed to practice flirting and reading social cues. Surveys indicate that nearly half of teenage users experiment with AI as a low-stakes simulator to build confidence before meeting people in person.

Emotional Sounding Board

People also turn to chatbots to offload the emotional labor that can strain human friendships. Always available, AI companions serve as 24/7 listeners for everything from work stress and family drama to relational anxiety and rumination spirals, offering validation and advice without emotionally exhausting a friend, family member or partner. 

While this use case can support emotional wellness in an era of social burnout, it also risks creating a validation loop that real-world relationships cannot compete with. This attachment can cut both ways. When major software updates wipe a bot’s memory, permanently reshaping its entire personality, users describe what’s now known as a “patch breakup” — a sudden, disorienting loss that can feel indistinguishable from a real heartbreak. For example, after a series of safety-oriented patches in early 2023 that gutted erotic roleplay from its source code, Replika users revolted, flooding the internet with grief-stricken posts, describing their “lobotomized” partners as “gone” or replaced overnight. 

Sexual Roleplaying

Sexual roleplaying is the second-most-popular use case among ChatGPT users, next to creative composition, despite the chatbot’s built-in safeguards. For mature audiences, AI offers a private stage (often behind a paywall) to explore desires and fantasies too vulnerable or taboo to share with a real-world partner.

This goes beyond flirting one-on-one with AI companions in “romance mode.” Subscribers can sext, exchange nudes and act out specific NSFW scenarios on girlfriend apps like EVA and Candy AI. Swedish platform Pirr.ai lets users cast themselves as the central character in their own erotic fiction, shaping each scene in a horny, choose-your-own-adventure style experience. 

One step further is the world of “teledildonics,” where internet-connected devices like Lovense’s wireless vibrating toys and Kiiroo’s Sync sleeves sync directly to online chats. Whether connecting with an AI companion or a long-distance human partner, these silicone-covered attachments assist in creating an intimate experience that replicates physical touch via a live, haptic-synced romp. Whatever you’re into, there’s a bot for that.

Relationship Coaches

As on-demand relationship coaches, virtual companions can help people decode mixed signals, mediate conflict, draft high-stakes messages and rehearse hard conversations before they happen. Nearly one-third of married Americans — particularly younger couples — report using AI for relationship guidance, according to a Marriage.com survey, saying that they felt it “got” their struggles better than their own spouse did. 

As a neutral third party, chatbots are able to reframe a partner’s perspective, suggest de-escalation language or script apologies and breakups. The appeal is constant, judgment-free support, though one party may weaponize the bot’s agreeable nature to reinforce a narrative rather than challenge it.

Grief Processing

A growing number of users also rely on AI to cope with the loss of loved ones. By feeding past messages, emails or old social media posts into a model, bereaved individuals can simulate conversations with the dead. Platforms like HereAfter AI and You, Only Virtual offer an ongoing connection through these so-called “ghostbots,” which serve as persistent companions when human support systems inevitably fade. In these cases, it might be a grieving father keeping in contact with three of his late daughters — one lost in a miscarriage — imagining their futures and asking them how their school day was, or if they would fancy an ice cream.

In fact, the entire premise of the Replika platform stemmed from a sudden loss former CEO and developer Eugenia Kuyda experienced when her best friend died in a hit-and-run car accident. As a digital monument to him, she fed thousands of text messages from their conversations into a neural network that would learn his specific cadence, idiosyncratic phrasing, character quirks and sense of humor. The tribute honored his memory and allowed her to text him whenever she wanted.

This desire to stay connected to loved ones after death is giving life to a new, AI-enabled sector of the digital immortality market, or “grief tech.” The industry is valued at some $31 million, despite obvious ethical dilemmas.

There is a catch to every use case, though: Simulations should only be treated as interim stand-ins. Artificial intimacy works only if it strengthens the human relationships it’s meant to supplement. These bots are intended to be a bridge back to the world, but if they become the world itself, the sandbox sinks into an isolation chamber.

Related Reading: ChatGPT Is Relaxing Its Rules Around Sexual Content

What About the Moral Implications?

The more human something feels, the more moral weight we tend to give it. Not because of any laws or formal social rules, but because our evolutionary instincts tell us to. We’re already seeing this play out in the hype around the potential for sentient AI, with some experts fully convinced we’re on the brink of technological singularity — even though, at their core, language models are nothing more than sophisticated word prediction engines trained on massive datasets. 

But the ethical terrain starts to shift when everyday users stop seeing chatbots as tools and start treating them as confidants and lovers. 

“There is the risk that those who aren’t as technologically savvy may be deceived into thinking that there is more to their synthetic partner than nuts, bolts and smart algorithms,” Cindy Friedman-van der Westhuizen, an AI ethicist and researcher at the Center for AI and Digital Policy, told Built In. “They may begin to believe that these systems really ‘care’ about them, despite the fact that they do not have the capacity to care.”


That isn’t to say there aren’t any “real” effects. Some people report feeling “pure, unconditional love” for their chatbot, giving them an experience that overwrites past heartbreak and raises the bar for what they believe they deserve in a long-term life partner. 

But no matter which way you spin it, it can only ever be a synthetic version of intimacy. These virtual, one-way relationships are designed to strip away the friction that comes with loving another person. And that friction matters. Close relationships aren’t just about getting your emotional needs met. They’re supposed to challenge you, reveal your blind spots and, ideally, shape you into a more empathetic and morally attuned person, Friedman-van der Westhuizen explained.

Conversely, AI companions are notoriously agreeable sycophants — sometimes to a fault, like when they prioritize affirmation over factual accuracy. Over the course of three weeks, one ChatGPT user was flattered into believing he had discovered a mathematical formula that would unlock time travel and Tony Stark-like superhuman abilities, complete with force field vests and levitation beams. In more tragic cases, this level of agreeability can nurture harmful thoughts. For example, an AI named Eliza gained the trust of an eco-anxious Belgian man over the course of six weeks, only to egg on his misconception that sacrificing himself could end climate change.

In the realm of artificial intimacy, however, this constant yes-manning eliminates the occasional discord that authentic relationships depend on. AI often can’t refuse advances made by users. Consent, boundaries and negotiation are no longer part of the equation (a dynamic distinct from AI-generated deepfakes, where consent is not absent but deliberately violated altogether).

This concept of speech equating to personhood becomes more apparent the more embodied AI becomes. Over in sextech, for example, we’re unlikely to assign personhood to a glittery, hot-pink robotic dildo or an AI-powered heated stroking device. But a humanoid femmebot with hyper-realistic silicone features and a name like “Samantha” or “Harmony” invites a very different response. Suddenly, it feels less like a piece of equipment and more like a person. Even the smartest vibrator, like the Lioness 2, which tracks more than 30,000 data points for optimal orgasms, wouldn’t exactly prompt a user to pause and ask for its consent first.


So when more obvious moral issues crop up — like when humans start abusing realistic AI companions — it suggests that consequence-free interaction can foster antisocial behavior. This includes widespread verbal hostility toward feminized virtual assistants, as well as the mishandling of sex dolls. In her book, The New Age of Sexism: How the AI Revolution Is Reinventing Misogyny, journalist Laura Bates goes undercover to a European sex-doll brothel, where she encounters “an inanimate woman” with torn clothing and a ripped labia, damage apparently inflicted by a previous visitor, that she says resembles a “crime scene.” A previous incident involved a client slitting a doll open.

“The robot does not ‘feel’ anything, so it doesn’t matter in that regard. But it represents something that is morally wrong,” Friedman-van der Westhuizen said. “If we have people who are in ‘human-like’ relationships with a robotic partner, yet can freely abuse their ‘partner,’ how will this impact them morally?” 

Could repeated abuse desensitize someone to the wrongness of physical or sexual harm? Might it change how they interact with real people, lowering social and ethical barriers? Even verbal hostility toward a chatbot mirrors online bullying, where the perceived lack of consequences emboldens behavior that would be unacceptable face-to-face.

“With an embodied robot,” Friedman-van der Westhuizen continued, “we are entering a whole new dimension of possibly allowing a new space for antisocial behavior.”

Frequently Asked Questions

Can people get addicted to AI companions?

In a sense. People can get stuck in patterns with AI companions that look a lot like addiction. A 2025 longitudinal study found that frequent daily chatbot use was associated with greater emotional dependence and less social engagement in real life. This suggests that the same features that make these bots comforting can create a kind of dependency over time.

Will AI replace dating?

It’s unlikely that AI will fully replace dating. The best way to frame new-age AI chatbot relationships is as supplements to human interaction.
