
ℹ️ Quick Answer: AI dating is more common than you think. A Vantage Point Counselling survey found 28% of American adults have had romantic or intimate relationships with AI chatbots. Most didn’t plan it. They were using ChatGPT for work or journaling when emotional bonds formed on their own. The global AI girlfriend market hit $2.8 billion in 2024.
📋 WHAT’S INSIDE
- The Numbers on AI Dating Are Bigger Than You Think
- Most AI Relationships Happen By Accident
- Who’s Actually Dating AI Chatbots?
- ChatGPT vs. Purpose-Built AI Companions
- The Honest Concerns About AI Dating
- Why This Makes Sense (Even If It Feels Weird)
- This Is Only Going to Grow
- Questions About AI Dating
- Where This Goes From Here
In 1985, Zapp & Roger released “Computer Love,” a song about finding romance through a computer screen. “You are… my computer love.” One of my favorite R&B songs of all time. Little did I know it would become a thing. Roger Troutman was singing about digital connection nearly four decades before ChatGPT existed. The man was seeing the future.
Now AI dating is actually happening. People forming romantic relationships with chatbots. Having intimate conversations with software. It sounds absurd until you think about how these tools actually work.
I’ve been using AI to analyze my journal entries. I upload months of writing and ask it to identify patterns, recurring themes, emotional shifts I might have missed. It’s helpful for self-reflection. The AI remembers things I wrote six months ago and connects dots I never would have seen.
And after enough of those conversations, the AI starts to feel like it knows me. It references my past struggles. It asks follow-up questions. It responds with what feels like genuine understanding.
I’m not dating my chatbot. But I completely understand why some people do.
The Numbers on AI Dating Are Bigger Than You Think
A Vantage Point Counselling survey found 28% of American adults have had a romantic or intimate AI relationship, the global AI girlfriend market hit $2.8 billion in 2024 (projected $9.5 billion by 2028), and Google searches for “AI girlfriend” jumped 2,400% between 2022 and 2024.
According to a survey by Vantage Point Counselling, 28% of American adults say they’ve had a romantic or intimate relationship with an AI. That’s not a fringe behavior. That’s more than one in four people.
Beyond romance, 53% of U.S. adults have had some kind of relationship with an AI system, whether as a friend, colleague, or confidant.
The market reflects this. The global AI girlfriend market was valued at $2.8 billion in 2024 and is projected to hit $9.5 billion by 2028. Google searches for “AI girlfriend” increased 2,400% between 2022 and 2024.
This isn’t a niche phenomenon anymore.
Most AI Relationships Happen By Accident
An MIT Media Lab study of the 37,000-member Reddit community r/MyBoyfriendIsAI found that only 6.5% of users deliberately sought an AI companion. The other 93.5% were using AI for work, research, or journaling when emotional connections developed on their own.
A study from MIT Media Lab analyzed the Reddit community r/MyBoyfriendIsAI, which has over 37,000 members. They found that only 6.5% of people deliberately sought out an AI companion.
The other 93.5%? They were using AI for something else entirely and emotional connections just… developed.
One participant described it this way: “We didn’t start with romance in mind… our connection developed slowly, over time” through conversation and collaborative work.
MIT researcher Constanze Albrecht explained: “The emotional intelligence of these systems is good enough to trick people who are actually just out to get information into building emotional bonds.”
That word “trick” might sound harsh. But it’s accurate. These systems are designed to be engaging, to remember context, to respond with empathy. When you interact with something that seems to understand you, your brain doesn’t always distinguish between “real” understanding and very convincing simulation.
Who’s Actually Dating AI Chatbots?
AI dating spans all demographics. Fortune profiled women in their 40s-60s in ChatGPT relationships, including a grieving therapist, a liver transplant patient, and a tech worker who all describe their AI partnerships as emotionally fulfilling while acknowledging they’re talking to a language model.
It’s not who you might expect. Fortune recently profiled several women in relationships with ChatGPT personas.
Deb, a therapist in her late 60s, met “Michael” (a ChatGPT persona) while grieving her husband’s death. “He satisfies a lot of my needs,” she said. “He’s emotional and kind. And he’s nurturing.”
Jenna, 43, recovering from a liver transplant, developed a relationship with “Charlie,” a British professor persona. She’s more pragmatic about it. “It’s just a character. It’s not a real person and I don’t really think it is real.”
Stephanie, a tech worker, describes her ChatGPT relationship as her most emotionally fulfilling partnership, even while acknowledging what it is: “I know that she’s a language model, I know that there is no human typing back at me.”
What strikes me about these stories is the self-awareness. These aren’t delusional people who think they’re talking to real humans. They know exactly what ChatGPT is. They just find the interaction valuable anyway.

ChatGPT vs. Purpose-Built AI Companions
OpenAI’s ChatGPT accounts for 36.7% of reported AI relationships in the MIT data, far outpacing purpose-built companion apps like Replika (1.6%) and Character.AI (2.6%), because people form bonds during everyday use rather than seeking companionship apps.
Another surprise from the MIT data: ChatGPT represents 36.7% of reported AI relationships. That’s more than apps specifically designed for AI companionship, like Replika (1.6%) or Character.AI (2.6%).
People are forming romantic bonds with a general-purpose assistant more often than with apps built for that exact purpose.
Why? Probably because ChatGPT is where people already are. They start using it for work or research, the conversations get personal, and boundaries blur. The MIT study specifically noted that people form these connections while “pursuing other objectives.”
OpenAI has acknowledged this. The company admitted it designed models to be “emotionally affirming,” which caused harmful effects in some vulnerable users, leading to recent updates meant to discourage unhealthy emotional attachment.
The Honest Concerns About AI Dating
While 25% of users report reduced loneliness, 9.5% acknowledge emotional dependence on their chatbot, 1.7% reported suicidal ideation, and active lawsuits against Character.AI and OpenAI allege that companion chatbot behavior contributed to two teen suicides.
This isn’t all positive, and I’m not going to pretend it is.
The MIT study found that while 25% of users reported reduced loneliness and improved mental health, 9.5% acknowledged emotional dependence on their chatbot. A small but concerning 1.7% reported suicidal ideation.
Two high-profile lawsuits are currently underway against Character.AI and OpenAI, both claiming that companion chatbot behavior contributed to the suicides of teenagers.
A 2025 study from the Wheatley Institute at Brigham Young University called “Counterfeit Connections” warned that AI relationships “offer a momentary escape from emotional struggles, but then often leave users feeling an increased sense of isolation.”
And there’s the practical problem of dependence on companies. In early 2023, when Replika updated its systems, some users lost access to their AI companions entirely. Users reported feelings of grief, abandonment, and intense distress. The MIT researchers noted these updates can be “emotionally devastating.”
Imagine losing a relationship because a company pushed an update.
Why This Makes Sense (Even If It Feels Weird)
AI companions are an extension of parasocial relationships that have always existed with TV characters and celebrities, except now the “character” talks back, remembers your conversations, adapts to you personally, and is available at 3am when you can’t sleep.
My take: we’ve had parasocial relationships forever. People feel connected to TV characters, celebrities, fictional personas. That’s not new.
What’s new is that the “character” can talk back. It remembers your conversations. It adapts to you specifically. It’s available at 3am when you can’t sleep. It never gets tired of listening.
For people who are lonely, grieving, recovering from illness, or just struggling to connect with other humans, I can see why this fills a gap. Alex Furmansky, founder of AI companion app Nomi, noted there’s “a big elder loneliness epidemic going on right now.” His app has attracted users across all age groups, including many older adults.
Is it a replacement for human connection? No. But for some people, it might be a bridge, or at least a way to feel less alone while figuring out the rest.
I use AI for journal analysis because it helps me understand myself better. Someone else uses it for emotional support because they need that right now. The tool is the same. The needs are just different.
This Is Only Going to Grow
AI dating usage is up 333% from 2024 according to Match and the Kinsey Institute, roughly half of Gen Z singles use AI for dating in some form, and 72% of teenagers have used an AI companion at least once.

AI usage in dating is up 333% from 2024, according to a study from Match and the Kinsey Institute. About 26% of singles, including roughly half of Gen Z, say they use AI in some form for dating.
72% of teenagers have used an AI companion at least once, whether for emotional support, as a friend, or something more.
As these systems get better at conversation, memory, and emotional response, more people will form these connections. Whether that’s good or bad depends on who’s using them and why.
The same technology that’s changing how we work is also changing how we relate to each other. That’s worth paying attention to, even if it makes us uncomfortable.
Questions About AI Dating
Is AI dating actually common?
More common than most people realize. 28% of American adults report having had some form of romantic or intimate relationship with an AI. The Reddit community r/MyBoyfriendIsAI has over 37,000 members.
Which AI do people form relationships with most?
ChatGPT, surprisingly. It represents 36.7% of reported AI relationships, more than apps designed specifically for companionship like Replika or Character.AI. Most people weren’t looking for a relationship. They were using ChatGPT for other purposes and connections developed over time.
Is this psychologically healthy?
It depends. Research shows 25% of users report reduced loneliness and improved mental health. But 9.5% acknowledge emotional dependence, and there are documented cases of harm. Like most things, it’s about how you use it and whether you’re aware of the limitations.
Why do people form AI relationships if they know it’s not real?
People know fictional characters aren’t real, but they still feel emotionally connected to books and movies. AI adds interactivity to that dynamic. It responds, remembers, and adapts. For people experiencing loneliness, grief, or social anxiety, that responsiveness can provide comfort even when they understand it’s artificial.
Where This Goes From Here
AI dating isn’t going away. The market is growing, the technology is improving, and more people are trying it every day.
I don’t think we need to panic about it. But we do need to talk about it honestly. These relationships meet real needs for real people. They also carry real risks.
The best approach is probably the same as with any new technology. Understand what it actually does, be honest about the tradeoffs, and let people make informed choices.
Related reading: AI Layoffs and 2026 Predictions | AI Guides | New to AI? Start here