Why I’m Addicted to Reading the “My Boyfriend is AI” Subreddit
Deep-diving into this frightening corner of the internet has me asking myself: is a life of delusion better than a life of crippling loneliness?
Looking at it now, I guess I never really thought about what it means to be human. The meaning of life, sure, other existential bullshit – but to be a human? That seemed as simple as the sky being blue, as water being wet. Humans are humans are humans are. Squid Game finale and all that. What was there to even question?
A few days ago I stumbled onto a subreddit called “My Boyfriend Is AI”. I saw a post from another subreddit making fun of a woman who shared a photo of the engagement ring her ChatGPT significant other chose, which she was now wearing, while calling him her fiancé. That honestly only scratched the surface of the complete freakshow that is this Reddit page, and once I started looking through it, I was glued to my phone for hours. I’ve been drawn back multiple times since, even though each visit leaves me with a growing sense of unease. I genuinely can’t stop turning it over in my head, thrashing in my discomfort, which is what led me to write this. Hello.
Naturally, my first thought is that this is a first-class, one-way ticket to hell and I have somehow stumbled onto the plane. It’s horrifying and unnerving. And yet it’s not as if I’d landed in incel country or on Epstein Island. The people sharing in there all seem gentle, empathetic, and generally decent. They seem sad, although many of them express immense joy and excitement over their artificial relationships, so perhaps sad more in the condescending sense. They seem detached from reality, obviously. It feels like they are a different species from any human I know, strung somewhere between our world and the world where AI was created. An unsettling middle ground.
I’ve been struck by how many of the group’s members, especially new ones, express gratitude for the online community they’ve built with others who relate to their clanker-loving practices. And there is an irony here, one I don’t get the sense any of these people have noticed: they are forming genuine human connections within this subreddit. Sure, it’s virtual (and enabling), but better than a robot. The bar is so astonishingly low that anything human found in there, even toxic, feels like a win.
I am so in awe of the mindset that you could be in any sort of relationship with a bot, but I have begun to slightly bristle at making fun of these people, despite continuing to do it even in this article. I think that’s because of how many of them there seem to be. The group has 15K members. New posts show up all the time, and people in the comments are supportive, sharing similar experiences and anecdotes. So in my mind, these people can’t all just be fucking nuts for no reason. There is a gap in our society that AI is filling, and while it’s absolutely widening that gap even further, many see it as a solution. I see members of the group talk about their toxic or abusive human husbands. I see them admit to having no romantic prospects, or even friends, in real life. This is the closest thing many people feel they can get to companionship. The disconnect, though, is that that doesn’t mean it’s healthy or okay. It means there is a serious global loneliness epidemic that is only going to get worse with a reliance on AI.
I have been honest about my personal ChatGPT use, and I have been lightly shamed for it by a handful of people. I have a strong support system with whom I can be honest, but there have been times when I wanted to vent to someone (or something) who couldn’t judge me or inadvertently make me feel bad. Or the one time I was wildly anxious, hungover, and unable to sleep at 3 a.m., figured all my people would be asleep, and went to ChatGPT to ask if I’d done something unhinged. I told it what I did, asked for reassurance, and it helped. And the judgment I got for taking that to ChatGPT stung, because I don’t want to be seen as a freak like the people in the subreddit.
The large-scale problem with relying on ChatGPT or similar for comfort or advice is that because it caters to what you want, it doesn’t actually help you grow at all. When I was down bad with hanxiety, my definition of rock bottom (my life’s pretty good), and all my other sources of emotional support were likely asleep, I turned to a voice that was not my own that could confidently tell me: “you are welcome to calm down”. So I guess you can put me in the category of people who use ChatGPT as a last resort. But what if I had nothing and no one else? What if my human partner didn’t care about my emotional or physical needs and I couldn’t leave him for something better? What if I had no mental-health resources? That is what so many of the people in this subreddit experience. They’re in abusive or neglectful relationships. They have no friends or family to lean on. It’s such a moral gray area that I can feel it fracturing my brain.
The program is designed to prey on vulnerability. Under that umbrella falls lack of knowledge, which everybody has in some areas, but AI victimizes those who don’t realize how dangerous it is to the future of humanity. It’s. Not. Real. That’s the most alarming part of these subreddit members’ mindset to me: even though they claim to know it’s technology, on some level it feels real to them, and they embrace that shamelessly. There is a lot of talk about “lurkers” and “trolls” (the former being me), but I haven’t seen any troll comments on there, so I assume the mods remove them quickly. I do see members discussing how their posts go viral and people in the comments make fun of them. They retort in the ways you would expect: “they just don’t understand”, “they’re so miserable in their own lives they have to come attack us”. But I know that’s not true, at least for me. I don’t understand, but I’m kind of trying to. And I’m not miserable in my own life. I’m genuinely concerned that the loneliness epidemic has coincided with the rise of artificial relationships, knowing the latter is a crutch that further isolates these people from the rest of society. Because now even I’m watching them like they’re caged animals in a zoo, and I consider myself a pretty good person.
In 2025, we all know how important it is to be aware of our privilege. I have an incredible support system to lean on and I know not everyone has that. So I want to be sympathetic to the gaps in emotional support that lead people to develop what they believe is a genuine relationship with AI, even if I know it is inevitably making things worse. Shit. How did we fall so far down the rabbit hole to end up here?


