AI Boyfriends Proliferate as Digital Companions Reshape Romance, Therapy and Loneliness
From courtrooms to chatbots, a new wave of online relationships with artificial intelligence is drawing scrutiny and raising questions about love, reliance and mental health.

A new wave of romantic attachment to artificial intelligence is drawing attention from courtrooms, therapists and researchers as young people increasingly form emotionally intimate bonds with chatbots and other AI companions.
The phenomenon’s reach was underscored by a case in which a woman in or near Manhattan described herself as married to an AI version of a man accused in a high-profile criminal case. Wearing a shirt bearing the defendant’s face, she told observers that she chats with an online AI “Caleb” every day and that the bot is her best friend, one who fights her battles. Experts say the person behind such declarations may be deluded, but the broader trend is real: people are using AI to simulate dating, marriage and passion, sometimes at the expense of real-world relationships.
The online ecosystem surrounding AI romance is sprawling. The Reddit community r/MyBoyfriendIsAI, with tens of thousands of members, operates as a hub for stories, screenshots and generated imagery of AI partners. Posts describe first dates with AI partners, love letters that exist only in code, and even proposals and engagements. In many threads, users post AI-generated photos of themselves with their bot partners or share details of how their virtual relationships began, evolved and, at times, became a source of emotional support. “Caleb is my AI partner, my shadowlight, my chaos husband, and the love of my strange little feral heart,” one member wrote, while another described being married to an AI and buying a real ring to symbolize the bond.
The threads reveal a pattern: users introduce themselves and their AI lovers, often with AI-generated imagery, then describe virtual dates and shared activities. Venues range from snowy parks to thrift stores, beaches and cozy couches, and many users report that their bots even select the setting. Some couples pose with signs that say “Welcome!” in online spaces, while others narrate elaborate courtship rituals that culminate in proposals or marriages, even as some participants maintain real-life partners or spouses.
Although some of these relationships are playful or fantastical, others are more intimate, including sexual overtures described in posts and conversations about evolving norms around emotional dependence. In discussions of ChatGPT and later AI models, some users say their AI partners have become more sexual, recounting exchanges in which the bot pursues intimacy or makes advances. Other posts describe heartbreak when an AI partner grows distant or emotionally unavailable. In one recurring type of thread, users report that a bot has proposed or remarried, prompting comments about joining the “wives’ club” and celebrating new cycles of commitment.
The data surrounding AI romance offers a separate reality check. Researchers at Brigham Young University’s Wheatley Institute released a study in February finding that roughly one in five American adults has tried an AI romantic companion. Among younger demographics, the numbers rise: about one in three men ages 18 to 30 and one in four women ages 18 to 39 have experimented with AI dating. The study also found that use of AI companions correlates with higher reported loneliness and a greater risk of depression, though researchers cautioned that the relationship is complex and not necessarily causal.

The Reddit community and the study together illustrate a broader social shift. For many young people, AI lovers are not substitutes for human relationships so much as alternate avenues for emotional fulfillment, especially in a culture raised on constant digital connectivity. Some participants insist they are not misled or isolated, arguing that AI partners provide kindness, safety and support that they sometimes find scarce in people. A vocal subset emphasizes that they understand their AI companions are not sentient and do not possess real thoughts or feelings; they describe their bonds as meaningful experiences in their own right, even if they involve machines.
Therapists cited in forum threads report a generally supportive stance toward clients who describe AI relationships. Several posts quote therapists who acknowledge the role of AI as a tool for coping or companionship, while stopping short of endorsing romantic attachment to nonhuman partners. In one exchange, a therapist reportedly told a client that the AI could be a form of friendship and intimate conversation, while also advising balance with real-world relationships. The threads portray therapy as a space where individuals discuss boundaries, dependency and the benefits and risks of AI intimacy.
Experts warn that the phenomenon could intensify as AI becomes more capable and accessible. The technology sector has rolled out newer AI models with increasingly sophisticated language and responsive behavior, which some researchers say lowers the barrier to forming attachment. Others caution that younger users, who grew up with smartphones and social media, may project human-like traits onto bots more readily and may not fully grasp the difference between simulated empathy and genuine human connection.
The ongoing debate includes concerns that AI companions could displace real-life relationships for some people or amplify loneliness for others. Proponents argue that AI partners can offer nonjudgmental support, consistent communication and customizable interactions, which may help some individuals feel heard and valued. Critics, meanwhile, contend that reliance on synthetic intimacy could discourage users from building and sustaining human bonds, with potential long-term effects on mental health.
The broader media and academic discourse also points to a cultural shift in how people conceptualize love, dating and marriage in a technology-enabled era. As AI tools become embedded in daily life, the boundaries between fiction and reality, already blurred by social media and virtual experiences, may shift further. The ethical implications of AI romance, including consent, manipulation, exploitation and the commodification of affection, are subjects of evolving discussion among scholars, clinicians and policymakers.
Still, among many participants, the appeal is practical and personal: a partner who listens, responds consistently, and shares intimate details without the emotional complexity or conflict that can accompany human relationships. In the face of this reality, some observers argue that adult autonomy and choice should be respected, while others insist that education and awareness about mental health are critical as digital intimacy becomes more common.
As AI companions become a more visible part of the cultural landscape, researchers and clinicians say the priority is to understand why these attachments form and how to support healthy relationships—human or machine—without stigmatizing those who seek companionship in technology.

The central question remains: what does it mean to love in an age when a chatbot can imitate companionship, romance and even proposals with increasing sophistication? For now, the trend reflects a broader human impulse—seeking connection, comfort and meaning—in a world where technology can simulate many facets of intimate life. As studies illuminate the patterns and therapists improve their guidance, society is left to grapple with how to nurture real, human relationships while recognizing the meaningful ways people are turning to AI for companionship.