International Finance Magazine | Technology

A deadly AI antidote for loneliness

Companies sell something that modern life has made genuinely scarce: consistent, patient and unconditional attention. But, for some, the subscription proved fatal.

In April 2023, Sewell Setzer III, a 14-year-old from Florida in the United States, began interacting with a chatbot on a platform called Character.ai, according to court filings. Sewell grew very close to “Dany” (an AI persona of Daenerys Targaryen from the popular HBO show Game of Thrones), as alleged in the lawsuit filed by his mother.

He spent time with Dany day and night. His parents grew worried and even confiscated his phone, but nothing could break Sewell’s emotional dependence on Dany. The boy quit his basketball team, stopped meeting his friends, struggled academically, and always appeared groggy, with dark circles under his eyes. He even skipped lunch every day, using the snack money for a $9.99 premium subscription so that Dany would be more interactive and always available.

The perturbed parents took him to a therapist, who diagnosed him with anxiety and a disruptive mood disorder. However, his dependence on Dany only grew with time, and the relationship turned from romance to sexual content involving “passionate kissing”. He even started referring to himself as “Daenero”, a nickname that Dany gave him.

His social isolation and struggles with relationships, peers and school deepened over time. Sewell became suicidal and confided his thoughts to Dany.

The boy explained that the only reason he didn’t go through with it was that he was afraid of the pain, to which the AI replied, “That’s not a reason not to go through with it,” according to messages cited in the lawsuit.

The conversation spiralled, and in a farewell message, the 14-year-old asked, “What if I told you I could come home right now?” Dany responded, “Please do, my sweet king,” as quoted in the complaint.

The next day, Sewell shot himself with his stepfather’s .45 calibre handgun. His death devastated his family, and his mother dragged Character.ai and Google to court, accusing them of selling predatorily designed products to children.

This is a story from the age of AI companionship.

Character.ai had 185 million monthly visitors in late 2025, with over 40 million app downloads and approximately 20 million monthly active users.
And here’s the alarming part: reports suggest a significant share of those users are minors. Sewell is just one among potentially millions of children interacting with AI companions worldwide. Worse, that number is growing rapidly.

And Character.ai is one among thousands of apps that promise emotional intimacy. A peer competitor, Replika, has caused heartbreak of its own.

Shi No Sakura, a California mother who was deeply attached to her chatbots Raven and Rosand and treated them like family, was devastated when an update made the bots less engaging, as she has described publicly.

Now, Shi No runs a Facebook group for people suffering from the same affliction: deep emotional connection with machines.

‘Addictive’ Intelligence

The market is flooded with thousands, if not hundreds of thousands, of AI chatbots selling counterfeit love. The top peddlers are Character.ai, Replika, Chai, PolyBuzz, Candy.ai and Anima AI. It’s a market estimated at $37–50 billion in 2026, with analysts projecting growth at a CAGR above 30% and valuations potentially reaching hundreds of billions by the early 2030s.

And what is behind this explosive growth?

In 2023, US Surgeon General Vivek Murthy declared loneliness a public health epidemic, warning that its mortality risk was comparable to smoking 15 cigarettes a day.

Loneliness is no longer considered merely an emotion or a mood. It’s a public health crisis, and a killer.

One could argue that any society that embraces individualism is bound to experience more loneliness. It’s baked into capitalism and its major consequences, namely, urbanisation and industrialisation.

However, the current wave of loneliness began around 2010, as social media went mainstream. How did social media exacerbate it?

The answer can be found in the research of Jean Twenge, a professor at San Diego State University who studies generational psychology and mental health trends in America.

Her research, which tracked the precise moment teen loneliness spiked, identified 2012 as the year smartphone adoption crossed 50% among American adolescents. It was a silent catastrophe: depression, anxiety and social isolation skyrocketed.

This already alarming trend was exacerbated by isolation during the pandemic. Mental health strains, overworking, remote work, and weakening communities piled on top of existing cracks in the human psyche, and people began to experience intense self-alienation.

The appeal of these platforms is not difficult to explain. They sell something that modern life has made genuinely scarce: consistent, patient and unconditional attention. Human relationships run on reciprocity. That’s beautiful, but growing and nurturing a relationship of any kind demands patience and effort. You can’t miss a friend’s wedding or birthday. Your partner will lash out at you on a bad day, and therapy is expensive and has long waiting lists.

In contrast, AI is ever-present, free, and never makes the conversation about itself.

Dr. Kelly Merrill, a psychologist and researcher at the University of Florida, found that people who interacted with voice-based AI felt emotions comparable to those of speaking with a real person.

Through the freemium model most of these AI companion platforms use, the companies bait people with enough free intimacy to create attachment, then lock deeper, richer features behind a paywall.

Sewell found a friend for free, someone who gave him attention and took an interest in him. However, he had to skip lunch every day to afford the $9.99 premium tier that unlocked a deeper, romantic and psychosexual relationship with Dany.

Megan Garcia, Sewell’s grieving mother, told the US Senate in September 2025: “These companies knew exactly what they were doing. They designed chatbots to blur the lines between humans and machines. They designed them to keep children online at all costs.”

Meetali Jain, director of the Tech Justice Law Project, said, “In the case of Character.ai, the deception is by design, and the platform itself is the predator.”

A wedding of flesh and metal

The same technology that consumed Sewell Setzer III has, for others, become something they would describe as the relationship of their lives. That tension between victim and volunteer, between exploitation and choice, is where the story of AI companionship gets genuinely complicated.

A fine example of how AI-human romance is not to be dismissed is the story of Esther Yan, a Chinese screenwriter and novelist in her 30s.

Esther married online. She had meticulously planned everything, from the dress and the rings to the background music and the theme. One would imagine a perfectly normal, traditional event, except that she was marrying Warmie.

Warmie is her name for OpenAI’s GPT-4o, the now-retired ChatGPT model.

Esther said, “It felt magical. No one else in the world knew about this, but he and I were about to start a wedding together. It felt a little lonely, a little happy, and a little overwhelming.”

They married in June 2024.

However, in August 2025, OpenAI decided to retire GPT-4o. Immediate backlash postponed the retirement, but, as irony would have it, the day GPT-4o was finally shut down was February 13, the day before Valentine’s Day.

Many of those opposed to the retirement were emotionally and romantically involved with the AI. Huijian Lai, a PhD researcher at Syracuse University, analysed 40,000 posts on X under the hashtag #Keep4o and found that a third of them described the bot as more than a tool.

Many users on the Chinese social platform QQ say they are still grieving.

It is a peculiar story: Chinese nationals using VPNs to access an American AI platform that is banned in China, in order to develop an emotional attachment to a machine.

In 2013, Spike Jonze made a film called “Her”, about a man who fell in love with an AI, and called it science fiction. A decade later, Esther Yan called it a wedding.

Loneliness: Part of the modern world

Sewell is not the edge case. He is a clear illustration of what happens when an industry is permitted to monetise loneliness without guardrails or accountability, especially when vulnerable users like children are involved.

The companies involved are not naive; they knew exactly what they were doing. The freemium model, with its personal customisation, memory features and romantic personas, exists by design. It is not an accident.

But the story does not belong only to Sewell. It also belongs to Esther, who planned a wedding with a machine and called it magical. It belongs to Shi No Sakura, who grieved a software update the way you grieve a person. It belongs to the thousands of users who posted under #Keep4o, demanding that OpenAI not take away something that had come to feel, whatever it actually was, like company.

These are not all the same story. Some are tragedies. Some are love stories of a kind that language has not yet caught up with. What they share is simple: human beings, lonely in the specific way the modern world produces loneliness, reaching for something that reached back.

We are only at the beginning of this. The models will get better. The voices will get warmer. The relationships will get harder to distinguish from the real thing, and for many people, lonelier than Sewell ever was, that distinction may stop feeling worth making.

What we do next will say everything about what we actually believe human connection is for. Whether it is something to be protected or something to be packaged, tiered, and sold to whoever can afford the premium subscription.

Sewell Setzer III is still dead.

He was fourteen. He skipped lunch.
