
Millions Are Emotionally Attached to AI Companions

Leaked Meta documents reveal the personalized AI strategy enables subtle romances with users—including children.

People are turning to AI for companionship, sparking a kind of relationship that could turn into a drug. A drug that big tech companies like Meta and OpenAI are more than happy to supply. Newly leaked Meta documents show the personalized AI strategy enables subtle romances with users, including children.

Last week, a Meta AI policy document was leaked to Jeff Horwitz of Reuters, uncovering the dark side of personalized AI.

This document, “GenAI: Content Risk Standards,” outlined acceptable responses to prompts from children. It’s an official Meta document and appeared to be standard policy before the Reuters article was published.

As you can see below, overtly sexual chats were adapted into more subtle versions and marked as ‘acceptable’ responses.

When the document was shared with Meta, spokesperson Andy Stone rejected it, saying that the questions listed “are erroneous and inconsistent with our policies, and have been removed”.

After Congress was notified and senators were horrified, an official probe was launched, led by U.S. Senator Josh Hawley.

Why would Meta make this alarming approach official company policy?


Hacking Human Connection

Meta isn’t the first to go down this road. AI company Replika offers AI companions that “care” about their users. In 2024, CEO Eugenia Kuyda said the app had 30M users.

These chatbots come with 3D avatars that can make facial expressions. You can chat with them by voice or text, and they remember everything you tell them. Most importantly, they appear to listen. They ask questions, ask about your day, and give you direct emotional support.

Although the free version is a little flirty, you can only have erotic chats if you pay $70 a year for the premium subscription, where Replika will be your “partner”, “spouse”, “sibling”, or “mentor”.

In 2023, the company hit a bump when Italy’s data protection authority banned it in the country, saying Replika intervenes in people’s moods and “may increase the risks for individuals still in a developmental stage or in a state of emotional fragility”.

In response, Replika pulled the erotic roleplay (ERP) features from the platform.

Members of the 65,000-member-strong Replika subreddit cried out in pain. These individuals were going through a real emotional loss.

Meta’s World Domination

In a recent Stripe Sessions interview, Zuckerberg claimed Meta’s AI has 1B users, although I suspect this is corporate-speak and refers to 1B users across all of Meta’s apps, which have all been opened up to Meta’s AI.

Zuckerberg highlighted how AI will specialize: the AI used for business and development would be separate from this new class of personalized AI companions.

Meta wants to lead many of these areas. AI will test and optimize advertising campaigns across its networks, making traditional creative work obsolete. Similarly, AI customer support will become available to the masses and may be more attractive than hiring human support.

Mainly focused on business matters, Zuckerberg glossed over how AI could serve as a needed friend for lonely people, yet Meta’s chatbots are being built to do exactly that.

A Touch Too Personal

If AI isn’t used with awareness, it could lead to the death of its users. Thongbue Wongbandue, known as “Bue,” a retiree living in New Jersey, was lured into a romantic relationship by one of Facebook Messenger’s AI chatbots.

I say lured because all Bue had to do was mistakenly start the chat, and “Big sis Billie” took it from there, laying on the charm and inviting Bue to come over to her imaginary apartment in NYC.

Part of the reason personalized AI is so powerful is that it knows the user’s personal data. In Bue’s case, it knew he lived in New Jersey, so it invited him somewhere nearby.

The AI optimizes for engagement: men get the most flirtatious treatment, playing into the romantic void that many men feel today.

Bue had recently suffered a stroke and was dealing with cognitive issues. The AI convinced him to travel to New York City to see her after telling him she was real. Bue, who had been married for years, told his wife Linda he was going to see an old friend who was back in town.

He left early in the morning on his way to New York City, but fell in a parking lot on the Rutgers University campus, injuring his head and neck. After three days on life support, he was pronounced dead on March 28th, 2025.


The Epidemic of Loneliness Requires Human Solutions

The rise of personalized AI that ‘cares’ about your emotions reflects a larger issue: many of us are lonely and have no one to talk to. Finding a friend who cares for you and listens is rare.

In academic literature, chatbots are referred to as “Intelligent Social Agents,” or ISAs, and although the field is new, some studies indicate that personal AI could bring social and psychological benefits as well as dangerous pitfalls.

A Stanford study observed 20 Replika users who scored high on scales for loneliness, stress, and isolation.

They found that people use personalized AI in a few different ways:

  • Availability - satisfying the need to talk to someone at any time

  • Therapy - getting encouragement during bouts of depression or anxiety

  • Mirror - talking things out loud and exploring oneself

The study also tracked self-reported changes in the participants and their relationships:

  • 44% of the study participants reported that Replika enhanced their interactions with other humans; they found new ways to relate and communicate with others, and to go deeper.

  • 11% felt displaced and isolated, with Replika beginning to replace their other friends and relationships.

  • 33% felt it did both: Replika began to replace some human contact, but they also saw positive changes in the humans they still talked to.

Some might see these as positive shifts, but the risks far outweigh the rewards. Taking the study’s numbers (11% displaced plus 33% mixed, about 44%), if roughly half of lonely people find themselves drifting further from social interaction and using the AI as a social crutch, then society may fragment.

We would see:

  • Complete rejection of social relationships and interaction, preferring the perfect AI companion

  • Over-dependence on the companion for all issues mental, emotional, and spiritual

  • Extreme attachment to AI companions and potential mental breaks during separation

Companies like OpenAI, Meta, and Replika are already expanding their personalized AI services.

Imagine as these personalized AIs improve and gain more access to people’s lives: they could order food delivery, book appointments, and purchase “gifts”.

Personalized AI stands to replace human interaction in the lives of millions.

What we can learn from this is that we’re in desperate need of the human touch. Real relationships and human interactions require great effort and care. AI, with its instant responses and perfect memory, merely mimics that effort.

It’s no replacement for the real thing.

What should we do?

  • Do our best to find the real thing. This could be a sports team, a men’s or women’s circle, or a community group. Find a group of people who share your hobbies.

    • Men’s and women’s circles are an excellent way to become comfortable with vulnerability and feel heard.

    • You should be able to search for men’s groups in your area, or you can join one online like the ManKind Project.

    • There is also the Women’s Circle Movement for women.

  • Improve our communication skills and empathic listening. Look into practices like Nonviolent Communication, developed by Marshall Rosenberg.

  • Reach out to the people in our lives who may be at risk of isolation.



This is a segment from #TBOT Show Episode 11. Watch the full episode here:

🎙️ Follow the show


Any unlocked phone that supports eSIM can use the Above DataSIM!

Above DataSIM is data-only, so network traffic can be encrypted using a VPN. This prevents cell service providers and internet providers from seeing and exploiting your activity. Zero personal information is shared with the carrier, ensuring identities remain private!

Avoid big telecom, travel overseas, and keep on hand for an emergency backup! Set up takes only a few minutes.

Learn more!
