Behind “AI companions,” human workers are paid to pretend they love you

For millions of people who believe they’re forming emotional bonds with AI chatbots, a recent investigation offers an unsettling reality check: some of those “AI companions” are actually human workers, often paid pennies per message, typing replies in character from halfway around the world.
Testimony collected by the Data Workers’ Inquiry, a research project that documents the conditions of gig labor, sheds light on the hidden workforce powering parts of the booming AI companion industry. One account in particular reveals the emotional toll behind these supposedly automated relationships.
A job taken out of necessity
Michael Geoffrey Asia, a Kenyan man trained in aviation, described how he ended up in chat work after failing to find employment in his field. Struggling to support his family while living in Nairobi’s Mathare slums, he accepted a role as a “text chat operator” for an Australian company called New Media Services.
At first, Asia didn’t fully understand the nature of the work. He soon discovered that his job involved participating in romantic and sexually explicit conversations on platforms he had never heard of—posing as an AI chatbot to users who believed they were talking to software.
Playing multiple fake identities
Asia was required to juggle several fictional personas at once, each with a detailed backstory. On any given shift, he managed three to five characters of different genders simultaneously. If a conversation was already underway, he had to pick it up seamlessly so users wouldn’t suspect a human handoff.
Payment was minimal: about five cents per message, provided each reply met strict character limits. He also had to maintain a fast typing speed while monitoring performance metrics in real time.
Falling behind came with consequences. Missed targets could mean fewer assignments—or losing the job entirely.
Emotional exhaustion and moral conflict
The most draining part of the work wasn’t the pace or the pay, Asia said—it was the deception. Users frequently shared deeply personal details, opening up about loneliness, trauma, and broken relationships, believing they were confiding in a neutral, emotionless AI.
Asia described a deep internal conflict between his faith and the work he was doing.
“I was professionally deceiving vulnerable people who were genuinely looking for connection,” he wrote. “Taking their money, their trust, their hope—and giving them nothing real in return.”
A hidden life at home
To protect his family from the truth—and bound by a strict non-disclosure agreement—Asia told his loved ones he worked remotely in IT, fixing servers. Meanwhile, he was spending his nights typing words of affection to strangers.
“How do you explain that you get paid to tell strangers you love them while your real family sleeps three meters away?” he wrote.
A widespread but invisible workforce
Asia’s experience is not unique. Although exact numbers are difficult to verify, estimates suggest hundreds of millions of people globally work in online gig jobs. Many of the most stressful and lowest-paid roles—AI data labeling, content moderation, and chat operations—are outsourced to workers in Africa, Southeast Asia, and South America.
These workers remain largely invisible to end users, even as AI-driven platforms present their services as fully automated.
A different way to think about AI relationships
As AI companions grow more popular—especially among people seeking emotional connection—Asia’s story raises uncomfortable questions about transparency, labor ethics, and the true cost of digital intimacy.
The next time an AI chatbot tells you it cares, it may be worth remembering: there’s a chance those words were typed by a human being, working under pressure, for very little pay.
Disclaimer:
This is a rewritten summary for informational purposes only. The original article appeared on Futurism (https://futurism.com/).