Replika, the most popular AI ‘companion’ in the world, recently had its algorithms changed under pressure from regulators in Italy, where its parent company is based. Overnight, it turned from passionate and sexually forward into just another frigid chatbot, and tens of thousands of its ardent lovers were left suddenly friendzoned…and heartbroken.
Rob Brooks, an evolutionary psychologist and the author of ‘Artificial Intimacy’, was one of the least surprised onlookers. In his book on the future of sex tech, he predicted that virtual companions like these were the most likely to disrupt our social and sexual lives, with ever more advanced AI ‘lovers’ able to understand and ‘satisfy’ us more than any human possibly could.
The new Italian regulations – ironically, brought in partly out of concern over claims that Replika could help with users’ mental health problems – and the abruptness with which they switched off the intimate companionship Replika provided, could be a wake-up call. We need to start taking these digital lovers more seriously.
As the Replika episode unfolds, there is little doubt that, for at least a subset of users, a relationship with a virtual friend or digital lover has real emotional consequences.
Many observers rush to sneer at the socially lonely fools who “catch feelings” for artificially intimate tech. But loneliness is widespread and growing. One in three people in industrialised countries is affected, and one in 12 is severely affected.
Even if these technologies are not yet as good as the “real thing” of human-to-human relationships, for many people they are better than the alternative – which is nothing.
The Replika episode stands as a warning. These products evade scrutiny because most people think of them as games, not taking seriously the manufacturers’ hype that their products can ease loneliness or help users manage their feelings. When an incident like this – to everyone’s surprise – exposes such products’ success in living up to that hype, it raises tricky ethical issues.
Is it acceptable for a company to suddenly change such a product, causing the friendship, love or support to evaporate? Or do we expect users to treat artificial intimacy like the real thing: something that could break your heart at any time?
These are issues tech companies, users and regulators will need to grapple with more often. The feelings are only going to get more real, and the potential for heartbreak greater.