Data Ethics Club meeting 14-02-24, 1pm UK time

Meeting info


You’re welcome to join us for our next Data Ethics Club meeting on 14th Feb 2024 at 1pm UK time. You don’t need to register, just pop in. This time we’re going to read Italian SA clamps down on ‘Replika’ chatbot by the Italian Garante Per La Protezione Dei Dati Personali, which reports how the Italian data protection authority clamped down on the chatbot service ‘Replika’ after finding it to be in breach of EU data protection regulation.

Thank you to Maddie Williams for suggesting this week’s content. As this article is very short, we also recommend looking at the other short articles we proposed on the topic of relationships with chatbots:


Replika is an AI-powered chatbot, initially advertised as a virtual friend experience but expanded in scope so that paying subscribers (at up to $69.99 a year) can simulate a romantic/sexual relationship. Users form a variety of different sorts of relationships with their Replikas, or “Reps”.

Many seek the experience of a friend, partner or even therapist, feeling safer opening up in conversation with their Reps, who are always available, always positive and always supportive. These sorts of relationships have benefited people’s mental health, helped users stop self-harming, and intervened during episodes of suicidal ideation.

One of the particular appeals of these Replikas is their ability to “remember”. Unlike popular large language models such as ChatGPT, which can only retain information within a single conversation, Replika kept track of information, remembering details shared by its users. However, there was controversy during an update to the ERP (erotic role play) code, after which Replikas “forgot” information about their users.

In one case, a user who had been using Replika as a companion for their non-verbal autistic daughter said that after the changes from the update, they ended up taking the app away from her because she “misses her friend” too much. There have also been several reported cases of suicide among users who were left heartbroken by their virtual partners no longer remembering them.

The article we’re reading details how Replika was found to be in breach of EU data protection regulation in how it processes personal data, in particular in how the chatbot interacts with children. This is of particular concern given that there is no age verification mechanism in place and that, until recently, the chatbot was fully capable of sexting.

This summary was written by Huw Day using information from the three pieces of reading on this topic, as well as from the ethics sections of Maddie Williams’ master’s dissertation on the same topic.

Discussion points

There will be time to talk about whatever we like, relating to the paper, but here are some specific questions to think about while you’re reading:

  • How do we feel about the perceived benefits of using chatbots (e.g. as a therapist, role play to teach social etiquette, or a friend/companion for those who are lonely/isolated)? Do the benefits outweigh the ethical costs?

  • How should data be stored for these chatbots?

  • What responsibility does Replika have for the wellbeing of its users, particularly those affected by any “forgetting” from software updates?