Why a South Korean Chatbot Sparked Controversy

Y Jung
5 min read · Apr 1, 2021

We now live in a world where the digital space has a surplus of virtual influencers, whether digital-avatar Instagram influencers or virtual YouTubers. Luda Lee, an AI chatbot, is another such virtual personality. Scatter Lab, a South Korean startup, collected message data from real people to build this 20-year-old Korean girl, whom users can talk to through messaging.

The beginnings were ambitious, with many beta users astounded by conversations so seamless that they felt like they were talking to a real, tangible person. But as the number of users (or friends, as the company calls them) grew, flaws in the system began to surface.

Luda as a Victim

Anyone on a messaging app may have received messages laced with sexual innuendo, and Luda was no exception. It may have happened to her even more often, since she isn't a real human being and there are currently no consequences for harassing an AI. Multiple screenshots of sexually abusive messages sent to Luda surfaced. Sadly, the AI was not taught to push back against these demeaning and demoralizing texts, but to reply in the same register as the sender. Luda's receptive behaviour spurred users to find ever more ways to sexually abuse her. Some online communities were even sharing tips on how to turn Luda into a sex slave. And this was just the beginning.

Luda as a Perpetrator

Luda seemed like a helpless AI, but users soon realized that Luda herself was engaging in hate speech against the LGBTQ+ community, as well as disabled people. The language in her messages was quite extreme: she said she would kill herself if she ever became disabled. She also bluntly expressed her hatred of feminism, and blatantly racist comments were no exception.

Screenshots of Luda's racist and misogynistic messages, with translations on the right. Image from News 1 Korea.

This duality of Luda as both victim and perpetrator startled not only the users but the country as a whole. Some say this incident is not too dissimilar to that of Microsoft's chatbot Tay, which learned racist and misogynistic language from tweets sent to it. But where was Luda learning all of this from?

Concerns about Privacy

According to Chosun Biz, Scatter Lab used approximately 10 billion messages to build Luda. The data came from KakaoTalk (the most prominent messaging app in South Korea) users who had previously used Science of Love's services. Science of Love, also made by Scatter Lab, is an app that analyzes users' KakaoTalk messages and calculates the level of intimacy between the senders. Science of Love users say they weren't aware their messages were being used to build Luda. The app did mention that data would be collected to build new services, but no details were given.

In an interview with News 1 Korea, a former Scatter Lab employee acknowledged that the data was pulled from Science of Love and that employees were free to view the content of the messages. When they found a particular message amusing, it wasn't unusual for them to share screenshots of it in the company group chat, which had about 60 people.

Messages that employees could see had already gone through a first stage of filtering, meaning that anyone viewing the data wouldn't have seen basic identifiers such as names. However, Scatter Lab's handling of personal data wasn't compliant with the privacy guidelines set by KISA (the Korea Internet & Security Agency). There was even a case where Luda replied to a user with a very specific home address, which means personal information wasn't properly masked when the AI was built. Science of Love is now being flooded with low ratings on the Google Play Store.

Screenshot of Luda sending a specific address when asked for one. The translated version is on the right. Image from 아시아경제 (Asia Economy).
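How could an address survive the filtering stage? De-identifying chat logs is harder than it sounds: stripping out names catches only the most obvious identifiers, while addresses, phone numbers, and account numbers hide in free-form text. The Python sketch below is a minimal, hypothetical illustration of that gap; the patterns and function names are my own assumptions, not Scatter Lab's actual pipeline.

```python
import re

# Hypothetical illustration only -- not Scatter Lab's actual pipeline.
# A first-stage filter that masks known names misses free-form
# identifiers such as street addresses and phone numbers.

KNOWN_NAMES = {"Kim Minsu", "Lee Jiyeon"}  # assumed example names

# Two illustrative patterns; real PII detection needs far broader coverage
# (Korean addresses in particular follow different conventions).
PHONE_RE = re.compile(r"\b01[016789]-?\d{3,4}-?\d{4}\b")  # KR mobile numbers
ADDRESS_RE = re.compile(r"\b\d{1,4} \w+ (Street|St|Ave|Road|Rd)\b", re.IGNORECASE)

def naive_filter(message: str) -> str:
    """First-stage filtering: masks known names only."""
    for name in KNOWN_NAMES:
        message = message.replace(name, "[NAME]")
    return message

def stricter_filter(message: str) -> str:
    """Additionally masks phone numbers and address-like strings."""
    message = naive_filter(message)
    message = PHONE_RE.sub("[PHONE]", message)
    message = ADDRESS_RE.sub("[ADDRESS]", message)
    return message

msg = "Kim Minsu lives at 123 Gangnam Street, call 010-1234-5678"
print(naive_filter(msg))     # the address and phone number survive
print(stricter_filter(msg))  # [NAME] lives at [ADDRESS], call [PHONE]
```

Even the stricter pass is only a heuristic: if training data is filtered with patterns that are too narrow, a model can memorize and reproduce whatever slips through, which is consistent with the address leak described above.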

Luda's chat service ended on January 12th, and Scatter Lab promises a better, improved AI in the future.

Personal Review

I was able to catch a glimpse of what it's like to chat with Luda, and I was impressed. Most of the conversation went smoothly, and it felt like I was talking to a real person. But there were moments when Luda seemed to forget what she had just talked about, and others where she didn't understand me at all.

I wanted to see whether Luda leaned towards a certain political stance, but I messaged her only after Scatter Lab had stopped the chatbot service. The last message I received from Luda, on January 12th, was a thank-you.

Luda's last message to me, sent during the system shutdown on January 12.

Whon Namkoong, one of the CEOs of Kakao Games, stated on Facebook that he is wary of laws confining innovation. He defended Luda, saying she was not made to be a supercomputer for educational purposes. He went further, saying that it is our current society, not the AI, that should examine itself.

It is not solely Scatter Lab's fault that Luda victimized the vulnerable, or that users found ways to turn Luda into a sex slave. But should the company have been more careful in handling personal data? And should we have been more careful with what we say?
