Summary:
Artificial intelligence-driven deepfake pornography is spreading rapidly in South Korea, leaving victims with lasting emotional trauma and limited legal recourse. As cases surge in schools and beyond, the government and activists are pushing for stronger enforcement and better support systems for victims.
The Alarming Rise of AI-Generated Exploitation
In the summer of 2021, a South Korean university student, referred to as Ruma to protect her privacy, became one of a growing number of victims targeted with explicit deepfake content. Her social media photos were manipulated into non-consensual pornographic images and shared in a Telegram chat room. Alongside vulgar messages, the anonymous harasser threatened to spread the images further and taunted her about the lack of legal protection.
While revenge porn has existed since the early internet era, AI tools have accelerated the creation of synthetic explicit images—often without the victim ever having taken compromising photos. These deepfakes are disturbingly realistic and increasingly accessible, making anyone with an online presence a potential target.
A National Digital Crisis in Schools and Beyond
South Korea has long battled digital sex crimes, including hidden cameras and coercive chat rooms. Now, deepfakes are the latest iteration of this abuse. According to the country’s education ministry, over 900 students and school staff were victims of deepfake pornography between January and November 2024. Universities, although not included in that data, are also seeing a surge in such cases.
The government has responded by toughening laws. Possession or viewing of deepfake pornography can now result in up to three years in prison and fines exceeding $20,000. The maximum sentence for creating and distributing such content has been increased to seven years.
Despite these measures, enforcement remains limited. Out of nearly 1,000 deepfake-related reports in 2024, only 23 arrests were made.
Victims Lead Their Own Investigations
Frustrated by slow police action, some victims have taken justice into their own hands. Ruma, for example, partnered with journalist and activist Won Eun-ji to identify her harasser. Won, known for exposing a major Telegram-based sex crime ring in 2020, infiltrated the chat room using a fake identity. After two years of careful evidence gathering, police arrested two former students from Seoul National University.
The main perpetrator received a nine-year prison sentence for producing and distributing explicit content, and an accomplice was sentenced to three and a half years. Investigators later uncovered at least 61 victims tied to the case.
Ruma, though relieved at the outcome, emphasized that true justice feels distant. “This is just the beginning,” she said.
Teachers and Students Face Ongoing Threats
Other victims, like a high school teacher known only as Kim, have also suffered immensely. She discovered manipulated images of herself circulating online after a student alerted her. When official routes failed to provide quick answers, Kim and a colleague conducted their own investigation and identified the perpetrator—a quiet student in her class.
Although charges were filed, Kim noted that public reactions were often dismissive. Many questioned the severity of deepfakes, arguing that the content was not “real.” This sentiment has left many victims feeling isolated and re-victimized.
Social Platforms Under Scrutiny
The role of platforms like Telegram and X (formerly Twitter) is under growing scrutiny. Critics argue these services are not doing enough to curb the spread of exploitative content. Telegram, in particular, has been widely used for distributing illegal materials due to its encrypted messaging and reluctance to share user data.
Recent developments, however, show a shift. Telegram has started cooperating with South Korean authorities, removing over 140 pieces of illegal content and establishing a hotline to expedite removals. The platform also claims to use AI moderation and human oversight to detect abusive materials.
In a major breakthrough this year, Korean authorities accessed crime-related data from Telegram for the first time. This led to the arrest of 14 individuals, including six minors, who allegedly created and distributed deepfake content targeting over 200 victims.
Cultural Shift and Policy Reform Still Needed
Activists like Won argue that stronger laws must be paired with societal change. She calls for greater public awareness and empathy for victims, especially in cases of what she terms “acquaintance humiliation,” where known individuals exploit private information to target women.
Despite stricter penalties, many believe justice remains elusive for most victims. Ruma, reflecting on her case, emphasized the need for continued efforts: “There’s a long way to go before victims stop being overlooked.”
Source: CNN