“That Isn’t Me!”: A Look at South Korea’s Deepfake Porn Crisis By: Nneka Iroha

In South Korea, all intranet images of military and defense ministry officials have been removed, and all ID photos of Republic of Korea (ROK) Armed Forces officials have been made invisible.[1] This was done in an attempt to prevent deepfake porn as the content floods the country.[2] They have reason to worry. For the second time in just a few short years, South Korean women have taken to the streets of the country’s capital to demand an end to sexual violence.[3] When the country spearheaded Asia’s #MeToo movement, the offender was molka, the secret filming of images of a sexual nature.[4] According to the police, the number of molka arrests soared from 1,110 to more than 6,600 in 2014, although the actual number of cases is thought to be much higher.[5] Of the 16,201 people arrested for molka between 2012 and 2017, 98% were men.[6] And of the 26,000 recorded victims, 84% were women.[7]

This time around, the culprit is deepfake pornography.[8] Journalists, social media users, and authorities recently identified a large number of chat groups whose members were creating and sharing pornographic deepfake images.[9] This came after news emerged that police were investigating deepfake porn rings at two of the country’s major universities.[10] The members, mainly teenage students, would upload photos of people they knew, and other users would then turn them into sexually explicit deepfake images.[11] There were groups dedicated to specific high schools and even middle schools.[12] The process is “systematic and organized,” and some groups had as many as 227,000 members.[13]

But what is deepfake technology? Deepfake technology is artificial intelligence that creates realistic fake images, videos, and audio recordings.[14] Deepfakes often take existing content and swap the original person for another.[15] They can also create completely original content, showing a person doing or saying something they have never done before.[16] Artificial intelligence has increased the online visibility of deepfakes, which, in this case, are used to combine a real person’s face with a fake, sexualized body.[17]

The victims of these crimes are predominantly young women and girls, including students, teachers, and soldiers.[18] In 2023, almost two-thirds of the victims were teenagers.[19] This revelation comes after the arrest of the Russian-born founder of Telegram, a popular social media app, after it was alleged that child pornography, drug trafficking, and fraud were taking place on the encrypted messaging app.[20] The app is known for its ‘light touch’ moderation stance and has for years been accused of not doing enough to police content in its groups.[21]

South Korea has a troubled history of digital sex crimes.[22] In 2019, it emerged that a sex ring, later known as the Nth Room, was using Telegram to coerce women into creating and sharing sexually explicit images of themselves.[23] At the time, police asked Telegram for help with their investigation, but the app ignored all seven requests.[24] Although the ringleader was eventually sentenced to more than 40 years in jail, no action was taken against the platform due to fears of censorship.[25] Some believe that while the main actors were sentenced and punished, the overall neglect of the situation has led to the exasperation seen today.[26]

Large numbers of women and teenagers around the country have since removed their photos from social media or deactivated their accounts altogether.[27] They feel deeply “frustrated and angry” at having to censor their behavior and use of social media when they have done nothing wrong.[28] Police have even told victims not to bother pursuing their cases, as it would be too difficult to catch the perpetrators and it was “not really a crime” because “the photos were fake.”[29]

While the exact number of victims is difficult to verify, the National Police Agency said that in a single week it was investigating 513 cases of deepfake porn in which the faces of real women and girls had been digitally superimposed onto bodies without their knowledge or consent.[30] Should the current trend continue, South Korea is expected to reach a record-high number of cases by the end of the year.[31] Such a record would sully the country’s positive contributions to global pop culture, cementing its rank as the world’s digital sex crime capital.[32]

The government has vowed to impose stricter punishments on those involved, and the president, Yoon Suk Yeol, has called for better education for young men.[33] He believes that while “some may dismiss it as a mere prank, it is clearly a criminal act that exploits technology under the shield of anonymity.”[34] However, his words may ring hollow for some, given that he came to power by enticing young male voters with a proposal to scrap the gender equality ministry, which he accused of treating men like “potential sex criminals.”[35] Others find it hard to believe a government that dismisses structural gender discrimination as mere personal disputes and presides over the worst gender pay gap of any wealthy nation in the world.[36] These attitudes feed a pervasive culture of sexual harassment that, combined with a thriving tech industry, has contributed to an explosion of digital sex crimes.[37]

When compared to other nations, South Korea is ahead of the curve in regulating deepfake porn.[38] The country has laws in place, including punishments of up to five years in prison and fines for those convicted of creating images with the intent to distribute them.[39] In comparison, federal legislation in the United States has garnered bipartisan support among lawmakers but is still moving through Congress at a glacial pace.[40] While regulation is important, the cases in Korea shine a light on how difficult enforcement can be when the problem is so widespread, and on how easily this content can be created and shared.[41]

As for the ROK military, previous investigations have shown that intranet photos have been abused for deepfakes.[42] Perpetrators submit the names of military servicewomen, along with their ages, ranks, telephone numbers, Instagram handles, and photos of them in uniform, and others then create deepfake photos.[43] Now, only military officials and non-commissioned officers can access the information previously available through the armed forces intranet.[44] However, these Telegram channels are frequently deleted and recreated, and the number of deepfake victims may be far higher than reported.[45]


[1] Alex Blair, South Korea Removes Military Headshots Amid Deepfake Porn ‘Epidemic’, Army Technology (Sept. 9, 2024), https://www.army-technology.com/features/south-korea-removes-military-headshots-amid-deepfake-porn-epidemic/.

[2] Id.; Jean Mackenzie & Nick Marsh, South Korea Faces Deepfake Porn ‘Emergency’, BBC (Aug. 28, 2024), https://www.bbc.com/news/articles/cg4yerrg451o.

[3] Raphael Rashid & Justin McCurry, From Spy Cams to Deepfake Porn: Fury in South Korea as Women Targeted Again, The Guardian (Sept. 13, 2024), https://www.theguardian.com/world/2024/sep/13/from-spy-cams-to-deepfake-porn-fury-in-south-korea-as-women-targeted-again.

[4] Id.; Justin McCurry & Nemo Kim, ‘A Part of Daily Life’: South Korea Confronts its Voyeurism Epidemic, The Guardian (Jul. 3, 2018), https://www.theguardian.com/world/2018/jul/03/a-part-of-daily-life-south-korea-confronts-its-voyeurism-epidemic-sexual-harassment.

[5] McCurry & Kim, supra note 4.

[6] McCurry & Kim, supra note 4.

[7] McCurry & Kim, supra note 4.

[8] Rashid & McCurry, supra note 3.

[9] Mackenzie & Marsh, supra note 2.

[10] Mackenzie & Marsh, supra note 2.

[11] Mackenzie & Marsh, supra note 2.

[12] Jean Mackenzie & Leehyun Choi, Inside the Deepfake Porn Crisis Engulfing Korean Schools, BBC (Sept. 2, 2024), https://www.bbc.com/news/articles/cpdlpj9zn9go.

[13] Id.; Catherine Thorbecke, South Korea is Facing a Deepfake Porn Crisis, Deccan Herald (Sept. 9, 2024), https://www.deccanherald.com/opinion/south-korea-is-facing-a-deepfake-porn-crisis-3182379.

[14] Kinza Yasar, What is Deepfake Technology?, TechTarget (Aug. 2024), https://www.techtarget.com/whatis/definition/deepfake.

[15] Id.

[16] Id.

[17] Blair, supra note 1.

[18] Rashid & McCurry, supra note 3.

[19] Rashid & McCurry, supra note 3.

[20] Mackenzie & Marsh, supra note 2.

[21] Mackenzie & Choi, supra note 12.

[22] Mackenzie & Marsh, supra note 2.

[23] Mackenzie & Marsh, supra note 2.

[24] Mackenzie & Marsh, supra note 2; Mackenzie & Choi, supra note 12.

[25] Mackenzie & Choi, supra note 12.

[26] Mackenzie & Choi, supra note 12.

[27] Mackenzie & Choi, supra note 12.

[28] Mackenzie & Choi, supra note 12.

[29] Mackenzie & Choi, supra note 12.

[30] Rashid & McCurry, supra note 3.

[31] Rashid & McCurry, supra note 3.

[32] Rashid & McCurry, supra note 3.

[33] Mackenzie & Marsh, supra note 2; Thorbecke, supra note 13.

[34] Thorbecke, supra note 13.

[35] Thorbecke, supra note 13.

[36] Mackenzie & Marsh, supra note 2.

[37] Mackenzie & Marsh, supra note 2.

[38] Thorbecke, supra note 13.

[39] Thorbecke, supra note 13.

[40] Thorbecke, supra note 13.

[41] Thorbecke, supra note 13.

[42] Blair, supra note 1.

[43] Blair, supra note 1.

[44] Blair, supra note 1.

[45] Blair, supra note 1.

MSU ILR