Agency warns about deepfake videos of K-pop group TWICE


The agency of K-pop group TWICE warned about the spread of deepfake videos involving the members. 

South Korea has been grappling with the spread of deepfake porn videos. The police have started investigating reports of deepfake porn videos in schools nationwide, including elementary schools. 


TWICE (Instagram)

The names of 300 schools in South Korea were posted in a Telegram chatroom for reporting deepfake cases, the Korea JoongAng Daily reported. The police have started investigations in Seoul, Incheon and South Jeolla. 

In a notice, JYP Entertainment, TWICE’s agency, said, “We are gravely concerned about the recent spread of deepfake (AI-generated) videos involving our artists.”

“This is a blatant violation of the law, and we are in the process of collecting all relevant evidence to pursue the strongest legal action with a leading law firm, without leniency,” the agency said. 

It warned, “We want to make it clear that we will not stand by while our artists’ rights are violated and will take decisive action to address this matter to the fullest extent possible.”

“A deepfake refers to a specific kind of synthetic media where a person in an image or video is swapped with another person's likeness,” according to a July 2020 article posted on the Massachusetts Institute of Technology (MIT) Sloan School of Management website. 

It added, “The term ‘deepfake’ was first coined in late 2017 by a Reddit user of the same name. This user created a space on the online news and aggregation site, where they shared pornographic videos that used open source face-swapping technology.”

Another report by the Korea JoongAng Daily stated that Koreans are the No. 1 target for deepfake porn creators, citing a 2023 report by cybersecurity firm Security Hero. 

“Based on data collected from ten pornographic websites and other video platforms such as YouTube and Dailymotion, the report concludes that Korea is the most targeted country for deepfake pornography,” the news report stated. 

It added, “For deepfake pornographic content based on celebrities, Koreans accounted for 53 percent of all content identified in the study, a figure well north of that of the United States, the second-most vulnerable at 20 percent. Japan and Britain followed at ten and six percent, respectively. Korean celebrities accounted for the seven most-affected individuals, and eight of the top ten. Deepfake content created using images of the most targeted individual has surpassed five million views, with a total of 1,595 videos produced.”