‘Afraid of social media’: South Korea women remove pictures, videos amid deepfake porn crisis
South Korea’s national assembly on Thursday passed a Bill to punish people possessing, purchasing, saving or viewing deepfake sexual materials and other fabricated videos.
SEOUL: Protests in South Korea continued on Friday (Sep 27), a day after lawmakers passed a Bill to criminalise possessing or watching sexually explicit deepfake images and videos.
Activists said they want tougher measures, justice for the victims, and accountability from the government to tackle the alarming sex crime epidemic plaguing the nation.
South Korea’s national assembly on Thursday passed a Bill to punish people possessing, purchasing, saving or viewing deepfake sexual materials and other fabricated videos with up to three years in prison or a fine of up to 30 million won (US$23,000).
The new law will also increase the maximum sentence for making sexually explicit deepfakes.
The Bill now needs President Yoon Suk Yeol's signature to become law.
However, these measures have fallen short of what activists are demanding.
Advocates said the current laws are not enough to fight sex crimes, and that policymakers need to do more with the legal system to effectively bring perpetrators to justice and to deter such conduct.
They added that the issue is even more pressing as a majority of both perpetrators and victims are teenagers.
MOST VICTIMS AND PERPETRATORS ARE TEENS
As of last Wednesday, more than 800 police reports had been filed nationwide this year for deepfake-related sex crimes, according to Yonhap.
Police have apprehended 387 suspects, more than 80 per cent of whom are teenagers, added the news agency.
About 60 per cent of the victims involved in cases investigated by the police in the past three years were also minors.
Earlier this month, South Korea police said they launched an investigation into whether encrypted messaging platform Telegram abetted the distribution of deepfake porn, including material involving underage victims.
Telegram channels – one allegedly with more than 220,000 participants – were reportedly being used to share these materials.
Deepfake porn is explicit content in which the faces of individuals are digitally superimposed onto pornographic images or videos using artificial intelligence.
Perpetrators of deepfake crimes have reportedly used social media platforms such as Instagram to save photos of victims, which were then used to create fake pornographic material.
WOMEN LIVING IN FEAR
Local news media reported some of the explicit content was created, viewed and shared by those who know the victims, including classmates and colleagues.
Protesters CNA spoke to said this has led to an atmosphere of fear and distrust among South Korean women in their own schools and workplaces.
Since the deepfake porn crisis broke out, many women have rushed to remove their pictures and videos from social media.
“I no longer post photos on social media, whether it’s my own pictures or pictures of my friends and family. Teenagers should be more worried, but I don’t think age matters because it becomes dangerous once you are exposed,” one South Korean woman told CNA.
Ms Choi Ji-hyeon is one such protester who has been rallying once a week in Seoul since last month, saying she wants the voices of female university students heard.
“The government needs to step in and take measures to solve this problem at the national level,” said Ms Choi, who is head of the human rights club at the Seoul Regional University.
“But since it is now left to the individual, we female college students have to figure this out on our own. The reality is that we have no choice but to be suspicious of our friends who we used to eat and hang out with.”
She added some of her fellow classmates even visited deepfake porn chat rooms on Telegram to make sure they were not victims.
“I’m (now) more afraid of social media, and the possibility of theft (of my personal information) and being used. We need to educate people about the law,” another protester said.
EFFORTS TO DETER DEEPFAKES
Amid public anger and demand for stronger measures, South Korean President Yoon has called for digital sex crimes to be thoroughly investigated.
Lucas Lee, director of startup Deepbrain AI, said his company launched a deepfake detection system in March, which was developed in partnership with the Korean National Police Agency to help combat such crimes in the country.
“With the advancement of technology… it has become easy to make (such content) and there are now a lot of illegal images,” he said.
“It has become difficult for individuals to distinguish videos or images with the human eye. So, we’re developing and selling solutions that can detect whether they are real or fake.”
The process, whose primary techniques include detecting lip-syncing and face-swapping tricks, takes just a few minutes.
Aside from law enforcement, companies – including entertainment agencies – are also turning to similar technologies to detect fake videos and images, after cases of actresses and singers falling victim to such crimes.
Human Rights Watch said online gender-based violence is a widespread problem in South Korea, where judges, prosecutors, police, and lawmakers – a vast majority of whom are men – do not take these crimes seriously enough.
The advocacy group has urged the government to provide comprehensive sexuality education to children and adults, and promote gender equality in the nation.