South Korean authorities called on Telegram and other social media platforms on Wednesday to work with them to delete and block sexually explicit deepfake content, seeking to quell public and political outrage over the problem.
The steps follow reports by several domestic media outlets that sexually explicit deepfake images and videos of South Korean women were often found in Telegram chatrooms.
A 24-hour hotline for victims will also be set up and the number of regulatory personnel monitoring digital sex crimes will be doubled from the current number of 70, the Korea Communications Standards Commission said.
The Korean National Police Agency also said it will make a seven-month push to crack down on online sex crimes.
The media watchdog plans to set up a consultative body to enhance communication with social media firms about deleting and blocking sexual deepfake content, its chairman, Ryu Hee-lim, told a meeting on the issue.
For companies that don't have offices in South Korea, it wants to set up a face-to-face channel for regular consultation.
"Production, possession and distribution of deepfake sex crime videos are a serious crime that destroys the individual dignity and personal rights," Ryu said.
In addition to Telegram, the commission said it would be seeking cooperation from X as well as Meta's Facebook and Instagram and Google's YouTube.
Telegram said it actively moderates harmful content on its platform including illegal pornography.
"Moderators use a combination of proactive monitoring of public parts of the platform, sophisticated AI tools and user reports in order to remove millions of pieces of harmful content each day," it said in a statement.
The other companies did not respond to Reuters requests for comment.
Criticism of Telegram in South Korea has coincided with the weekend arrest of Pavel Durov, Telegram's Russian-born founder, as part of a French probe into child pornography, drug trafficking and fraud on the encrypted messaging app.
The number of deepfake sex crime cases in South Korea has surged from 156 in 2021 when data was first collated to 297 so far this year, with most of the perpetrators being teenagers, according to police.
The victims are usually female and include school students as well as soldiers in South Korea's military.
This year, South Koreans have made more than 6,400 requests for help from the Korea Communications Standards Commission to have sexually explicit deepfake content taken down. That compares with nearly 7,200 cases last year in which the commission agreed to help take down content.
Telegram is now the main platform of choice for perpetrators of sexually explicit deepfake content, said Kim Yeo-jin, the head of the Korea Cyber Sexual Violence Response Center, adding that police need to do more.
In many cases, victims have been told by police that filing a report wouldn't be effective because the content was on Telegram, making it tough to catch the criminals, she said.
Police officials did not respond to a Reuters request for comment.
Information on how social media firms respond to requests by South Korean authorities is hard to come by.
But police data published by lawmaker Kim Young-bae in 2020 showed that police made seven requests to Telegram via email for help with investigations into digital sex crimes between February and August of that year, and Telegram did not respond to any of them.