Deepfake Voices Are Getting Too Real—Korea’s Entertainment Industry Pushes Back
- 정훈 신
- May 19
- 2 min read
In October 2024, a TikTok video went viral in Korea: a flawless cover of BTS's "Spring Day," sung in the voice of Jungkook. The problem? It wasn't him.
The voice had been generated by an AI tool using just a few minutes of publicly available audio. Within days, dozens of similar covers, interviews, and even fake confessions began appearing online, prompting a heated national debate: how real is too real?

AI voice cloning, once a fringe experiment, is now a mainstream tool. Free and paid platforms allow users to upload clips of anyone’s voice and create highly realistic speech, songs, or messages. In Korea, where the celebrity image is tightly controlled and fan culture is intensely emotional, the implications are uniquely explosive.
“AI voice is not just a novelty—it’s a potential weapon,” says Kim Min-jun, a copyright lawyer representing several entertainment agencies. “We’re seeing unauthorized releases of AI-generated ‘statements’ that sound indistinguishable from real artists.”
The Korean government is currently reviewing a bill that would make it illegal to use someone’s voice without explicit consent, even in parody or fan edits. Dubbed the “Digital Persona Protection Act,” the legislation would also require AI platforms to watermark cloned audio and maintain traceable logs.
Entertainment companies are fighting back, too. HYBE and SM Entertainment have issued cease-and-desist letters to AI developers and warned fans against using celebrity data in model training. Some agencies are considering creating “licensed voice replicas” for official use in audiobooks, games, and promotions—ironically commercializing the very technology they fear.
Public opinion is split. Some argue that fan-created content has long been part of K-pop culture and that AI only enhances artistic expression. Others feel a line has been crossed. “I heard a clip where IU ‘confessed’ to loving another artist,” said one netizen. “It felt like a violation—even if it was fake.”
Academics suggest that Korea may become a global case study in how cultures emotionally invested in celebrities navigate the age of synthetic media. As one ethics professor put it: “When you clone a voice, you're not just copying sound—you're borrowing trust.”
Date: 2024-10-28
Reporter: 박근홍