BTS’ V and Jungkook are at the center of a deepfake storm on TikTok, and fans are seriously fed up. AI edits of the two, showing them in near-naked, intimate poses, have spread like wildfire. They look real, but they’re not: people are using AI to swap faces and bodies, then uploading the results for everyone to see. Some accounts are even posting tutorials so others can join in.
We gotta bring bullying back bc why tf you guys are creating fake intimate images of you and bts members????? This is just weird asf pic.twitter.com/jqdVLdFILp
mel IS SEEING BTS (@HoneybeeMell) May 5, 2026
Fans aren’t having it. What started on TikTok has spilled over to X, where ARMY is calling the edits “creepy” and “disrespectful.” The biggest worry? Both V and Jungkook are very active online, and there’s a real chance they’ll run into this content themselves. One disgusted fan wrote, “Why are you creating fake intimate images of you and BTS members? This is just weird.” Another said, “It’s delusional and would make them uncomfortable if they saw it. The fact they’re on TikTok & could see that makes me sick.”
it’s so delusional and would make them uncomfortable if they saw. the fact they’re on tiktok & could see that makes me sick to my stomach
Courtney (@rkonightmare) May 5, 2026
Now, people want HYBE and BigHit Music to step in and shut this down. ARMY’s calling these deepfakes digital harassment, throwing around terms like defamation, invasion of privacy, and non-consensual imagery. Fans are tagging the companies nonstop, demanding they take legal action and hold the creators accountable. One post spelled it out: “These people could get sued for sexual harassment, defamation, invasion of privacy, non-consensual pornography and more.”
To cite some, these ppl could get sued for: S*xual harassment, def*mation, invasion of privacy, non-consensual p*rn*graphy, image rights violation, reputational harm, exploitation, consent violation, digital impersonation, h*rassment, emotional distress, ab*se of likeness etc.. https://t.co/rtZ6KVvri5
Fe.D.G (@DavyWer) May 6, 2026
This whole mess feeds into a bigger conversation about AI in fandoms. The tech is everywhere now, and fans say it’s out of control. Making fake sexual images of real people, especially idols, without their consent crosses a line; it doesn’t matter if someone claims it’s “just for fun,” it’s still not okay.

The videos are still up, and more keep coming. But ARMY isn’t letting this slide, keeping up the pressure on TikTok and HYBE to do something. Bottom line: AI can make almost anything, but that doesn’t mean it should. ARMY’s message is clear: fake intimate pics of idols are off-limits.