Janhvi Kapoor sat on Raj Shamani’s podcast and casually unleashed one of the most disturbing celebrity confessions in recent memory. She described discovering her own face plastered on a porn website—except it wasn’t her body. It was a deepfake. Someone had taken her pictures and stitched them onto explicit content without her knowledge or consent.
What made it even more sickening? She traced it back to her school days, when boys in her IT class would casually browse those sites for “fun,” and suddenly her images started popping up. These weren’t random creeps in some dark corner of the internet. These were classmates—teenage boys—who thought it was hilarious to weaponize her face for their entertainment.
This isn’t just one actress’s creepy anecdote. This is the new normal. Deepfake porn has become the ultimate form of digital violation—silent, invisible, and devastating. No physical contact needed. Just a few clicks, some basic software, and any woman’s dignity gets shredded for sport. Celebrities like Janhvi aren’t safe. Regular girls aren’t safe. Your sister, your daughter, your colleague—none of them are safe.
And the worst part? Society still treats this like a minor annoyance instead of the vile, consent-destroying crime it actually is. While everyone obsesses over the next big scandal, thousands of women are quietly dealing with their faces being hijacked for the most degrading content imaginable.
Janhvi called it her “weirdest experience.” Most of us would call it a waking nightmare. The technology exists. The creeps are already using it. And unless we start treating deepfake porn with the same outrage we give every other form of assault, this epidemic is only going to get uglier.
The mask is off. The threat is real. And it’s coming for all of us.