‘Nudified’ photos victimized Minnesota women who now hope for change
A bipartisan Minnesota bill would make it illegal for apps and websites to allow anyone to "nudify" photos and videos. Fines of at least $500,000 for each violation are on the table.
ST. PAUL, Minn. (FOX 9) - Dozens of Minnesota women have never posed nude, but their naked bodies are shared over the Internet.
Shadow pornography
Faked photos:
Artificial intelligence (AI) can create fake nude photos and video, and Minnesota lawmakers are trying to stop it here.
Legislators say AI apps and websites exist to do exactly that, and at least 80 women, mostly from Minnesota, were victimized by a single user.
Under the proposal, those apps and websites would simply have to keep the function turned off.
Don't believe your eyes
Deepfake technology:
President Trump and Kamala Harris were never arrested, despite what a video spread around the web appears to show. Artificial intelligence faked it.
Molly Kelley didn’t pose nude for photos or videos. Artificial intelligence faked that, too.
"It's not something that happens in faraway, dark, creepy corners of the internet," said the Otsego mother and law student. "It's mainstream, it's accessible, and the ability to do so is in your pocket and on your phone."
The technology is fairly new, but the best of it can generate realistic nudes from otherwise innocent photos, like the ones Kelley posted to Facebook.
"Anyone who uses the internet is susceptible to being a target," she said.
House of Humiliation
How it happened:
Kelley found out a man her family thought they knew very well had used a simple AI website to create nude photos and videos of her.
"I was humiliated and very worried that my employer, my colleagues, clients and neighbors may have seen these images," Kelley said.
She uncovered evidence of at least 80 other women in the man’s files, including Megan Hurley.
"I have never taken nude pictures or exchanged nudes with anybody," Hurley said. "But because of this easily accessible website, there are convincing graphic images and pornographic videos on the Internet of me forever."
Isn't that illegal already?
Sharing is uncaring:
Current law makes it illegal to share deepfake nudes, but the women say police told them they’d have to prove someone shared them and did it intentionally.
A new, bipartisan Senate bill would make it illegal for a website or app to let anyone even access that "nudification" technology.
"My bill simply requires that apps, platforms, and websites have the nudification functions turned off," said Sen. Erin Maye Quade (DFL-Apple Valley), who authored the bill alongside Democrats and Republicans.
"I do not understand why this technology exists and I find it abhorrent," Hurley said.
As the bill stands, companies would face fines of at least $500,000 for each violation.