Lawmakers pursue legislation that would make it illegal to share digitally altered images known as deepfake porn

May 23, 2024


Last year, there were more than 21,000 deepfake porn videos online – an increase of more than 460% from the previous year. But Congress may soon make it illegal to share doctored images.

Leading the charge are New Hampshire Sen. Maggie Hassan, a Democrat, and Texas Sen. John Cornyn, a Republican, who have co-authored bipartisan legislation aimed at cracking down on people who share non-consensual intimate deepfake images online. The legislation proposes criminal penalties of a fine and up to two years in prison, as well as civil penalties of up to $150,000.

“It’s outrageous,” Hassan said. “And we need to make sure our laws keep up with this new technology and that we protect individuals.”

Breeze Liu said she was shocked when a friend discovered her face superimposed over pornographic images.

“And I really felt like my whole world collapsed at that moment,” Liu said. “You have to see how many views there are and how many people have violated you. I just didn’t want to live anymore, because the shame was too much for me to bear.”

Liu, who said she knew who the perpetrator of the crime was, decided to take the case to the police.

“The police didn’t really do anything about it,” Liu said. “The police actually called me a prostitute. They shamed me.”

Liu said that when authorities failed to investigate the matter, the perpetrator created more deepfakes of her, which spread to more than 800 links online. Liu said the FBI is now investigating her case, and she is also part of a class action lawsuit against Pornhub.

Pornhub told CBS News that it quickly removes any non-consensual material from its platform, including deepfakes. The site also said it has protocols to prevent the upload of non-consensual material.

People have also created artificially generated intimate images of celebrities like Taylor Swift. In January, social media site X disabled searches related to the singer in an effort to remove and stop the circulation of deepfake pornographic images of the pop star.

Teenagers across the country are also facing an increasingly common problem. Some students are creating fake pornography of classmates and spreading it among friends and family, sometimes even extorting the victims. In New Jersey earlier this year, a teenager sued another student, accusing him of creating and sharing AI-generated pornographic images of them and others.

Hassan said Congress is working to criminalize the creation of nonconsensual intimate images.

“There’s work going on in Congress right now on how to set up these kinds of protections, but what we know is that most people don’t know that deepfakes of them exist until someone tries to distribute them, right? So we really want to attack this problem at the point where it becomes obvious and someone is likely to take action,” Hassan said.

Cornyn said that while it could take months for the bill to clear the Senate, he is confident it will pass with bipartisan support.

“We’re not going to take our foot off the gas,” Cornyn said. “We will continue to press this issue, because until the bill becomes law, people will keep taking advantage of the absence of this kind of punishment to exploit others with these deepfakes.”

In the meantime, Liu created a startup called Alecto AI to help others quickly identify and remove deepfakes they find online.

“I came to the conclusion that unless I changed the system, unless I changed the world, justice wouldn’t even be an option for me,” she said.


