
Celebrity Deepfake Porn Cases Will Be Investigated by Meta Oversight Board


As AI tools become increasingly sophisticated and accessible, so too has one of their worst applications: non-consensual deepfake pornography. While much of this content is hosted on dedicated sites, it is increasingly finding its way onto social platforms. Today, the Meta Oversight Board announced that it was taking on cases that could force the company to reckon with how it deals with deepfake porn.

The board, an independent body that can issue both binding decisions and recommendations to Meta, will focus on two deepfake porn cases, both involving celebrities whose images were altered to create explicit content. In one case, deepfake porn depicting an unnamed American celebrity was removed from Facebook after it had already been flagged elsewhere on the platform. The post was also added to Meta’s Media Matching Service Bank, an automated system that finds and removes images already flagged as violating Meta’s policies, to keep it off the platform.

In the other case, a deepfake image of an unnamed Indian celebrity remained up on Instagram, even after users reported it for violating Meta’s policies on pornography. The deepfake of the Indian celebrity was removed once the board took up the case, according to the announcement.

In both cases, the images were removed for violating Meta’s policies on bullying and harassment, and did not fall under Meta’s policies on porn. Meta, however, prohibits “content that depicts, threatens or promotes sexual violence, sexual assault or sexual exploitation” and does not allow porn or sexually explicit ads on its platforms. In a blog post released in tandem with the announcement of the cases, Meta said it removed the posts for violating the “derogatory sexualized photoshops or drawings” portion of its bullying and harassment policy, and that it also “determined that it violated [Meta’s] adult nudity and sexual activity policy.”

The board hopes to use these cases to examine Meta’s policies and systems to detect and remove nonconsensual deepfake pornography, according to Julie Owono, an Oversight Board member. “I can tentatively already say that the main problem is probably detection,” she says. “Detection is not as perfect or at least is not as efficient as we would wish.”

Meta has also long faced criticism for its approach to moderating content outside the US and Western Europe. In announcing these cases, the board voiced concerns that the American celebrity and the Indian celebrity received different treatment when deepfakes of them appeared on Meta’s platforms.

“We know that Meta is quicker and more effective at moderating content in some markets and languages than others. By taking one case from the United States and one from India, we want to see if Meta is protecting all women globally in a fair way,” says Oversight Board cochair Helle Thorning-Schmidt. “It’s critical that this matter is addressed, and the board looks forward to exploring whether Meta’s policies and enforcement practices are effective at addressing this problem.”

The board declined to name the Indian and American celebrities whose images spurred the complaints, but pornographic deepfakes of celebrities have become rampant. A recent Channel 4 investigation found deepfakes of more than 4,000 celebrities. In January, a nonconsensual deepfake of Taylor Swift went viral on Facebook, Instagram, and especially X, where one post garnered more than 45 million views. X resorted to blocking searches for the singer’s name, but posts continued to circulate. And while the platforms struggled to remove that content, it was reportedly Swift’s fans who took to reporting and blocking the accounts sharing the image. In March, NBC News reported that ads for a deepfake app that ran on Facebook and Instagram featured images of an undressed, underage Jenna Ortega. In India, deepfakes have targeted major Bollywood actresses, including Priyanka Chopra Jonas, Alia Bhatt, and Rashmika Mandanna.

Ever since deepfakes emerged half a decade ago, research has found that nonconsensual deepfake pornography overwhelmingly targets women, and the volume of such content has continued to explode. Last year, reporting from WIRED found that 244,625 videos had been uploaded to the top 35 deepfake porn hosting sites, more than in any previous year. And it doesn’t take much to make a deepfake. In 2019, VICE found that just 15 seconds of an Instagram story was enough to create a convincing deepfake, and the technology has only gotten more accessible. Last month, a school in Beverly Hills expelled five students who had made nonconsensual deepfakes of 16 of their classmates.

“Deepfake pornography is a growing cause of gender-based harassment online and is increasingly used to target, silence, and intimidate women on- and offline,” says Thorning-Schmidt. “Multiple studies show that deepfake pornography overwhelmingly targets women. This content can be extremely harmful for victims, and the tools used for creating it are becoming more sophisticated and accessible.”

In January, legislators introduced the Disrupt Explicit Forged Images and Non-Consensual Edits (DEFIANCE) Act, which would allow people whose images were used in deepfake porn to sue if they can show the content was made without their consent. Congresswoman Alexandria Ocasio-Cortez, who sponsored the bill, was herself the target of deepfake pornography earlier this year.

“Victims of nonconsensual pornographic deepfakes have waited too long for federal legislation to hold perpetrators accountable,” Ocasio-Cortez said in a statement at the time. “As deepfakes become easier to access and create—96 percent of deepfake videos circulating online are nonconsensual pornography—Congress needs to act to show victims that they won’t be left behind.”
