
Authoritarian Regimes Could Exploit Cries of ‘Deepfake’


A viral video shows a young woman leading an exercise class on a roundabout in the Burmese capital, Naypyidaw. Behind her, a military convoy approaches a checkpoint on its way to conduct arrests at the Parliament building. Has she inadvertently filmed a coup? She dances on.

The video later became a viral meme, but for the first few days, online amateur sleuths debated whether it had been green-screened or otherwise manipulated, often using the jargon of verification and image forensics.

For many online viewers, the video captures the absurdity of 2021. Yet claims of audiovisual manipulation are increasingly being used to make people doubt whether what is real is fake.

At Witness, in addition to our ongoing work to help people film the reality of human rights violations, we’ve led a global effort to better prepare for increasingly sophisticated audiovisual manipulation, including so-called deepfakes. These technologies provide tools to make someone appear to say or do something they never did, to create an event or person who never existed, or to more seamlessly edit within a video.

The hype, however, outpaces the reality. The political and electoral threat of actual deepfakes makes for good headlines, but the picture is more nuanced. The real reasons for concern became clear through expert meetings that Witness led in Brazil, South Africa, and Malaysia, as well as in the US and Europe, with people who had lived through attacks on their reputation and their evidence, and with professionals such as journalists and fact-checkers charged with fighting lies. They highlighted current harms from manipulated nonconsensual sexual images targeting ordinary women, journalists, and politicians. This is a real, existing, widespread problem, and recent reporting has confirmed its growing scale.

Their testimony also pinpointed how claims of deepfakery and video manipulation were being increasingly used for what law professors Danielle Citron and Bobby Chesney call the “liar’s dividend,” the ability of the powerful to claim plausible deniability on incriminating footage. Statements like “It’s a deepfake” or “It’s been manipulated” have often been used to disparage a leaked video of a compromising situation or to attack one of the few sources of civilian power in authoritarian regimes: the credibility of smartphone footage of state violence. This builds on histories of state-sponsored deception. In Myanmar, the army and authorities have repeatedly both shared fake images themselves and challenged the veracity and integrity of real evidence of human rights violations.

In our discussions, journalists and human rights defenders, including those from Myanmar, described fearing the weight of having to relentlessly prove what’s real and what is fake. They worried their work would become not just debunking rumors, but having to prove that something is authentic. Skeptical audiences and public factions second-guess the evidence to reinforce and protect their worldview, and to justify actions and partisan reasoning. In the US, for example, conspiracists and right-wing supporters dismissed former president Donald Trump’s awkward concession speech after the attack on the Capitol by claiming “it’s a deepfake.”

There are no easy solutions. We must support stronger audiovisual forensic and verification skills among community and professional leaders globally who can help their audiences and community members. We can promote widely accessible platform tools that make it easier to see and challenge the perennial "shallowfake" videos that simply miscaption footage or apply a basic edit, as well as more sophisticated deepfakes. Responsible "authenticity infrastructure" that makes it easier to track if and how an image has been manipulated, and by whom, for those who want to "show their work," can help if it is developed from the start with a consciousness of how it could also be abused.

We must also candidly acknowledge that promoting verification tools and skills can perpetuate a conspiratorial "disbelief by default" approach to media, which is itself at the heart of the problem with so many videos that do show reality. Any approach to providing better skills and infrastructure must recognize that conspiratorial reasoning is a short step from constructive doubt. Media-literacy approaches and media forensic tools that send people down the rabbit hole rather than promoting common-sense judgment can be part of the problem. We don't all need to be instant open source investigators. First we should apply simple frameworks like the SIFT methodology: Stop, Investigate the source, Find trusted coverage, and Trace the original context.

Political opportunism also thrives on panic. Deepfake fears will be used to justify authoritarian “fake news” laws globally or the co-opting of approaches like authenticity infrastructure to make them reinforce power and repress our voices, rather than challenge misinformation and disinformation.

While "seeing is believing" no longer holds the weight it once did, it shouldn't be the default to assume "seeing is not believing." Not everything is deepfaked. What you are seeing may be true. And one reason to hold onto that is that the reality of the rights violations now occurring in Myanmar needs to be recognized and taken seriously.

In this case, sadly, the exercise instructor was dancing to an anti-authoritarian anthem while the military took over in real time in the background. Thank goodness, because of this video, more people know that this coup is happening in Myanmar. Don’t look away. It’s getting worse right now, and the real videos matter.


WIRED Opinion publishes articles by outside contributors representing a wide range of viewpoints. Read more opinions here, and see our submission guidelines here. Submit an op-ed at [email protected].

