X, the company formerly known as Twitter, has fired its head of threat intelligence, Aaron Rodericks, and four other members of the team responsible for combating disinformation and misinformation, just months before the US Republican primaries mark the beginning of the 2024 American election cycle—and a year in which more than 50 countries around the world go to the polls.
Just weeks ago, the company said in a blog post that it was expanding its “safety and elections” teams. Around the same time, Rodericks announced on LinkedIn that his team was adding eight new roles in preparation for the 2024 elections, including a team lead and an elections analyst. CEO Linda Yaccarino recently echoed these plans in an interview with the Financial Times. But just a month later, in response to The Information breaking the news of Rodericks’ firing, Musk tweeted, “Oh you mean the ‘Election Integrity’ Team that was undermining election integrity? Yeah, they’re gone.”
Rodericks’ team focused on identifying and shutting down malicious actors on the platform, including those targeting elections, according to a former Twitter employee familiar with the company’s civic integrity work, who spoke to WIRED on the condition of anonymity. “Twitter’s specialist elections team was fired in November 2022, and Rodericks ended up taking on a lot of that work,” the former employee says. “He was the last man standing.”
Letting Rodericks and his teammates go will “only embolden malicious actors, and make it easier for them to operate on the platform,” the former employee says.
The year 2024 will see elections in more than 50 countries around the world, including the US, Mexico, India, Indonesia, and several European Union countries.
“We’ve never seen such a huge tidal wave of elections in the age of social media,” says Alexandra Pardal, campaigns director at Digital Action, a nonprofit digital rights organization. X is an important platform in many countries, used by politicians, dissidents, human rights defenders, and influence operations, Pardal says. “So to be cutting staff dedicated to protecting elections at a time when we’re going to enter the biggest cycle of elections globally in our lifetimes, it’s extremely alarming.”
According to reporting from The Irish Times, Rodericks was facing disciplinary action after he allegedly liked tweets that were critical of X and of Musk. In a court filing seeking an injunction to halt the disciplinary process, Rodericks said he posted the new election-related roles on his X account and then received a deluge of abuse from users, with one even saying Rodericks was hiring a “censorship squad.” Though Rodericks said X did nothing to respond to the online abuse, it did begin a disciplinary process alleging that he had “demonstrated hostility” against the company via his likes. (Last year, Musk fired an engineer who criticized him.)
X did not immediately respond to a request for comment about why Rodericks and his team members were let go, or what impact this would have on the company’s ability to respond to election threats in 2024. Rodericks declined to comment.
The firings also come on the heels of the company rolling back a feature that allowed users in the US, Australia, South Korea, the Philippines, Brazil, and Spain to report tweets containing hate speech or misinformation to the platform. In an open letter released earlier this week, the nonprofit group Reset Australia said that the change had left “Australian users unable to report electoral misinformation,” mere weeks before the country votes in a highly polarized and racially charged referendum on whether to change its constitution to enshrine an Indigenous Voice to Parliament.
In May, the company withdrew from its voluntary commitment to the EU to fight disinformation on its platform. The EU’s Digital Services Act (DSA) requires social media companies to curb hate speech, disinformation, and illegal content on their platforms. On Tuesday, reacting to a new report on disinformation on social media platforms in Europe, European Commission vice president Vera Jourova said that X was “the platform with the largest ratio of mis- or disinformation posts.”
The teams responsible for trust and safety—which includes content moderation, elections, and mis- and disinformation—have been gutted since Musk’s takeover, according to former Twitter employees. Vijaya Gadde, Twitter’s former legal head who oversaw its trust and safety work, was one of the first executives fired after Musk took the helm in October 2022. She faced a campaign of harassment by Twitter users during her final months at the company, following a hostile tweet by Musk. Most of the company’s trust and safety team, including the employees working on the US 2022 midterm election, were let go in November 2022. Two other senior executives responsible for content moderation, Yoel Roth and Ella Irwin, have also since left the company.
Earlier this year, researchers found that hate speech on Twitter had increased significantly after Musk’s takeover, while an April report from the Atlantic Council found that propaganda accounts from Russia, China, and Iran saw a surge in followers after Twitter changed its policies on state-backed media. The company has also come after critics who have tracked harmful content on the platform, suing or threatening to sue prominent research groups.
“With authoritarianism on the rise, and dozens of countries set to hold high-stakes elections next year, democracy is facing an existential threat,” says Jesse Lehrich, cofounder of the nonprofit Accountable Tech. “And Elon Musk continues to tip the scales in the wrong direction, gutting integrity teams and steamrolling basic safeguards on a platform that sits at the center of the global information ecosystem in ways that have already proven a boon to autocrats and propagandists.”