In recent years, terrorist groups have found a home on smaller, less well-known online platforms, where they store, share, and link to content such as violent beheading videos and recruitment propaganda.
Those platforms have struggled to respond, lacking both the resources and the expertise to do so, but a new tool built by a Google subsidiary in collaboration with a terror-tracking NGO aims to change that.
Launched in Paris on Friday, Altitude is a free tool built by Jigsaw—a unit within Google that tracks violent extremism, misinformation, and repressive censorship—and Tech Against Terrorism, a group that seeks to disrupt terrorists’ online activity. The tool aims to give smaller platforms the ability to easily and efficiently detect terrorist content on their networks and remove it.
The project is also working with the Global Internet Forum to Counter Terrorism, an industry-led group founded in 2017 by Facebook, Microsoft, Twitter, and YouTube that hosts a shared database of image hashes—a kind of digital fingerprint—of terrorist content.
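The lookup itself is conceptually simple: compute a fingerprint of an uploaded file and check it against the shared database. The sketch below illustrates that flow only; the hash value and helper names are placeholders, and it uses SHA-256 (which matches byte-identical files) rather than the perceptual hashes, such as PDQ, that real hash-sharing systems rely on to catch visually similar copies.

```python
import hashlib

# Hypothetical stand-in for a shared hash database such as GIFCT's.
# The entry below is a placeholder, not a real hash from any database.
KNOWN_TERRORIST_CONTENT_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def fingerprint(file_bytes: bytes) -> str:
    """Compute a digital fingerprint of an uploaded file."""
    return hashlib.sha256(file_bytes).hexdigest()

def is_known_terrorist_content(file_bytes: bytes) -> bool:
    """Check an upload's fingerprint against the shared hash database."""
    return fingerprint(file_bytes) in KNOWN_TERRORIST_CONTENT_HASHES

if __name__ == "__main__":
    upload = b"example uploaded file contents"
    if is_known_terrorist_content(upload):
        print("Flagged: matches a known hash; route to human review.")
    else:
        print("No match in the shared hash database.")
```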
After years of missteps, big tech platforms like Facebook, Google, and X (formerly Twitter) have—with the help of dedicated NGOs and law enforcement—largely removed terrorist content from their networks; Telegram is the notable exception. As a result, terrorists have moved to less regulated and under-resourced platforms, where their presence either goes unnoticed or goes unaddressed because the companies involved simply don’t have the resources to cope with a flood of takedown requests.
“Islamic State and other terrorist groups didn’t give up on the internet just because they no longer had the megaphone of their social media platforms. They went elsewhere,” Yasmin Green, the CEO of Jigsaw, tells WIRED. “They found this opportunity to host content on file-hosting sites or other websites, small and medium platforms. Those platforms were not welcoming terrorist content, but they still were hosting it—and actually, quite a lot of it.”
While there are tools on the market that work in a similar way to Altitude, they are prohibitively expensive for many smaller companies. Experts like Green believe such tools need to be open source and free of charge.
The new tool can be integrated straight into the backend of whatever platform it is working with. It then connects to Tech Against Terrorism’s own Terrorist Content Analytics Platform, which centralizes the collection of content created by officially designated terrorist organizations. The database lets every platform using Altitude easily check whether a piece of content has been verified as terrorist content.
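In practice, that check from a platform's backend would look something like the sketch below. The endpoint URL, authentication scheme, and response fields are assumptions for illustration; the article does not document the Terrorist Content Analytics Platform's actual API.

```python
import json
import urllib.request

# Hypothetical lookup endpoint -- not TCAP's real API.
TCAP_LOOKUP_URL = "https://tcap.example.org/api/v1/lookup"

def check_against_tcap(content_hash: str, api_key: str) -> dict:
    """Ask the centralized database whether a hash matches verified terrorist content."""
    payload = json.dumps({"hash": content_hash}).encode("utf-8")
    request = urllib.request.Request(
        TCAP_LOOKUP_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

# Acting on the (assumed) response fields, e.g.:
# result = check_against_tcap("3a7bd3e2...", api_key="...")
# if result.get("verified"):
#     print(result.get("group"), result.get("source_platforms"))
```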
Altitude will also provide context about the terrorist groups the content is associated with, other examples of this type of material, information on what other platforms have done with the material, and, eventually, information about the relevant laws in a particular country or region.
“We are not here to tell platforms what to do but rather to furnish them with all the information that they need to make the moderation decision,” Adam Hadley, executive director of Tech Against Terrorism, tells WIRED. “We want to improve the quality of response. This isn’t about the volume of material removed but ensuring that the very worst material is removed in a way that is supporting the rule of law.”
Tech Against Terrorism works with more than 100 platforms, almost all of which don’t want to be named because of the damage that being linked to terrorist content would do to their business. They include pastebins, messaging apps, video-sharing platforms, social media networks, and forums.
For many of these smaller platforms, dealing with takedown requests from governments, civil society organizations, law enforcement, and their own users can be overwhelming, pushing companies to one extreme or the other.
“Platforms can become easily overwhelmed by the takedown requests, and they either ignore them all or they take everything down,” Hadley says. “What we’re looking for is to try to create an environment where platforms have the tools to be able to properly assess whether they should remove material or not, because it’s imperative to take down terror content, but it’s also really important that they’re not just removing any content because of concerns about freedom of expression.”
The Israel-Hamas war has shown what an important role Telegram continues to play in allowing terrorist groups to spread their messages. Efforts to hold Telegram to account have had only limited success in recent weeks; terror content remains accessible there, and from Telegram it is quickly shared across a multitude of other platforms. That, according to Hadley, is where Altitude can make a difference.
“Ideally, the content wouldn’t be put up on Telegram in the first place,” Hadley says. “But given that it is, the next best thing we can do is make sure that other platforms that are being co-opted into this by terrorists are aware of this activity and have the right information to take down the material in an appropriate fashion.”