- ISIS takes advantage of the depth and breadth of social media
- Britain’s Home Office is fighting back with the help of ASI Data Science
- But sceptics are worried about how ISIS will adapt
- ASI has an answer – but it’s a secret!
Social media and video sharing platforms are transforming how we live, but these powerful tools are being co-opted as weapons of mass recruitment by terrorist organisations like the Islamic State of Iraq and the Levant, also known as ISIS or Daesh. To incite hatred and fear, and attract recruits, these bloody-minded killers use platforms like Twitter, Facebook, and YouTube to disseminate their twisted ideology, and experts have been warning for years that the digital landscape has been occupied by their propagandists.
ISIS takes advantage of the depth and breadth of social media
Unfortunately, little could be done. In part, this is because the range of social media sites is simply enormous. As James Temperton writes for Wired, for instance, “Isis reportedly used 400 different platforms to host propaganda it uploaded in 2017, with 145 new ones used from July until the end of the year alone”. But the apparent impotence of regulators and watchdogs is also the result of the sheer volume of content: on YouTube, for instance, 300 hours’ worth of video is uploaded every minute!
These vast numbers mean that human gatekeepers just can’t keep up, and that’s given the bad guys a lot of space to recruit. But now, advances in AI and machine learning are turning the tide, and a new tool unveiled by the UK government promises to curtail ISIS’s digital freedom.
Britain’s Home Office is fighting back with the help of ASI Data Science
Britain’s Home Office has partnered with ASI Data Science to develop a smart social media watchdog that can detect ISIS propaganda videos with a startling 99.995% accuracy. In practice, this means that it can pass the few videos about which it’s unsure to its human helpers: for every million videos, the Home Office reports, that would leave just 50 needing human review. It’s possible, then, for a relatively small team of trained experts to coordinate with the AI, ensuring that nothing slips through.
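To make the arithmetic concrete: ASI’s actual system is secret, so the sketch below is purely illustrative. It shows how a confidence-thresholded classifier might route only its uncertain cases to human reviewers, and how the Home Office’s headline figures fit together. The threshold values and the triage function are invented for illustration.

```python
# Illustrative sketch only: ASI's real method is not public. A classifier
# that scores each video can auto-clear confident negatives, auto-flag
# confident positives, and send the uncertain middle to human reviewers.

def triage(confidence_scores, lower=0.005, upper=0.995):
    """Split videos into auto-cleared, auto-flagged, and human-review
    buckets based on a (hypothetical) propaganda-probability score."""
    auto_clear, auto_flag, review = [], [], []
    for i, p in enumerate(confidence_scores):
        if p <= lower:
            auto_clear.append(i)      # confidently benign
        elif p >= upper:
            auto_flag.append(i)       # confidently propaganda
        else:
            review.append(i)          # too uncertain: send to a human
    return auto_clear, auto_flag, review

# The reported arithmetic: if the system is unsure about 0.005% of
# uploads, a million videos leave only 50 for human review.
videos_per_million = 1_000_000
unsure_rate = 1 - 0.99995
print(round(videos_per_million * unsure_rate))  # 50
```

The key design point is that the machine doesn’t need to decide every case; it only needs to shrink the undecided pile to something a small expert team can handle.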
That’s simply amazing performance, and it’s made possible by some very smart algorithms that have been trained on more than 1,000 propaganda videos. The details are secret, of course, but the AI driving this system can tell the difference, say, between an Al Jazeera video report and a hate-filled video screed.
But sceptics are worried about how ISIS will adapt
Not everyone’s convinced, however, and sceptics like Charlie Winter, a senior research fellow at the International Centre for the Study of Radicalisation, warn that ISIS could adapt and overcome this new barrier. If the group simply changed its style, he worries, the algorithm might fail to flag content it should. That’s because, as Temperton explains, the ‘branding’ of these propaganda videos is essentially uniform, and from the background music to the images, there’s very little variety. There might be good reason to worry, then, about what would happen if this were to change.
ASI has an answer – but it’s a secret!
But John Gibson, the head of data science consulting at ASI, wants to calm these suspicions. As he tells Temperton, “We’ve been very thoughtful about trying to identify characteristics of the propaganda that are very difficult for Isis to change … It’s something we’ve thought about a great deal and clearly for this thing to work well it needs to be adaptive and it needs to be able to keep up to date as the threat evolves.” While ASI can’t talk openly about how exactly the algorithms do their thing, they do say that multiple models work collaboratively to identify which videos are suspect. Better yet, the Home Office is offering this service to any ‘responsible’ parties who want it, meaning that the space in which ISIS can recruit will shrink every day.
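ASI says only that “multiple models work collaboratively”; everything in the sketch below — the sub-model names, the features they key on, and the flagging threshold — is invented to illustrate what an ensemble of that kind could look like in principle.

```python
# Hedged sketch of an ensemble: several models each score a video on a
# different characteristic, and the combined (averaged) score decides
# whether the video is flagged as suspect. All names are hypothetical.

def ensemble_score(video_features, models):
    """Average the propaganda probability from several sub-models."""
    scores = [model(video_features) for model in models]
    return sum(scores) / len(scores)

# Hypothetical sub-models keying on different video characteristics
# (the real system's features are not public).
audio_model = lambda v: v.get("music_score", 0.0)
visual_model = lambda v: v.get("logo_match_score", 0.0)
text_model = lambda v: v.get("caption_score", 0.0)

video = {"music_score": 0.9, "logo_match_score": 0.8,
         "caption_score": 0.7}
combined = ensemble_score(video, [audio_model, visual_model, text_model])
print(combined >= 0.75)  # flag if the ensemble crosses an (illustrative) threshold
```

One reason to combine models this way is robustness to the very adaptation Winter worries about: a propagandist who changes one signal, say the background music, still has to defeat the other models at the same time.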
The Home Office’s goal is to eradicate this social media threat the world over, and as AI and machine learning improve, tools like this offer an effective weapon in the fight against terrorism.