Twitter Has Shut Down Over 360,000 Accounts Related To Violent Extremism

August 20, 2016 • Twitter

Twitter has a problem with trolls… but that’s not all. In the past few years we have become aware of the power of Twitter – and social media in general – in allowing violent extremist groups to spread their message. These groups – Daesh (ISIS) in particular – have used Twitter heavily for communication and propaganda purposes. So, it’s only natural that Twitter would fight back. And fight back it has.

In February 2016, the company wrote that it had suspended over 125,000 accounts in just over a year. Suspensions have only accelerated since then: more than 235,000 additional accounts have been suspended since that announcement, bringing the total to over 360,000 accounts related to violent extremism. This increase is not only due to the rise in accounts that violate Twitter’s terms. It’s also due to a “wave of deadly, abhorrent terror attacks across the globe”. Naturally, Twitter “condemns these acts and remains committed to eliminating the promotion of violence or terrorism” on its platform, so it is intensifying its actions against accounts that breach those terms.


In a recent blog post, Twitter explains just how much it has intensified its efforts, and the results those efforts have produced.

Daily suspensions are up over 80 percent since last year, with spikes in suspensions immediately following terrorist attacks. Our response time for suspending reported accounts, the amount of time these accounts are on Twitter, and the number of followers they accumulate have all decreased dramatically.

The company has also

made progress in disrupting the ability of those suspended to immediately return to the platform. We have expanded the teams that review reports around the clock, along with their tools and language capabilities.

Twitter also collaborates with other social platforms to identify and stamp out terrorist content.

No Magic Algorithm To Identify Offending Content

Of course, what Twitter and the other platforms are doing is a very difficult task. There is a consensus that there is no “magic algorithm” for identifying offending content. There is also no real precedent for this problem, so platforms have to combine a range of technologies and techniques, from spam filters to user reports. But the recent rise in shutdowns shows that something must be working. As Twitter explains,

over the past six months these tools have helped us to automatically identify more than one third of the accounts we ultimately suspended for promoting terrorism.
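Twitter does not describe how these tools actually work, but to make the general idea of combining spam-filter-style automated signals with user reports a little more concrete, here is a purely illustrative sketch. Every name, keyword, weight, and threshold below is invented for the example; it is not a description of Twitter’s system.

```python
# Hypothetical illustration only: Twitter has not published how its tools work.
# This sketch shows one way automated content signals and user reports could be
# combined to prioritise accounts for human review.
from dataclasses import dataclass

# Invented keyword list; a real system would rely on far richer signals.
FLAGGED_TERMS = {"example flagged phrase", "another flagged phrase"}


@dataclass
class Account:
    handle: str
    recent_posts: list[str]
    user_reports: int = 0  # how many times other users have reported it


def heuristic_score(account: Account) -> float:
    """Combine an automated signal and a human signal into a review priority."""
    # Automated signal: fraction of recent posts containing a flagged term.
    flagged = sum(
        any(term in post.lower() for term in FLAGGED_TERMS)
        for post in account.recent_posts
    )
    content_signal = flagged / max(len(account.recent_posts), 1)

    # Human signal: user reports, capped so a handful of reports saturates it.
    report_signal = min(account.user_reports / 10, 1.0)

    # Arbitrary weighting for illustration; a production system would tune this.
    return 0.6 * content_signal + 0.4 * report_signal


def review_queue(accounts: list[Account], threshold: float = 0.3) -> list[Account]:
    """Return accounts above the threshold, highest priority first."""
    candidates = [a for a in accounts if heuristic_score(a) >= threshold]
    return sorted(candidates, key=heuristic_score, reverse=True)


if __name__ == "__main__":
    accounts = [
        Account("@example_one", ["an ordinary post", "another ordinary post"]),
        Account("@example_two", ["example flagged phrase here"], user_reports=7),
    ]
    for account in review_queue(accounts):
        print(account.handle, round(heuristic_score(account), 2))
```

The point of the sketch is simply that no single signal decides anything: automated filtering narrows the field, user reports add a human signal, and the output is a prioritised queue for the around-the-clock review teams Twitter mentions above.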

Of course, that’s not all. Twitter also partners with organisations that counter violent extremism, such as Parle-moi d’Islam (France), Imams Online (UK), Wahid Foundation (Indonesia), The Sawab Center (UAE), and True Islam (US).


Image credit: Vocativ

