Instagram has expanded its ban on graphic self-harm and suicide-related content to include fictional depictions such as drawings and memes.
Back in February, Instagram strengthened its ban on content related to suicide and self-harm, prohibiting graphic images of it on the platform. Since then, the company says it has been able to “act on twice as much content as before,” removing more than 834,000 pieces of content, and to find more than 77% of it before it was reported by users.
Announcing the latest change, Instagram head Adam Mosseri wrote: “This past month, we further expanded our policies to prohibit more types of self-harm and suicide content. We will no longer allow fictional depictions of self-harm or suicide on Instagram, such as drawings or memes or content from films or comics that use graphic imagery. We will also remove other imagery that may not show self-harm or suicide, but does include associated materials or methods.”
The move comes after a meeting between Mosseri and the UK health secretary to discuss Instagram’s policy on self-harm content, and public outcry over the death of 14-year-old schoolgirl Molly Russell, who killed herself after viewing suicide-related content on Instagram.
Mosseri also explained that the expanded policies are “based on expert advice from academics and mental health organisations like the Samaritans in the UK and National Suicide Prevention Line in the US,” and aim to strike “the difficult balance between allowing people to share their mental health experiences while also protecting others from being exposed to potentially harmful content.”
Following the expansion, accounts that share such content will no longer be recommended in search or in Instagram’s discovery surfaces, such as Explore.