Periscope is awesome at helping users broadcast live video content across the globe. But live content – like any social media content, in fact – attracts trolls, who seize the opportunity to spam and abuse. Until now, there was little anyone could do about this type of behaviour on Periscope.
But today, the company announced a new feature that “empowers the community to report and moderate comments as they appear on the screen.”
Periscope wanted a system that was completely transparent and community-led, letting viewers decide whether a comment is OK or not. And of course, the moderation needed to happen live.
The new feature was announced in a blog post that also explains how the moderation will work.
During a live broadcast, anyone can report a comment as spam or abuse. The viewer who reports it will instantly stop seeing any messages from that commenter for the remainder of the broadcast. On a second level, Periscope will select a few random viewers and ask them to vote on whether they believe the comment is spam, abuse, or looks OK to them. Finally, based on the votes, the system can temporarily block the commenter from chatting in that broadcast and notify them. Repeat offenders can be blocked for the whole broadcast.
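To make the flow concrete, here is a minimal sketch of that three-step process in Python. Everything here is hypothetical – the class name, `JURY_SIZE`, the strike limit, and the voting threshold are illustrative assumptions, not Periscope's actual implementation or API:

```python
import random
from collections import defaultdict

JURY_SIZE = 5     # assumed number of random viewers asked to vote
STRIKE_LIMIT = 2  # assumed temp blocks before a broadcast-wide block

class ModerationQueue:
    """Hypothetical model of Periscope's community moderation flow."""

    def __init__(self, viewers):
        self.viewers = viewers
        self.strikes = defaultdict(int)  # commenter -> temp-block count
        self.blocked = set()             # blocked for the whole broadcast
        self.muted = defaultdict(set)    # reporter -> commenters they hid

    def report(self, reporter, commenter, comment, rng=random):
        # Step 1: the reporter instantly stops seeing the commenter.
        self.muted[reporter].add(commenter)
        # Step 2: a few random viewers are asked to vote on the comment.
        jury = rng.sample(
            [v for v in self.viewers if v != commenter], JURY_SIZE
        )
        votes = [self.ask(v, comment) for v in jury]
        # Step 3: a majority of spam/abuse votes triggers a temporary
        # block; repeat offenders lose chat for the whole broadcast.
        if sum(v in ("spam", "abuse") for v in votes) > len(votes) // 2:
            self.strikes[commenter] += 1
            if self.strikes[commenter] >= STRIKE_LIMIT:
                self.blocked.add(commenter)
            return "temporarily blocked"
        return "looks OK"

    def ask(self, viewer, comment):
        # Placeholder: in the real app each juror votes in the UI.
        return "spam" if "BUY NOW" in comment else "ok"
```

The interesting design choice is step 2: no single viewer's report is decisive, so a lone grudge-holder cannot silence anyone – only a random jury's majority can.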
This community-led moderation system seems like the perfect democratic approach. By letting viewers decide whether a person is being spammy or abusive, Periscope is acknowledging the importance of its community, rather than letting a single person – often a brand representative or community manager – decide what is allowed and what is not.
I say well done, Periscope!