Facebook Releases Rule Book For Users

Moroti Olatujoye
April 25, 2018 1:10 pm

Facebook Inc on Tuesday released the most detailed rule book in its history for the types of posts it allows on its social network on subjects ranging from drug use and sex work to bullying, hate speech and inciting violence.

Facebook for years has had “community standards” for what people can post.

But only a relatively brief and general version was publicly available, while it had a far more detailed internal document to decide when individual posts or accounts should be removed.

Now, the company is providing the longer document on its website to clear up confusion.

The company wants to be more open about its operations, said Monika Bickert, Facebook’s vice president of product policy and counter-terrorism.

“You should, when you come to Facebook, understand where we draw these lines and what’s OK and what’s not OK,” Bickert told reporters in a briefing at Facebook’s headquarters.

Facebook has faced fierce criticism from governments and rights groups in many countries for failing to do enough to stem hate speech.

It is also said to have failed to prevent the service from being used to promote terrorism, stir sectarian violence and broadcast acts including murder and suicide.

At the same time, the company has been accused of doing the bidding of repressive regimes by aggressively removing content that angers governments.

It is said to have provided too little information on why certain posts and accounts are removed.

New policies will, for the first time, allow people to appeal a decision to take down an individual piece of content.

Previously, only the removal of accounts, Groups and Pages could be appealed.

Facebook is also beginning to provide the specific reason why content is being taken down for a wider variety of situations.

Facebook, the world’s largest social network, has become a dominant source of information in many countries around the world.

It uses both automated software and an army of moderators that now numbers 7,500 to take down text, pictures and videos that violate its rules.

Under pressure from several governments, it has been beefing up its moderator ranks since last year.
