Facebook’s Secret List Of Banned Content Has Finally Been Published

Dr. Alfredo Carpineti

Senior Staff Writer & Space Correspondent

Alfredo (he/him) has a PhD in Astrophysics on galaxy evolution and a Master's in Quantum Fields and Fundamental Forces.

Image credit: Rawpixel.com/Shutterstock

On Tuesday, Facebook released its Community Standards rulebook, which it uses to moderate what is posted by the more than 2 billion people who use the social network every month. The guidelines hint at the complexity of the moderator's role. There are vague pronouncements that leave room for moderators to use their judgment, as well as scenarios so detailed and specific that you wonder what incident prompted them.

The community guidelines are divided into six main sections and 22 sub-sections covering the various reasons a post might be banned. Much of it is obvious and welcome, but in places it will be difficult to determine where to draw the line.

In the first section, on violence and criminal behavior, Facebook states that it works to block pages and individuals showcasing terrorist activity, organized hate, mass or serial murder, human trafficking, and organized violence or criminal activity, as well as people who praise individuals or groups involved in violence and/or criminal behavior. Among the many examples, Facebook doesn't allow the sale of firearms or non-medical drugs, statements threatening harm against individuals, or the poaching of endangered species.

The second section covers safety. Facebook goes into a lot of detail about its efforts to keep the platform a safe environment. It removes content that encourages suicide and self-harm, and it works to tackle the sexual exploitation of children and adults. The section also covers bullying and harassment, which are forbidden against private individuals but allowed against public figures as long as it is an attempt at humor or satire. The standards don't specify what makes a person a public figure or what counts as satire.

Public figures also feature in the following section, which is dedicated to objectionable content, in particular adult nudity and sexual activity. Facebook doesn’t allow visible genitalia, a visible anus, or visible buttocks unless photoshopped onto a public figure. Imagine being in the meeting where this was decided.

The policy also addresses female nipples, allowing them to be shown in the context of breastfeeding, giving birth and after-birth moments, health, or an act of protest. Sexual activity, unsurprisingly, is a big no-no, and the banned examples are described in detail in the guidelines.

The rest of the section focuses on graphic violence, cruel and insensitive material, and hate speech. Hate speech is a crucial issue for online communities, and Facebook approaches it with a three-tier system that focuses on slurs, threats, and general hate. However, when it comes to fighting hate speech, how the rules are enforced will matter more than the good intentions laid out in the guidelines.

The fourth and fifth sections focus on integrity and authenticity, discussing Facebook's commitment to limiting spam, increasing the number of real people on the platform, and challenging the growing volume of lies and misrepresentations spread as news. Facebook also discusses how it works to protect intellectual property. Your photos, for example, are your own and do not belong to Facebook.

“You own all of the content and information that you post on Facebook, and you control how it is shared through your privacy and application settings,” the guidelines read.

The final section is about content-related requests and how the company complies with them, whether that's a request to delete your own account, a request from a verified immediate family member to remove a deceased user's account, or a request from an authorized individual to remove the account of an incapacitated user. It also covers the additional protections in place for minors.

The details of the guidelines will certainly spur plenty of opinions about the good, the bad, and the things that need improvement – and Facebook says it wants to hear them. In May, the company is launching "Facebook Forums: Community Standards", a series of public events in Germany, France, the UK, India, Singapore, the US, and other countries, where it hopes to get people's feedback on these rules directly.
