Here’s How Facebook Decides What You Can And Can’t See

One of Facebook’s overwhelmed content moderators – who reportedly sometimes have just 10 seconds to decide whether a piece of content is appropriate for the site’s immense user base – appears to have leaked to the Guardian a slideshow outlining the company’s complex rules governing what Facebook’s 2 billion users can and cannot see.

The Guardian published the presentation as a series of slideshows divided by topic: sadism, violence, child abuse…

One of the slides outlining how the company handles depictions of graphic violence appeared with the following editor’s note:

“Some use language we would not usually publish, but to understand Facebook’s content policies, we decided to include it. See for yourself how Facebook polices what users post.”

It’s important to remember that Facebook’s moderators remove content “on report only,” meaning that millions of Facebook users could see a graphic image or video – such as a beheading – before it’s removed.

As one report notes, the guidelines “may also alarm free speech advocates concerned about Facebook’s de facto role as the world’s largest censor. Both sides are likely to demand greater transparency.”

Facebook employs about 4,500 “content moderators” but recently announced plans to hire another 3,000, the Guardian reported.

Here are some notable excerpts highlighted by the Guardian:

  • Remarks such as “Someone shoot Trump” should be deleted, because as a head of state he is in a protected category. But it can be permissible to say: “To snap a bitch’s neck, make sure to apply all your pressure to the middle of her throat”, or “fuck off and die” because they are not regarded as credible threats.
  • Videos of violent deaths, while marked as disturbing, do not always have to be deleted because they can help create awareness of issues such as mental illness.

