Monika Bickert, Facebook's Vice President of Product Policy, explained that the company has decided to publish the internal document behind its "community standards", according to the online edition of Chronicle.info.
"Today we're going one step further and publishing the internal guidelines we use to enforce those standards".
Another big change in Facebook's user policy is that users can now approach the company if they feel that content has been removed unfairly. This will lead to a review by a (human) member of Facebook's team, typically within 24 hours.
Facebook will then extend the appeals process to other violation types, giving people the chance to provide more context. Facebook's decision to release its full content guidelines could be seen as an attempt to be more transparent about its operations at a time when it is under scrutiny from multiple governments and privacy groups. According to an official blog post, the guidelines have been published to help people understand where the company stands on nuanced issues.
The policy is an evolving document, and Bickert said updates go out to the content reviewers every week.
The content policies take a notably hard line on hate speech, defining it as "a direct attack on people based on what we call protected characteristics - race, ethnicity, national origin, religious affiliation, sexual orientation, sex, gender, gender identity and serious disability or disease". "We do not allow hate speech on Facebook because it creates an environment of intimidation and exclusion and in some cases may promote real-world violence," the standards state. Even CEO Mark Zuckerberg admitted, "we won't prevent all mistakes or abuse, but we now make too many errors enforcing our policies and preventing misuse of our tools".
Under the "Graphic Violence" section, for example, Facebook says users should not share images of violence against people or animals with comments or captions by the poster containing enjoyment of suffering or humiliation, an erotic response to suffering, remarks that speak positively of the violence or remarks indicating the poster is sharing footage for sensational viewing pleasure. A shorter version as previously made available to the public, but the latest release includes far more detail than previously reported.
"At the time they told us they could not do it, they would not do it, and actually stopped engaging at that point", Cyril said.
The company is also introducing an appeals process for cases where people think content has been removed incorrectly. The median time required for takedowns was less than one minute in the first quarter of the year, the company said.
The community standards, which are available on the company's website, detail the various words and images that the platform censors. "We believe giving people a voice in the process is another essential component of building a fair system," the company said.
To engage with communities about what is working and what is not, Facebook is launching a series of forums beginning in Europe and coming to the United States and other countries later this year.
In addition, Bickert writes, the team seeks input from experts and organizations outside Facebook so that it can better understand different perspectives on safety and expression, as well as the impact of Facebook's policies on different communities globally.