Facebook, in a notable exercise of transparency, has revealed the internal rules that form the basis of the Community Standards for posts published on the network by users. After many years in which everyone wanted to know how Facebook decides which posts are allowed, or not, on its social network, the Cambridge Analytica scandal brought this important information to the fore.
Over the last year, Facebook decided to enforce these rules with much greater strictness, so more and more posts were disabled or deleted without people knowing clearly what had prompted the company's decisions. Because so many questions have arisen in the last year, Facebook has decided to publish information on how these standards were established and on how they are used to monitor and protect the communities on the social network.
Facebook explains the types of posts allowed and blocked on the network
Facebook explains in the press release below how it implements these very important rules, so that everyone should now know what kind of content they can publish and what content is not allowed on the network. Facebook also describes the ways in which a person can appeal the American company's decision to block a certain post, so you also know how to protect yourself against an abusive decision.
Finally, Facebook describes the ways in which everyone can contribute to improving the standards that protect communication within the social network, apparently accepting suggestions as well. With that said, I recommend that you read very carefully what Facebook has to say.
"For the first time, Facebook is releasing details about its internal rules that form the basis of Community Standards
One of the most frequently asked questions about Facebook is how it decides what kind of content should be allowed on its social platform. This is one of Facebook's most important decisions, as it is an essential element in ensuring that the platform is a safe and open space for diverse viewpoints to be discussed.
Facebook has had a set of Community Standards in place for several years that show what type of content is accepted and what type of content is removed from the social platform. Today, Facebook is taking a step forward and, for the first time, making public the company's internal rules for enforcing these standards. The social network is also giving users the ability to challenge decisions made on individual posts and request a re-evaluation when they believe a mistake has been made.
Facebook's decision to publish these internal rules is based on two reasons. First, they will help users understand the criteria considered in making a decision regarding nuanced situations. Second, because of these details, it will be much easier for everyone, including experts in different fields, to provide feedback so that these rules – and the decisions made following them – can be improved over time.
The policy development process
Facebook's content policy team is responsible for developing the social network's Community Standards. Facebook has employees in 11 offices around the world, including experts on issues such as hate speech, child safety and terrorism. Many of them studied expression and safety issues long before joining the social platform's team.
"I have worked on a wide variety of cases, from child safety to counter-terrorism during my time as a prosecutor, and other members of the team include a former rape crisis counselor, an academic who spent career studying hate organizations, a human rights lawyer and a teacher. Each week, our team solicits input from experts and organizations outside of Facebook so we can better understand different perspectives on safety and free expression, and the impact of our policies on different communities globally.” - said Monika Bickert, Vice President of Global Product Management, Facebook
Based on this feedback, as well as changes in social norms and language, Facebook's standards evolve over time. What has not changed – and will not change – are the fundamental principles of safety, expression and fairness on which these standards are based. In order to start conversations and make connections, people need to know they are safe. Facebook should also be a place where people can express their opinions freely, even if some may not agree with them.
This can be challenging given the global nature of Facebook's service, which is why fairness is such an important principle: the platform aims to apply these standards consistently and fairly to all communities and cultures. These principles are explicitly stated in the preamble to the standards and are brought to life by sharing the reasoning behind each individual policy.
Enforcement
Facebook's policies are only as good as the determination and accuracy shown in enforcing them – and that's not perfect. The challenge is to identify potential violations of these standards so they can be examined. Technology can help here. A combination of artificial intelligence and user reporting is used to identify posts, images, or other content that may violate the Community Standards. These reports are reviewed by the Community Operations team, working 24 hours a day in over 24 languages. By the end of 2018, there will be 7,500 content reviewers – up 40% from this time last year.
Another challenge is accurately enforcing the policies on flagged content. In some cases, mistakes occur because Facebook's policies aren't clear enough for content reviewers; when that happens, the policy team works to fill in those gaps. Mistakes also happen simply because people are fallible. The evaluation processes are therefore constantly reviewed to ensure, as far as possible, that the rules are applied correctly and consistently. As part of this process, a subset of content is regularly re-evaluated to verify the accuracy of Facebook reviewers' decisions.
Appeals
"Even with the help of our qualitative audits, we know that we are not always right. That's why we've allowed users to ask us to review our decisions when we've removed their profile, Page or Group. Starting today, we are expanding this option and giving people the opportunity to ask for a second opinion on content removed for nudity or sexual activity, messages that incite hatred and violence," says Monika Bickert.
Here's how the process works:
- If a photo, video or post has been removed for violating Community Standards, the user is given the option to “Request a review”
- Appeals are reviewed by the Community Operations team within 24 hours
- If a mistake was made, the content will be restored and the person who appealed the original decision will be notified.
By the end of the year, this process will be extended to people who report content and are notified that it does not violate Community Standards.
Community Participation and Feedback
Facebook admits that it can improve and refine its Community Standards through feedback from users around the world. In May, Facebook will launch Forums: Community Standards, a series of public events in Germany, France, the UK, India, Singapore, the United States and other countries, where it will gather feedback directly from users. More details on these initiatives will be available as they are finalized.
As Facebook CEO Mark Zuckerberg said at the beginning of the year: "we won't prevent all mistakes or abuses, but we currently make too many mistakes in terms of enforcing our policies and preventing misuse of our tools." Publishing the current enforcement guidelines – as well as expanding the appeals process – will create a clear path that can be improved over time. These are difficult issues, but the good news is that progress is being made on them."