[Image caption: Content reviewers in Essen, Germany]

By Monika Bickert, Vice President of Global Policy Management

People all around the world use Facebook to connect with friends and family and openly discuss different ideas. But they will only share when they feel safe. That's why we have clear rules about what's acceptable on Facebook and established processes for applying them. We are working hard on both, but we don't always get it right.

This week, a TV report on Channel 4 in the UK has raised important questions about those policies and processes, including guidance given during training sessions in Dublin. It's clear that some of what is shown in the program does not reflect Facebook's policies or values and falls short of the high standards we expect.

We take these mistakes incredibly seriously and are grateful to the journalists who brought them to our attention. We have been investigating exactly what happened so we can prevent these issues from happening again. For example, we immediately required all trainers in Dublin to do a re-training session - and are preparing to do the same globally. We also reviewed the policy questions and enforcement actions that the reporter raised and fixed the mistakes we found.

We provided all this information to the Channel 4 team and included where we disagree with their analysis. Our Vice President for Global Policy Solutions, Richard Allan, also answered their questions in an on-camera interview. Our written response and a transcript of the interview can be found in full here and here.

It has been suggested that turning a blind eye to bad content is in our commercial interests. This is not true. Creating a safe environment where people from all over the world can share and connect is core to Facebook's long-term success. If our services aren't safe, people won't share and, over time, will stop using them. Nor do advertisers want their brands associated with disturbing or problematic content.

How We Create and Enforce Our Policies

More than 1.4 billion people use Facebook every day from all around the world. They post in dozens of different languages, sharing everything from photos and status updates to live videos. Deciding what stays up and what comes down involves hard judgment calls on complex issues - from bullying and hate speech to terrorism and war crimes. That's why we developed our Community Standards with input from outside experts - including academics, NGOs and lawyers from around the world. In May we hosted three Facebook Forums in Europe, where we heard from human rights and free speech advocates, as well as counter-terrorism and child safety experts.

These Community Standards have been publicly available for many years, and this year, for the first time, we published the more detailed internal guidelines used by our review teams to enforce them.

To help us manage and review content, we work with several companies across the globe, including CPL, the company featured in the program. These teams review reports 24 hours a day, seven days a week, across all time zones and in dozens of languages. When needed, they escalate decisions to Facebook staff with deep subject matter and country expertise. For specific, highly problematic types of content, such as child abuse, the final decisions are made by Facebook employees.
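
As a rough illustration of the escalation flow described above, the sketch below shows one way such routing could work in principle. The category names, confidence flag and queue labels are assumptions made for illustration only; they do not describe Facebook's actual systems or policies.

```python
# Hypothetical sketch only: category names, flags and queue labels are
# assumptions for illustration, not Facebook's actual systems or policies.

# Content types whose final decisions are always made by full-time employees.
EMPLOYEE_ONLY_CATEGORIES = {"child_abuse"}

def route_report(category: str, reviewer_is_confident: bool) -> str:
    """Decide who makes the final call on a reported piece of content."""
    if category in EMPLOYEE_ONLY_CATEGORIES:
        # Specific, highly problematic content goes to Facebook employees.
        return "facebook_employee"
    if not reviewer_is_confident:
        # Hard judgment calls are escalated to staff with deep subject
        # matter and country expertise.
        return "escalated_specialist"
    # Routine reports are handled by the partner review teams.
    return "partner_reviewer"

print(route_report("child_abuse", True))    # -> facebook_employee
print(route_report("hate_speech", False))   # -> escalated_specialist
print(route_report("spam", True))           # -> partner_reviewer
```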

Reviewing reports quickly and accurately is essential to keeping people safe on Facebook. This is why we're doubling the number of people working on our safety and security teams this year to 20,000. This includes over 7,500 content reviewers. We're also investing heavily in new technology to help deal with problematic content on Facebook more effectively. For example, we now use technology to assist in sending reports to reviewers with the right expertise, to cut out duplicate reports, and to help detect and remove terrorist propaganda and child sexual abuse images before they've even been reported.
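
To make the triage ideas in that paragraph concrete, here is a minimal hypothetical sketch of collapsing duplicate reports and routing each unique report to a reviewer with matching language expertise. The report fields, reviewer pools and routing rule are illustrative assumptions, not a description of Facebook's actual pipeline.

```python
# Hypothetical sketch only: the report fields, reviewer pools and routing
# rule below are illustrative assumptions, not Facebook's actual pipeline.
from collections import defaultdict

def triage(reports, reviewers_by_language):
    """Collapse duplicate reports per content item, then pick a reviewer
    whose language expertise matches the reported content."""
    grouped = defaultdict(list)
    for report in reports:
        # Reports about the same content item collapse into one group:
        # the "cut out duplicate reports" step.
        grouped[report["content_id"]].append(report)

    assignments = {}
    for content_id, dupes in grouped.items():
        language = dupes[0]["language"]
        # Route to a reviewer with the right expertise, falling back to a
        # generalist queue when none is available.
        pool = reviewers_by_language.get(language, ["generalist_queue"])
        assignments[content_id] = pool[0]
    return assignments

reports = [
    {"content_id": "post-1", "language": "de"},
    {"content_id": "post-1", "language": "de"},  # duplicate report
    {"content_id": "post-2", "language": "en"},
]
print(triage(reports, {"de": ["reviewer_de"], "en": ["reviewer_en"]}))
# -> {'post-1': 'reviewer_de', 'post-2': 'reviewer_en'}
```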

We are constantly improving our Community Standards and we've invested significantly in being able to enforce them effectively. This is a complex task, and we have more work to do. But we are committed to getting it right so Facebook is a safe place for people and their friends.
