

More information, faster removals, more people - an update on what we’re doing to enforce YouTube’s Community Guidelines

By The YouTube Team

In December we shared how we’re expanding our work to remove content that violates our policies. Today, we’re providing an update and giving you additional insight into our work, including the release of the first YouTube Community Guidelines Enforcement Report.

Providing More Information


We are taking an important first step by releasing a quarterly report on how we’re enforcing our Community Guidelines. This regular update will help show the progress we’re making in removing violative content from our platform. By the end of the year, we plan to refine our reporting systems and add more data, including on comments, speed of removal, and policy removal reasons.

We’re also introducing a Reporting History dashboard that each YouTube user can individually access to see the status of videos they’ve flagged to us for review against our Community Guidelines.

Machines Helping to Address Violative Content

Machines are allowing us to flag content for review at scale, helping us remove millions of violative videos before they are ever viewed. And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam).

Highlights from the report -- reflecting data from October - December 2017 -- show:


  • We removed over 8 million videos from YouTube during these months.1 The majority of these 8 million videos were spam or people attempting to upload adult content, and they represent a fraction of a percent of YouTube’s total views during this time period.2
  • 6.7 million were first flagged for review by machines rather than humans.
  • Of those 6.7 million videos, 76 percent were removed before they received a single view (see the rough calculation below).
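
To put those figures together, here is a minimal back-of-the-envelope sketch using only the rounded numbers quoted above (8 million total removals, 6.7 million machine-flagged, 76 percent removed with zero views); the variable names are illustrative, not from any YouTube system:

```python
# Rough arithmetic on the Oct-Dec 2017 figures quoted above.
machine_flagged = 6_700_000      # videos first flagged by machines rather than humans
zero_view_share = 0.76           # share of those removed before receiving a single view
total_removed = 8_000_000        # all videos removed in the quarter

removed_with_zero_views = machine_flagged * zero_view_share
print(f"Machine-flagged videos removed before a single view: ~{removed_with_zero_views:,.0f}")
print(f"Share of all removals: ~{removed_with_zero_views / total_removed:.0%}")
# -> roughly 5.1 million videos, i.e. around 64% of the 8 million total removals
```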


For example, at the beginning of 2017, 8 percent of the videos flagged and removed for violent extremism were taken down with fewer than 10 views.3 We introduced machine learning flagging in June 2017. Now more than half of the videos we remove for violent extremism have fewer than 10 views.

Video: YouTube releases Community Guidelines Enforcement Report

The Value of People + Machines

Deploying machine learning actually means more people reviewing content, not fewer. Our systems rely on human review to assess whether content violates our policies. You can learn more about our flagging and human review process in this video.

Last year we committed to bringing the total number of people working to address violative content to 10,000 across Google by the end of 2018. At YouTube, we've staffed the majority of the additional roles needed to reach our contribution to that goal. We’ve also hired full-time specialists with expertise in violent extremism, counterterrorism, and human rights, and we’ve expanded regional expert teams.

We continue to invest in the network of over 150 academics, government partners, and NGOs who bring valuable expertise to our enforcement systems, like the International Center for the Study of Radicalization at King’s College London, the Anti-Defamation League, and the Family Online Safety Institute. This includes adding more child safety-focused partners from around the globe, like Childline South Africa, ECPAT Indonesia, and South Korea’s Parents’ Union on Net.

We are committed to making sure that YouTube remains a vibrant community with strong systems to remove violative content, and we look forward to providing you with more information on how those systems are performing and improving over time.


1 This number does not include videos that were removed when an entire channel was removed. Most channel-level removals are due to spam violations and we believe that the percentage of violative content for spam is even higher.


2 Not only do these 8 million videos represent a fraction of a percent of YouTube’s overall views, but that fraction of a percent has been steadily decreasing over the last five quarters.


3 This excludes videos that were automatically matched as known violent extremist content at point of upload, which would all have zero views.
