Hateful and violent videos are a sliver of the content YouTube removes
YouTube removed 7.8 million videos and 1.6 million channels in the third quarter of this year, mostly for spreading spam or posting inappropriate adult content, the company said Thursday.
The Community Guidelines Enforcement Report comes amid growing questions, raised most recently at a congressional hearing Tuesday, about how YouTube monitors and deletes problematic content, including videos depicting violent extremism and hateful, graphic material. Such videos are a small fraction of the total YouTube removes, but their prevalence has drawn congressional scrutiny.
The enforcement report, the fourth of its kind for the Google subsidiary, covers July through September and is the first to break out the reasons for removing videos. It is also the first to report the number of channels removed in their entirety for violating YouTube’s “community guidelines.” Channels are removed when they get three strikes within 90 days, or for a single particularly egregious offense, such as predatory behavior.
The report does not say how many videos get flagged by users as inappropriate but are not removed after moderators review them.
“Finding all violative content on YouTube is an immense challenge, but we see this as one of our core responsibilities and are focused on continuously working towards removing this content before it is widely viewed,” the company said in a blog post published alongside the report.
The report offers little new insight into how YouTube is managing the large volume of hateful, conspiratorial videos posted to the platform, or into its role as a video library for users of Gab.ai and 4chan, sites that are popular with racists, anti-Semites and others pushing extremist ideologies.
The report said 81 percent of videos that end up being removed are first detected by automated systems.