YouTube removed 58 million videos in latest quarter
YouTube removed 58 million videos between July and September this year.
More than 7.8 million of those videos were taken down because they violated community guidelines. The other 50.2 million were taken down as YouTube removed 1.67 million channels.
The online video platform said 72 percent of the videos removed for violating guidelines in the latest quarter were “spam or misleading,” 10.2 percent were removed out of concern for “child safety” and 9.9 percent were removed for including “nudity or sexual content,” according to its latest report.
The “transparency” report, released publicly on Thursday, comes as YouTube touts its efforts to screen and delete harmful content more often and more efficiently.
YouTube, which is owned by Google, began releasing “transparency reports” in 2017 and has deleted more than 31 million videos since October of that year.
The reports have expanded significantly as YouTube faces intensifying criticism over the deluge of controversial, harmful and violent content on its platform.
According to previous reports on videos removed for violating community guidelines, YouTube deleted more than 7.7 million videos between October and December 2017, more than 9.1 million between January and March 2018, and more than 7.1 million between April and June 2018. A YouTube spokesman told The Hill that such fluctuations are expected.
This quarter’s report from YouTube includes for the first time information on why videos were deleted, how many channels were taken down and how many comments were deleted.
In order for a video, channel or comment to be removed, a user or machine must first flag it for review.
A team of YouTube employees then assesses whether the content violates the platform’s standards. Google employs tens of thousands of reviewers across its departments, a YouTube spokesman noted.
YouTube deleted over 1.6 million channels between July and September for violating its community guidelines, which bar content including pornography, gratuitous violence, hate speech, cyberbullying, spam, and more. More than 50 million videos were deleted from those channels.
Over 90 percent of the channels deleted in September were terminated for consistently posting spam or adult content.
YouTube in the same quarter removed over 224 million comments, 99 percent of which were first detected by machines.
A YouTube spokesman emphasized to The Hill that YouTube is using machine detection more frequently rather than relying solely on users to flag harmful content.
The report said 81 percent of the more than 7.8 million videos removed during the latest quarter were first detected by machines.
YouTube in August banned conspiracy theorist Alex Jones’s Infowars channel as tech companies began to publicly grapple with the role social media platforms should play in halting the spread of conspiracy theories.
A YouTube spokesman noted that the platform removes conspiracy theories targeting particular groups or people because those videos violate its hate speech and harassment policies.
YouTube noted in the report that other “high-risk” content, including videos that compromised child safety or promoted violent extremism, was taken down quickly most of the time.
The report showed 90 percent of the videos that fell into those two categories were viewed less than 10 times before being taken down.
YouTube refers to these categories of violations as “low-volume” but “high-risk,” meaning they are posted infrequently but are considered particularly dangerous.
— Updated at 2:43 p.m. to reflect the total number of videos deleted after the termination of channels.
Copyright 2024 Nexstar Media Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.