(Photo: Chris McGrath/Getty Images)
YouTube

YouTube removed more videos than ever before in the second quarter of 2020, as the company relied more on automated filters than on human moderators due to the coronavirus pandemic.

According to YouTube's Community Guidelines Enforcement report released Tuesday, the company took down more than 11.4 million videos between April and June of this year.

In the same period last year, the company removed just under 9 million videos, The Verge reported.

The coronavirus pandemic forced YouTube to reduce its number of human video reviewers, according to a CNET report. These reviewers usually filter out videos that do not meet the platform's Community Guidelines.

Too Little or Too Much Enforcement

Latinos are heavily involved with YouTube, actively engaging with the platform 83 percent of the time, according to Think with Google data.

What they see on the platform influences 52 to 61 percent of their activities, depending on the type of activity. This is why moderating the platform's content is vital to keeping the site safe for everyone.

In a blog post, YouTube said its greatly reduced staff left it with two choices: under-enforce its policies or over-enforce them.

"We chose the latter - using technology to help with some of the work normally done by reviewers," the firm said.

The higher removal numbers were the result of YouTube over-enforcing its policies: the company removed three times more content suspected of being tied to violent activity or of potentially harming children.

"This includes dares, challenges, or other innocently posted content that might endanger minors," YouTube explained.

The Need for Appeals

The automated filter does the same job as a human moderator, but it previously did not remove videos on its own.

It checks content against a reference bank of material considered inappropriate for YouTube, but it can still make mistakes in its assessments. Usually, YouTube relies on the automated filters to flag videos, which human moderators then assess.

If a video does violate the Community Guidelines, human reviewers take it down. Otherwise, it remains on the platform.
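As a rough illustration of how a flag-then-review pipeline like this can work, here is a minimal Python sketch. The risk scores, thresholds, and names are hypothetical stand-ins for illustration, not YouTube's actual system:

```python
from dataclasses import dataclass, field
from enum import Enum

class Verdict(Enum):
    KEEP = "keep"
    REMOVE = "remove"
    NEEDS_HUMAN = "needs_human"

@dataclass
class Video:
    video_id: str
    risk_score: float  # hypothetical 0.0-1.0 score from an automated classifier

@dataclass
class ModerationPipeline:
    # Thresholds are illustrative only, not YouTube's actual values.
    auto_remove_threshold: float = 0.9
    flag_threshold: float = 0.5
    review_queue: list = field(default_factory=list)

    def triage(self, video: Video) -> Verdict:
        """Automated first pass: remove, flag for a human, or keep."""
        if video.risk_score >= self.auto_remove_threshold:
            return Verdict.REMOVE            # over-enforce: err on the side of removal
        if video.risk_score >= self.flag_threshold:
            self.review_queue.append(video)  # human moderators make the final call
            return Verdict.NEEDS_HUMAN
        return Verdict.KEEP

pipeline = ModerationPipeline()
for vid in [Video("a", 0.95), Video("b", 0.6), Video("c", 0.1)]:
    print(vid.video_id, pipeline.triage(vid).value)
```

In normal operation the filter would stop at flagging, leaving removal decisions to the human review queue; over-enforcement, as described above, means letting the top tier of automated removals happen without that review.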

Since the platform relied more on the automated filter this time, YouTube made sure that content creators could appeal easily. Through this process, staffers can verify whether a video really violated the policies.

To speed up the process, the company put more staffers in charge of appeals. Of the more than 11 million videos taken down by the automated filter, only about three percent resulted in an appeal.
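Continuing the hypothetical sketch above (reusing its Video and Verdict types), an appeal simply routes an auto-removed video back to a human reviewer, whose judgment is final:

```python
def handle_appeal(video: Video, human_found_violation: bool) -> Verdict:
    """Hypothetical appeal step: a human re-reviews an auto-removed video."""
    # If the reviewer finds no violation, the removal is reversed.
    return Verdict.REMOVE if human_found_violation else Verdict.KEEP

# A video the filter removed in error gets reinstated on appeal.
print(handle_appeal(Video("a", 0.95), human_found_violation=False).value)  # keep
```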

Prior Warnings from Google

Back in March, YouTube's parent company Google said it would be extending its work-from-home policy until the end of the year due to the pandemic.

The company warned that these measures would mean a higher reliance on technology over human reviewers. It also cautioned online creators that some content that would normally be fine on the platform might be removed in error.

According to a TechCrunch report, YouTube said it was allowing technology to remove some content without human review so the site could act more quickly despite reduced staffing.

The company's priority at the time was to keep the online space protected. Human moderators use very specific setups to work on reviews; if they work outside that controlled environment, user data and sensitive videos could be exposed by accident.

Of the videos removed last quarter, 3.8 million were taken down for child safety reasons, 3.2 million were scams and spam, 1.7 million featured sexual content, 1.2 million were violent, and 900,000 promoted violence.
