YouTube CEO Talks About Increasing Content Moderation Staff


Google is going on a hiring spree to try to stamp out offensive videos and comments on YouTube. The company's CEO, Susan Wojcicki, also revealed plans to launch a new comment moderation tool and, in some cases, to shut down comments altogether.

The news followed a steady stream of negative press surrounding the site's role in spreading harassing videos, misinformation, hate speech and content that is harmful to children. It was not immediately clear whether the new reviewers would be contract workers or full-time Google employees.

For all the company's talk of machine learning, the plan is a tacit admission of the technology's limits: YouTube still needs some 10,000 human beings making decisions about uploaded video content because its AI alone cannot reliably tell acceptable material from unacceptable.

Revelations that YouTube was offering up cartoons featuring what the BBC called "animated violence and graphic toilet humor" added further fuel to the controversy.

Earlier this year, advertisers fled the site after ads appeared next to extremist content. "Equally, we want to give creators confidence that their revenue won't be hurt by the actions of bad actors," Wojcicki wrote.


To combat this issue, the video-hosting service intends to "apply stricter criteria and conduct more manual curation" while simultaneously expanding its team of human reviewers "to ensure ads are only running where they should."

Since June, according to Wojcicki, YouTube's content moderation staff has manually reviewed almost two million videos for violent extremist content and, as a result, has removed a total of 150,000 videos.

The technology has reviewed and flagged content that would have taken 180,000 people working 40 hours a week to assess, according to Wojcicki.

"It's important we get this right for both advertisers and creators, and over the next few weeks, we'll be speaking with both to hone this approach".

In June, YouTube announced that it would take steps to address the problem by improving its use of machine learning to remove controversial videos and by working with 15 new expert groups, including the No Hate Speech Movement, the Anti-Defamation League and the Institute for Strategic Dialogue.
