YouTube Turns to Improved Machine Learning in Fight Against Terror Videos


Machine learning has, says YouTube, "more than doubled" both the number of videos removed for extreme or violent content and the speed at which such content is taken down, even with more than 400 hours of content being uploaded to the platform every minute.

"These organizations bring expert knowledge of complex issues like hate speech, radicalization, and terrorism that will help us better identify content that is being used to radicalize and recruit extremists", YouTube stated today.

Google's YouTube said Tuesday that it's increasing efforts to fight terror-related content on its site. Last month, YouTube said the service had already begun redirecting searches for violent and extremist content, sending users to anti-terrorism videos instead. YouTube is now working with more than 15 new organizations, such as the No Hate Speech Movement, the Institute for Strategic Dialogue and the Anti-Defamation League. "We will also regularly consult these experts as we update our policies to reflect new trends", the company said, adding that while these tools aren't flawless and aren't right for every setting, in many cases its systems have proven more accurate than humans at flagging videos that need to be removed.

According to AdWeek, Google will be working with 15 additional non-governmental institutions to help curb "hate speech", and will continue to improve its machine learning in order to censor, demonetize, and de-platform content it deems offensive. YouTube said that over 75 percent of the videos it removed for violent extremism over the past month were taken down before receiving a single human flag.

The company reaffirmed that more people are needed to enforce policy changes.


"We'll soon be applying tougher treatment to videos that aren't illegal but have been flagged by users as potential violations of our policies on hate speech and violent extremism", the company notes.

Despite this, Google said it will also hire more people to review and enforce its policies on the video-sharing site, as part of a package of changes designed to placate critics who say YouTube has become a propaganda channel for terrorist groups.

This tougher treatment will be coming to desktops in the coming weeks, with mobile versions of YouTube to follow at a later date. Flagged videos will not be eligible for comments or endorsements, nor will they be monetized or recommended.

As the company put it in its June announcement, "the uncomfortable truth is that we, as an industry, must acknowledge that more needs to be done".