- YouTube is promising that 10,000 humans will review videos for inappropriate content in 2018.
- The company is also planning to tighten its criteria on which channels can carry advertising.
- CEO Susan Wojcicki has issued a memo touting these measures following a series of articles pointing out questionable videos on YouTube.
YouTube says that it plans to have 10,000 human beings reviewing videos to screen for clips that violate its policies.
The announcement comes as the Google-owned company looks to bounce back from a string of embarrassing incidents involving questionable content on its site. In several cases earlier this year, brands’ ads were found running next to hate videos. And more recently, a number of popular channels were kicked off the platform for featuring children in disturbing situations.
The controversy has prompted several marketers to pull their ads from YouTube, and led many in the ad world to question whether YouTube can truly police itself.
Now the company appears to be making a concerted effort to do so.
In a blog post on Monday, CEO Susan Wojcicki said that while YouTube’s open nature has done a lot of good in the world, such as fostering a new breed of digital creators, the company needs to put in more mechanisms to filter out objectionable fare.
“I’ve seen how some bad actors are exploiting our openness to mislead, manipulate, harass or even harm,” she wrote.
Thus, Google is assigning more people to review questionable videos on YouTube – and, more importantly, to help improve the machine learning tools Google uses to automatically flag such content. Ultimately, given YouTube’s scale, the company believes that machines offer the most viable long-term solution.
For example, Wojcicki wrote that the company’s existing human review teams have flagged close to 2 million videos for violent extremist content.