YouTube is finally admitting it has a major problem — and robots won’t be able to fix it.
Google’s video-streaming service said it plans to hire more than 10,000 new employees next year in a scramble to clamp down on the offensive and inappropriate content that has been plaguing its site.
Last month, it was revealed that pedophiles have been posting disgusting comments on videos of scantily clad children. That followed an uproar earlier this year over YouTube videos promoting terrorism and racist figures like David Duke, which caused big advertisers to flee.
“We need an approach that does a better job determining which channels and videos should be eligible for advertising,” YouTube CEO Susan Wojcicki admitted in one of a pair of blog posts Monday.
The moves come as advertisers, regulators and advocacy groups express ongoing concern over whether YouTube’s policing of its service is sufficient.
YouTube has developed automated software to identify videos linked to extremism and is now aiming to do the same with clips that portray hate speech or are unsuitable for children. Uploaders whose videos are flagged by the software may become ineligible to generate ad revenue.
But amid the stepped-up enforcement, the company has received complaints from video uploaders that the software is error-prone.
“We’ve heard loud and clear from creators that we have to be more accurate when it comes to reviewing content, so we don’t demonetize videos by mistake,” she said.
In addition, Wojcicki said the company would take “aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether.”
Adding to the thousands of existing content reviewers will give YouTube more data to train — and possibly improve — its machine-learning software, she said.
YouTube is reviewing its advertising offerings as part of its response, and it hinted that its next step could be further changes to the requirements for sharing in ad revenue.
YouTube this year updated its recommendation feature to spotlight videos that users are likely to find most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions.