Seemingly everyone loves YouTube. As of 2016, the online video portal reportedly had more than 180 million users in the U.S. alone, a figure expected to grow in 2018. But with that massive reach come massive issues. In recent months, YouTube has seen major advertisers flee after their ads appeared next to offensive videos, while some content creators have complained that their videos were being unfairly suppressed in search results.
As a result of these myriad issues, parent company Google is promising to clean up YouTube. In a recent blog post, YouTube CEO Susan Wojcicki pledged to hire 10,000 new employees to review videos and remove "problematic content." In recent years, videos showing child exploitation, terrorist propaganda and other offensive images have popped up on YouTube; some have even resulted in criminal prosecutions. Wojcicki says the company is also using machine learning to police accounts and comments.
Getting a handle on the behemoth that is YouTube is a massive task for its owners. "There's 180 million people that use YouTube, so it's not like it's a manageable number," says marketing expert Bill Fogarty with The Company. "It's a question of managing the content in such a way that people are not offended by what's on YouTube, and of course everybody's mental structure is such that they decide themselves what is acceptable and what isn't."
Google's attempt to police content on YouTube raises other questions, such as who decides what content is "problematic." Google has faced similar criticism over its attempts at policing search results. Fogarty believes there is simply no perfect way to control all of the content while keeping all users and advertisers happy. "The problem is, how do you manage billions of video images on your network...it's really pretty tough," he says. "There are all kinds of problems on any kind of social network that carries advertising. It is not a perfect science by any stretch of the imagination."