The 2019 YouTube Crisis from the Advertiser Perspective

Have you heard about the YouTube pedophile controversy?

On February 17, 2019, YouTuber Matt Watson posted a video on his channel, “MattsWhatItIs,” titled “YouTube is Facilitating the Sexual Exploitation of Children, and it's Being Monetized (2019).” In the video, Mr. Watson exposes what he calls a “wormhole into a soft-core pedophile ring,” in which pedophiles connect in the comment sections of YouTube videos featuring children. He shows that some comments even contain direct links to websites hosting child pornography, and that YouTube's recommendation algorithm perpetuates the problem by connecting these videos and recommending them to viewers. Using a brand-new YouTube account, Mr. Watson demonstrates that it takes only two clicks to enter the wormhole, after which the only videos recommended are of children. Furthermore, some of these videos are monetized, meaning that advertisements run before them.

When advertisers and brands heard about this issue, there was a mass exodus from YouTube: major brands such as Disney and McDonald's, along with countless others, pulled their advertisements, and their money, off of the platform because they did not want their ads running on videos connected to the controversy. In response, YouTube disabled comments and monetization on any video featuring children who could potentially be shown in compromising positions.

However, this is a policy YouTube had already claimed to be implementing in its November 2017 blog post, 5 ways we're toughening our approach to protect families on YouTube and YouTube Kids. In that post, Johanna Wright, Vice President of Product Management at YouTube, writes that YouTube uses automated and human flagging systems to “review [and] remove inappropriate sexual or predatory comments on videos featuring minors… we will begin taking an even more aggressive stance by turning off all comments on videos of minors where we see these types of comments.”

From Matt Watson's perspective, this does not solve the problem. In his view, if YouTube possesses an algorithmic way to identify these videos and the inappropriate commenters, it should do more than simply disable the comments, a view shared by other YouTube influencers (see Colleen Ballinger's video Body Update & My Thoughts on YouTube, 13:46).


The problem lies in…

YouTube’s lack of regulations, content standards, and transparency with users and with advertisers, which results in a perpetual lack of safety for both parties.

As a platform, YouTube has had several scandals involving its algorithm and its content; this is not the first, and given the way the platform is run, it will not be the last. Garrett Browne, Associate Director of Paid Search at R2C Group, has worked with YouTube as a digital advertiser since 2009 and believes the problem lies in YouTube's lack of regulations, content standards, and transparency with users and with advertisers, which results in a perpetual lack of safety for both parties.

A Platform with Potential

In essence, YouTube has the potential to be a great platform for advertising. Major brands and publishers have their own YouTube channels and make YouTube-specific content: The Daily Show, The Colbert Report, CNN, ESPN, Sports Illustrated, and so on down the line. YouTube houses the music videos and Vevo channels of countless mainstream celebrities, garnering millions of views. There is also a new form of celebrity, the “YouTubers” or “YouTube Influencers,” in addition to larger media companies like BuzzFeed Video and Clevver, all of whom create a wide variety of content. All of this content could, objectively, provide a positive opportunity for paid advertising. According to Mr. Browne, however, that opportunity has not been stewarded well, due to YouTube's hands-off approach with the general user, its lack of transparency with advertisers, and its minimal advertiser controls.

YouTube does have community guidelines that videos are supposed to abide by in order to be posted, but all regulations and safety measures are enforced retroactively. Troublesome content and comments have to be found before they can be evaluated for removal from the platform; nothing built into the platform prevents videos containing inappropriate content from being posted and viewed. For example, in January 2018, now-infamous YouTuber Logan Paul posted a vlog titled “Suicide Forest” with footage of himself and his friends disrespecting the body of a suicide victim in the Aokigahara forest in Japan. The video was eventually removed and there were various consequences for the misguided YouTuber, but not before it gained 6.3 million views in 24 hours. Again, all “safety measures” were retroactive, applied only after the video had been exposed to the public, and YouTube has put nothing in place to prevent another “Suicide Forest” scandal.

“YouTube needs to figure out what this platform is.”

Because YouTube does not have a clear objective, and success on the platform is measured simply by views, it allows a disconcertingly wide variety of content: everything from video gaming and makeup tutorials to children's educational channels and cartoons, all the way to violent content and soft-core porn. The platform takes on millions of hours of footage without efficiently screening out sexually explicit material, hate material, or racist material; all of the seedy communities that once had no platform can, to some extent, have one through YouTube. Because of that, those communities rush to the platform, further perpetuating the problem. As someone who has been advertising on YouTube for a decade, Mr. Browne puts it this way: “YouTube needs to figure out what this platform is. What I want is for YouTube to come out and say, ‘Here is our platform, here is the content we want, here is what we don't want. If you fall into the category of what we don't want, sorry, best of luck to you in finding another platform to use.’”


“As an advertiser, I have a love-hate relationship with YouTube.”

- Garrett Browne

The Advertiser Struggle

It has been Mr. Browne's experience that YouTube has not made it easy for advertisers to implement precautionary measures to protect their own brands. In spite of the quantity of questionable content on the platform, YouTube relies heavily on confidence in its own algorithm and stands in the way of advertisers making the optimizations necessary to protect their brands. For advertisers, it can take hours and hours of research: creating exclusion lists, implementing proper audience targeting and suitable topics, and combing through placement reports to ensure that ads run only on videos that are brand safe. YouTube makes clear that it does not want advertisers doing any of those things; it considers casting a wide net to be best practice, so advertisers meet resistance when they try to implement a managed placement list on YouTube.

Consider a novice attempting to advertise on the platform who elects, for example, to target an affinity audience of people interested in purchasing a new car. Their ad may end up on car-related content, but it will also show up on every other kind of video, from K-pop music videos to Japanese game shows to children's content, which makes YouTube an unpredictable advertising platform. Moreover, running an ad against irrelevant content generates wasted impressions that will not lead to conversions, yet when advertisers make the kinds of specifications needed to prevent that, the platform penalizes them with higher costs per impression.
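
To make concrete the kind of manual work this forces on advertisers, here is a minimal, illustrative sketch in Python. It assumes a hypothetical exported placement report (a CSV with columns channel_id, video_title, and impressions) and an advertiser-maintained exclusion list; none of the file, column, or channel names refer to official YouTube or Google Ads tooling. The script simply flags placements that hit an excluded channel or look off-topic for a new-car campaign.

```python
import csv

# Hypothetical advertiser-maintained exclusion list: channel IDs the brand
# never wants its ads to run against. In practice this list is assembled by
# hand from prior placement reports and research.
EXCLUDED_CHANNELS = {
    "UC_example_kids_channel",
    "UC_example_gameshow_channel",
}

# Hypothetical keywords suggesting a placement is off-topic for a
# new-car campaign.
OFF_TOPIC_KEYWORDS = ["k-pop", "game show", "nursery rhyme"]


def audit_placement_report(path):
    """Return (row, reason) pairs for placements that violate the exclusion
    list or look off-topic. Expects a CSV with the hypothetical columns
    channel_id, video_title, and impressions."""
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            title = row["video_title"].lower()
            if row["channel_id"] in EXCLUDED_CHANNELS:
                flagged.append((row, "excluded channel"))
            elif any(keyword in title for keyword in OFF_TOPIC_KEYWORDS):
                flagged.append((row, "off-topic placement"))
    return flagged


if __name__ == "__main__":
    for row, reason in audit_placement_report("placement_report.csv"):
        print(f"{row['channel_id']}  {row['video_title']} "
              f"({row['impressions']} impressions): {reason}")
```

In practice, each flagged placement feeds back into the advertiser's exclusion list, which is exactly the hours-long research loop Mr. Browne describes.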

There are good reasons why brands might not want to put themselves on certain websites, like a Stormfront or a trailer like Bright Burn or even the far reaches of the internet, but without knowing it, a brand could end up in front of that kind of content on YouTube.
- Garrett Browne, Associate Director of Paid Search

In regard to the comment section on YouTube, external programs exist that can identify problematic comments and delete them, but they are not built into YouTube. It is Mr. Browne's professional opinion that the platform would have far fewer scandals and issues if the comment section simply did not exist. From a brand-safety and good-business perspective, it is unclear what value the comment section adds to the platform; if anything, it detracts from both.
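
As a rough illustration of what such an external comment-moderation program might do, here is a short Python sketch with entirely hypothetical flag patterns and sample comments. It marks comments containing external links, one of the signals Mr. Watson's video highlighted, for human review; a real tool would fetch comments through YouTube's Data API rather than from a hard-coded list.

```python
import re

# Hypothetical patterns a moderation tool might flag for human review.
# A production system would use richer signals (link reputation, account
# age, comment velocity), not just pattern matching.
FLAG_PATTERNS = {
    "external link": re.compile(r"https?://\S+", re.IGNORECASE),
    "contact solicitation": re.compile(r"\b(whatsapp|telegram|dm me)\b", re.IGNORECASE),
}


def flag_comments(comments):
    """Return (comment, reasons) pairs for comments matching any pattern.
    `comments` is a plain list of strings standing in for data a real tool
    would fetch from the platform."""
    flagged = []
    for text in comments:
        reasons = [name for name, pattern in FLAG_PATTERNS.items()
                   if pattern.search(text)]
        if reasons:
            flagged.append((text, reasons))
    return flagged


if __name__ == "__main__":
    sample_comments = [
        "Great video, thanks for posting!",
        "more like this at http://example.com/suspicious",
        "dm me on telegram",
    ]
    for text, reasons in flag_comments(sample_comments):
        print(f"FLAGGED ({', '.join(reasons)}): {text}")
```

Even a crude filter like this runs entirely outside the platform, which is Mr. Browne's point: such tools exist, but they are not built into YouTube.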

So what is being done to fix these problems?

In short, not much. Because YouTube has no true competitor, there is an implicit assurance that, no matter what happens on the platform, brands will continue to advertise on it. YouTube claims to be hard at work making changes to the platform and to value brand safety, but it never gives transparent specifics about what is being done to protect brands or creators. It has not shared any details of its plans to improve the algorithm, the comment section, or the platform's regulations, and thus far nothing significant has been done to improve any of them.

YouTube is undeniably a powerful search and media tool with millions of viewers and, therefore, a substantial amount of influence. As an advertiser who has dedicated years of time and effort to working with this platform, Garrett Browne hopes that YouTube will take a more transparent and regulated approach with its advertisers and its users. From the advertiser's perspective, making that change would not only create a more ad-friendly environment, but would also help keep YouTube out of the tabloids.