
YouTube and the Struggle for Clean Content


YouTube has another crisis on its hands. The past year has been a firestorm of content complaints: first, racist videos from top creators like PewDiePie forced stricter censorship for ad compliance. Not long after, brands clamored for more responsibility and transparency about where their ads appeared on the platform. Now, another serious blow has struck the video hosting giant right before the holidays.


Over the weekend YouTube was stormed by reports of inappropriate comments, which wouldn’t be out of the ordinary except for one thing: the comments were directed at children. There were an estimated 28 total offenses, with disturbing comments posted under videos geared toward children or uploaded by young content creators. BuzzFeed reported that YouTube’s search suggestions for the phrase “how to have” returned predatory results. Once alerted to the issue, YouTube immediately purged the comments and announced updated regulations. But the damage was already done, as users flooded the comments section with outrage.

YouTube says it’s noticed “a growing trend around content…that attempts to pass as family-friendly, but is clearly not” and will be taking steps to correct the issue. Namely, it aims to de-monetize the offending videos and use “a combination of automated systems and human flagging and review to remove inappropriate sexual or predatory comments on videos featuring minors.” While YouTube’s action plan is solid, it raises questions about the state of content hosting on a global scale.
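YouTube hasn’t said how that combination of automation and human review actually works under the hood, but the general shape is easy to sketch. In the hypothetical Python below, every name and threshold is our own invention, and the keyword check stands in for whatever classifier YouTube really runs: high-confidence hits are removed automatically, borderline comments are queued for a person, and videos featuring minors get a stricter bar.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Comment:
    comment_id: str
    video_id: str
    text: str
    video_features_minor: bool  # does the host video feature a minor?

# Hypothetical thresholds -- YouTube has published nothing like these numbers.
AUTO_REMOVE_THRESHOLD = 0.95   # high-confidence hits are removed automatically
HUMAN_REVIEW_THRESHOLD = 0.40  # borderline scores go to a human reviewer

def predatory_score(comment: Comment) -> float:
    """Stand-in for a trained classifier.

    A real system would use an ML model; this keyword check exists only
    to make the sketch runnable.
    """
    flagged_phrases = ("how to have",)
    return 0.9 if any(p in comment.text.lower() for p in flagged_phrases) else 0.05

def moderate(comment: Comment, review_queue: List[Comment]) -> str:
    """Route one comment: auto-remove, queue for human review, or allow."""
    score = predatory_score(comment)
    # Apply a stricter review bar on videos featuring minors,
    # mirroring YouTube's stated policy focus.
    review_bar = (HUMAN_REVIEW_THRESHOLD / 2
                  if comment.video_features_minor
                  else HUMAN_REVIEW_THRESHOLD)
    if score >= AUTO_REMOVE_THRESHOLD:
        return "removed"
    if score >= review_bar:
        review_queue.append(comment)
        return "pending_review"
    return "allowed"
```

The point of the tiered design is that humans only see the ambiguous middle band; anything the model is nearly certain about never reaches a reviewer’s queue at all.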

The Grey Area


Moderating content is difficult—doubly so when it comes to “family friendly” content creators. The distinction is hazy. While there are channels specifically aimed at minors—toy unboxings, animated shorts from major networks, nursery rhymes—there’s a market of content minors watch that isn’t always intended for them. For instance, many of the top content creators in the gaming space make fun, upbeat playthroughs of popular games. While their content isn’t inherently raunchy, many swear or make the occasional adult joke in their videos.

These creators—whether in beauty, lifestyle, or gaming—are usually adults between the ages of 18 and 35. However, their audiences vary in age, location, and personality. This disparity usually shows up in the comments section, where these segmented audiences mingle to post their opinions on the creator’s content. For the most part, these exchanges are pretty benign. However, parents are often concerned about their children viewing these types of videos, wary of the adult influences they’re exposed to.

YouTube responded to parent (and advertiser) concerns about questionable content by strictly enforcing its creator guidelines. This meant less swearing and reduced coverage of “controversial” subjects, which prompted the #YouTubeIsOverParty tag on Twitter. Creators—many of whom make their living off of YouTube ad revenue—had videos de-monetized without their knowledge, which hurt their bottom line significantly. And while YouTube has since relaxed its stance for big channels, there’s still a lot of work to be done.

Finding a Middle Ground

Moderating uploaded content is tricky business. While algorithms seem like an easy solution for banning bad videos, YouTube’s automated screening process leaves much to be desired. To correct it, the company has implemented human moderation teams designed to help bridge the gap between formula and fan. But these content moderators are subjected to some of the most brutal content on the internet, tasked with catching all the unsavory things mainstream YouTube never broadcasts: videos of animal abuse, political radicalism, and deviant sexual content. In interviews, former YouTube moderators described hitting a wall “3-5 months into the job,” unable to keep up with the volume of horrible videos they were forced to watch.


So, where does the middle ground lie in the struggle for family-friendly content? Instead of solely tasking traumatized content moderators with catching each and every horrible video, the burden should be shared among content moderators, creators, and audiences.

While moderators would be responsible for clearing the most offensive videos that violate the terms of service, creators should be able to assign their own ratings and segmentation instructions for broadcast. For instance, if a Let’s Play video features profanity and adult themes, its creator should have an “age target” option that screens out children under age 13. As a balancing system, audiences would also be able to provide feedback on how well the creator’s “content rating” matched the video itself.
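To make the proposal concrete, here’s a minimal Python sketch of that shared-responsibility model. Nothing below is a real YouTube feature; the rating tiers, the age field, and the review thresholds are all hypothetical.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ContentRating:
    tier: str            # creator-assigned tier, e.g. "family", "teen", "mature"
    min_viewer_age: int  # creator-set age target; 13 screens out younger kids

@dataclass
class Video:
    video_id: str
    rating: ContentRating
    feedback_votes: List[bool] = field(default_factory=list)  # True = "label fits"

def visible_to(video: Video, viewer_age: int) -> bool:
    """Creator side: the age target screens out viewers under the threshold."""
    return viewer_age >= video.rating.min_viewer_age

def record_feedback(video: Video, label_fits: bool) -> None:
    """Audience side: viewers grade how accurate the creator's self-rating was."""
    video.feedback_votes.append(label_fits)

def needs_human_review(video: Video, min_votes: int = 50, floor: float = 0.7) -> bool:
    """Moderator side: escalate only when enough viewers dispute the self-rating."""
    if len(video.feedback_votes) < min_votes:
        return False
    agreement = sum(video.feedback_votes) / len(video.feedback_votes)
    return agreement < floor
```

Under this scheme, a profanity-laden Let’s Play might ship as ContentRating("teen", 13), and a moderator would only need to look at it if a meaningful share of voters said the label didn’t fit—rather than screening every upload personally.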

Sure, it’d be a lot of work to implement, and we’re sure the team at YouTube is busy. But for a site that dominates the video streaming and sharing spaces, constant innovation comes with the territory.


Curious about the future of the big social platforms? Check out our stories on Snapchat and Facebook.

Jasmine Moore
