The trouble with YouTube

It is safe to say that 2017 was not a good year for YouTube’s reputation. The platform was dogged by accusations that its policies around child exploitation simply didn’t cut the mustard. Among the various allegations levelled at the media giant were claims that it hosted sexually exploitative videos of children, footage of children being abused, and hate and extremist content. There were also whistle-blowing accusations that there could be up to 100,000 active predatory accounts on the site and that the platform’s safeguarding system was not match fit.

If they thought things couldn’t get any worse, they were also criticised for video content that purported to feature children’s favourites such as Peppa Pig, Paw Patrol and Spiderman, but was in fact a collection of video nasties. In one example, perpetrators had created and uploaded fake episodes of Peppa Pig, editing the storyline to display inappropriate content. This included Peppa being involved in knife attacks and abductions, and being savaged in a forest by wild animals. The episodes had been cleverly disguised to closely resemble the artwork of the official show.

YouTube responded to these accusations by highlighting that it has robust safeguarding policies in place, removing thousands of videos and terminating numerous accounts and channels. None of the controversy appears to have affected the company’s fortunes, however, with the platform raking in an estimated, though unconfirmed, 4 to 10 billion US dollars. There is no doubt that, used responsibly, YouTube brings pleasure to millions of people daily, with the platform’s audience now surpassing 1 billion users. That, however, is exactly why YouTube has a safeguarding problem – the sheer volume of content.

What is YouTube?

YouTube is a video-sharing platform created in 2005 by three former PayPal employees. It is now owned by Google, which paid $1.65 billion for it in 2006, and it currently accounts for over 5% of Google’s advertising revenue. The website’s content includes personal videos, music videos, documentaries, educational videos, and movie and TV clips and trailers, as well as live streaming and video blogging. Most of the content on the site is uploaded by individuals, and users can view, upload and share videos, as well as comment on and rate content. A basic user can only upload videos of up to 15 minutes, although YouTube can and does extend this limit, by invitation, for users who have demonstrated consistent compliance with its Community Guidelines. Viewing videos is predominantly free, with the media giant relying on advertising revenue. Live streaming is available on both the website and the mobile app, although whilst there is no restriction on streaming from the website, the app’s live streaming feature is limited to users with at least 100 subscribers.

How popular is YouTube?

YouTube is huge – the second-most visited website in the world after Google. The platform attracts 1.5 billion visitors per month, with 5 billion videos watched daily and users uploading between 300 and 400 hours of video every minute. It is available in 88 countries and 76 languages. With that much exposure, it is no wonder that YouTube has been credited with having a positive effect on world events, educating the young, breaking down prejudices, opening up cultures and bringing pleasure to an almost global audience. It is worth bearing those positives in mind when examining, and providing a balanced argument on, the platform’s safeguarding record. Unfortunately, when you reach over a billion people, there will always be those who see it as a platform to incite, exploit and offend. For every 10 videos about anti-bullying, there will undoubtedly be someone using the medium to bully another. For every 100 videos preaching love and understanding, there will be an individual who wants to advocate intolerance and hate.

On what devices can you access YouTube?

YouTube can be accessed through a wide range of devices and platforms, the most common being its website. YouTube is, of course, also mobile through its app, available on smartphones (both iPhone and Android) and other app-based mobile devices; an estimated 35 to 40% of its daily users access the site this way. It is useful for safeguarding professionals, parents and others who care for children to know that YouTube can also be accessed through devices not ordinarily associated with apps. These include the Nintendo Wii, PlayStation and Xbox consoles, as well as some TV streaming boxes and sticks. You don’t need an account to view content, only to upload, save or comment on videos.

What is YouTube’s policy on safeguarding?

YouTube has what it calls ‘Community Guidelines’, a set of rules that it expects its users to abide by. These are laid out on its Policies and Safety page, with a header pointing out that when you use YouTube you are joining a worldwide community. They emphasise that using the site involves a level of trust and comes with certain responsibilities, and urge users to “try to respect the rules in the spirit in which they were created”. It is YouTube’s policy that children under 13 are not allowed to set up a YouTube account. The guidelines page breaks down what YouTube sees as the key safety areas. These are:

  • Nudity or sexual content.
  • Age-restricted content (videos that contain sexual content such as nudity or dramatised sexual content, but are not pornographic).
  • Harmful or dangerous content (videos that encourage dangerous or illegal activities, including instructional bomb making, choking games, hard drug use or other acts where serious injury may result).
  • Hateful content (videos promoting or condoning violence against individuals or groups based on race or ethnic origin, religion, disability, gender, age, nationality, veteran status or sexual orientation/gender identity, or whose primary purpose is inciting hatred on the basis of these core characteristics).
  • Violent or graphic content.
  • Terrorist content (content that incites violence, celebrates terrorist attacks or promotes acts of terrorism).
  • Harassment and cyberbullying.
  • Spam, misleading metadata (the additional descriptive information provided with a video) and scams.
  • Threats (predatory behaviour, stalking, threats, harassment, intimidation, invasion of privacy, etc.).
  • Impersonation (impersonating a channel or individual).
  • Child endangerment.

Each area is broken down into subsections covering the topic in question. We looked at just one – ‘Nudity or sexual content’. The first statement in this section is: “YouTube is not for pornography or sexually explicit content. If this describes your video, even if it’s a video of yourself, don’t post it on YouTube.” They then outline what is and isn’t allowed on the site, and what might therefore be removed. A caveat states that videos containing non-graphic nudity and other sexual content may be allowed if their primary purpose is educational, documentary, scientific or artistic. Similar provisos appear throughout the rest of the safeguarding headings.

Further down the page, under the heading ‘Child Endangerment’, they cover the types of activity that they consider sexualise minors and make it clear that any such videos will “immediately result in an account termination”. They also state that they work closely with law enforcement agencies and will report child exploitation content. This is followed by a message that if a user believes a child is in imminent danger, they should contact their local law enforcement agency and report the situation.

Next to the Community Guidelines page sits the Safety Tools & Resources page. This covers topics such as ‘Teen Safety’ and ‘Suicide and self-injury’, and also has parent and educator resource pages. Each page contains tools and tips for staying safe on the site. For example, the Teen Safety page provides the following advice to teens:

  • Know what type of content to film: When filming videos of your friends, classmates, or other minors, remember that they should never be sexually suggestive, violent, or dangerous.
  • Remember “The Grandma Rule”: Is what you’re filming or posting something you’d want your grandmother, boss, future employer, parents, or future in-laws to see? If not, it’s probably not a great idea to post it. Once a video has been posted online, you never know who might see it. If it’s copied or reposted, you might not be able to remove every copy.
  • Prevent dangerous or uncomfortable situations: Don’t post something just because someone else asked you to. Also, please don’t try to meet anyone you have “met” online without consulting with a trusted adult first.

This is sound advice, and in keeping with online safeguarding advice from various UK charities and police forces. YouTube’s guidelines on the various safeguarding areas appear concise, if sometimes a little woolly in places. For example, we would have liked to have seen the words “DON’T EVER meet anyone you have met online without consulting with a trusted adult first” rather than YouTube’s “please don’t try to meet anyone”.

For us in the UK, though, the organisations and contact details of the support networks that appear on these pages have limited value, for all the information provided is US-based. We looked at the Suicide and Self-Injury page. Despite accessing it through the YouTube GB site, the page signposts anyone who feels suicidal to the US National Suicide Prevention Lifeline, giving a long US-based number with no international dialling code. Another organisation listed is Crisis Text Line, where anyone with suicidal thoughts can text the message ‘WELL’ to a number to begin a text conversation with a trained crisis counsellor. Again, the issue is that this number can only be used within the USA. Neither is therefore particularly useful to a suicidal young person anywhere in the UK. The websites listed are better, with www.befrienders.org at least being an international site with a list of UK support organisations.

Whilst on the subject of suicide, we typed “how to commit suicide” into the YouTube search engine. Whilst most results on the first few pages were videos aimed at suicide prevention, we found many videos that were unnecessary and unpleasant. There were numerous clips showing different people attempting to throw themselves off tall buildings whilst emergency services or members of the public attempted to dissuade them. These videos were filmed by onlookers on mobile phones, and many people would find them distressing. Other videos gave tips on the best suicide methods. Another was called “how to tie a noose in 60 seconds”, and although there was no reference to suicide in it, it is telling that it was returned in the top 10 hits of our search. Also in the top 10 was a video entitled “how to kill yourself (Tutorial)”, a 21-second collage of pictures set to music, with a man singing about plugging in a toaster and putting it in the bath, inserting a fork into an electrical socket (the cover shot for the video) and starting your car in a locked garage. There was certainly nothing educational or artistic about this video.

Within the Safety Tools & Resources page is an important link to information about ‘Privacy and Settings’. This leads to a page explaining YouTube’s privacy features and how users can protect their privacy. The settings available are:

  • Public – videos and playlists that can be viewed by, and shared with, anyone.
  • Private – videos and playlists that can only be seen by the user and the other users they choose to share them with.
  • Unlisted – videos and playlists that can be seen and shared by anyone with the video’s URL link. This differs from private videos in that the people the video is shared with don’t need a Google Account to view it.

Other options include ‘Restricted Mode’, which allows the user to “screen out” adult content, and a ‘Parental Setup’ service, which allows an adult to download an app that restricts what a child can search for. There is also a timer function that allows parents to limit screen time by stopping the app after a set period. The problem with all these functions is that they are not default settings and have to be enabled.
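As an aside for readers who manage a channel’s uploads programmatically, the same three privacy values are exposed through the YouTube Data API. The snippet below is a minimal sketch under stated assumptions, not a recommendation: it assumes google-api-python-client is installed, that OAuth credentials (the placeholder `creds`) have already been obtained, and uses a placeholder video ID.

```python
# Minimal sketch: setting a video's privacy status via the YouTube Data API v3.
# Assumptions: google-api-python-client is installed; "creds" is a pre-obtained
# OAuth2 credential with channel-management scope; "VIDEO_ID" is a placeholder.
from googleapiclient.discovery import build

youtube = build("youtube", "v3", credentials=creds)

# privacyStatus accepts the three values described above:
# "public", "private" or "unlisted".
youtube.videos().update(
    part="status",
    body={"id": "VIDEO_ID", "status": {"privacyStatus": "private"}},
).execute()
```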

The third page in the site’s Policy and Safety section is the ‘Reporting & Enforcement’ page. This provides guidance on how to flag and report videos containing inappropriate content, and hosts the reporting tool itself. Two methods are provided: 1) flagging a video of concern so that it is reviewed by YouTube staff, who determine whether it has broken the site’s Community Guidelines; and 2) making an online report, particularly where there is abuse or sexually inappropriate content involving children, or where multiple videos or a specific account are involved.
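Flagging can also be done programmatically. The sketch below is illustrative only, assuming the YouTube Data API v3 with OAuth credentials (the placeholder `creds`) already in place; the video and reason IDs are placeholders.

```python
# Illustrative sketch: reporting a video through the YouTube Data API v3.
# Assumptions: "creds" is a pre-obtained OAuth2 credential; IDs are placeholders.
from googleapiclient.discovery import build

youtube = build("youtube", "v3", credentials=creds)

# List the abuse-report reasons the API currently accepts...
reasons = youtube.videoAbuseReportReasons().list(part="snippet").execute()
for reason in reasons.get("items", []):
    print(reason["id"], reason["snippet"]["label"])

# ...then file a report against a specific video using one of those reason IDs.
youtube.videos().reportAbuse(
    body={
        "videoId": "VIDEO_ID",    # placeholder
        "reasonId": "REASON_ID",  # chosen from the list above
        "comments": "Flagged for review: possible child-endangerment content.",
    }
).execute()
```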

Our assessment of YouTube’s Policy and Safety pages is that they are reasonably in-depth, even though they appear to be geared towards an American audience. We also take issue with the fact that none of this advice is front and centre – there is not even a warning on the front page that visitors to the site may come across videos containing adult content. The safety pages are accessed through a small, unassuming link at the bottom of a sidebar on the main page. Apart from those issues, however, the guidance and policies don’t seem too bad. So, with robust guidelines, a flagging system that ensures YouTube employees are available 24/7 to review flagged videos, and a reporting tool for videos that raise serious concern, why exactly did they face so much criticism in 2017?

What went wrong in 2017?

It is fair to say that in the UK the subjects of child sexual exploitation and online safety have, quite rightly, been pushed to the safeguarding fore in the last few years. Safeguarding professionals have become more in tune with the dangers that social media platforms pose to young people, but so too have the UK press and online news providers, who see chinks in YouTube’s safety record as newsworthy stories. Not that YouTube is being picked on: in the week we started writing this article, the UK news media ran safeguarding stories on Facebook and the Google Play Store, the latter after 60 of its game apps were infected with malicious ‘AdultSwine’ code, which inserted pornographic adverts into apps. Regardless of the company involved, these are stories of interest to the public, and in 2017 YouTube had its fair share of them.

Whilst most of the bad news stories gathered pace in the later stages of 2017, in April, following reports that government adverts had started to appear alongside videos containing extremism, the UK Government decided to temporarily withdraw its advertising campaigns. Government would not be the only high-profile organisation to voice its disquiet. By November, Mars, The Guardian, Lidl, Adidas, Hewlett-Packard, TalkTalk, BT, Sky and Now TV had all withdrawn adverts from the platform, following concerns that their clips were appearing next to videos used by online predators targeting and exploiting children. This is no surprise, because many predators have become adept at using social media and online tools to exploit and abuse children, and to share that abuse with like-minded others. Whilst some predators use YouTube to exchange abusive pornographic images of children, many videos are non-pornographic but contain highly sexualised and suggestive images of children. There is also a growing trend of posting obscene and sexually explicit comments on innocent videos posted by children.

As well as sexual predators, there is no doubt that YouTube is being used by criminal gangs to recruit vulnerable young people. Whilst many gangs use the platform to glamorise their gangsta-style lifestyle, organised criminal networks are also beginning to use it to recruit and groom young people for County Lines drug supply. In August, the Mayor of London, Sadiq Khan, called on YouTube to work harder to remove videos showing gang-related violence.

Then came the scandal around Peppa Pig – or, to give it the name coined in the US and online, ‘Elsagate’, after Elsa, a character in Disney’s 2013 film Frozen. Elsagate refers to family- and child-orientated videos on YouTube and YouTube Kids featuring children’s characters altered to contain violent, sexual or other adult themes. Elsa videos were a popular target, whilst other characters included Spiderman, Peppa Pig and even Father Christmas.

Individual videos were not the only issue to cause YouTube problems; there were also problems with its channel service. Two notable cases involved the channels ‘Toy Freaks’ and ‘DaddyOFive’, both of which were eventually removed by YouTube. Toy Freaks, a channel with nearly 8.5 million subscribers, featured a man with his two daughters in weird and disturbing situations, such as him force-feeding his children, whilst DaddyOFive featured a man and his wife playing pranks on their children, some so extreme that it led to two of the children being placed in care.

What is clear is that young people like and want to follow individuals who blog through the medium of video, commonly known as vloggers. For many adults this will be an unfamiliar term, but for many young people, following a vlogger is a way of life. One of the UK’s higher-profile vloggers may be known to older readers through last year’s ‘I’m a Celebrity… Get Me Out of Here!’ series. The individual concerned was 23-year-old Jack Maynard, who entered the jungle on the back of his YouTube channel and its 1.2 million subscribers. If you missed the first few programmes you might not know Jack, for he didn’t make the eviction stage of the competition. Those who watched from the start will know that Jack was asked to leave the show after media revelations that he had previously posted racist and homophobic comments on Twitter. It is also reported that when he was 16 he had posted an inappropriate tweet about rape and had conversed with a 14-year-old girl, asking her to send him a photo of her breasts. Whilst all these incidents relate to Twitter, is he really a suitable role model to be vlogging to over a million people, the majority of his audience being teenage girls?

One of the issues YouTube has is that it has little control over bad and stupid behaviour by users who set up their own channels. Toy Freaks and DaddyOFive are examples of channels set up for a specific purpose, e.g. pranks and shock value, but in Jack Maynard’s case the channel is based mainly on his own activities (vlogs) and the challenges he sets his viewers. It is impossible for YouTube to prevent one of its stars posting a video that suddenly crosses the line of moral decency, and that is why YouTube has already faced bad press and criticism in the early days of 2018.

On 31st December, Logan Paul, a vlogger and singer with more than 15 million subscribers, posted a video showing himself and his friends in Japan’s Aokigahara forest. They were there to film a piece called ‘haunted forest’, for Aokigahara is steeped in mythology as home to ‘the ghosts of the dead’. It is also, for this reason, one of the world’s most prevalent suicide locations. Whilst filming, Paul and his companions came across the recently deceased body of a man who had taken his own life by hanging. Although Paul notified the authorities of the discovery, he also posted the footage on YouTube. Within 24 hours the video had been viewed over 6 million times. The worldwide backlash was immediate, and after a lengthy silence YouTube was eventually forced to sever ties with the shamed vlogger. So how does YouTube respond to controversy like this?

YouTube and media relations

It often appears that YouTube is slow to react to issues, and only does so when prompted. In the case of Logan Paul, the criticism was immense and obvious within 48 hours, yet it still took the company 10 days to respond. Even the response was viewed as poor, with YouTube declaring that the video had violated its policies and guidelines but still failing to act itself; it was Paul who removed his own video, after the criticism began. Whilst YouTube eventually decided to axe Paul’s channels and the other projects it had with him, this didn’t happen until 10th January. Some would say they gave his actions, and the impact on others, careful consideration; others would say they dragged their feet and didn’t want to lose one of their most valuable stars. Their decision-making looks even more questionable given that this was the second time in less than two months that Paul had come in for criticism over his videos. At the end of November, he had released a contemporary music video whose lyrics and images were judged by many to promote the sexual objectification of women.

Putting aside the time it can take YouTube to react, the company does generally respond positively to criticism and to content that goes against its guidelines. In response to the controversy around the Elsagate-type videos, YouTube promised to clean up both its main platform and its sister channel ‘YouTube Kids’. The company then set about terminating channels and removing thousands of videos. It called this purge a “change to enforcement” and described it as an “evolving challenge”. It also announced that it had expanded its enforcement guidelines and brought in a policy to age-restrict videos that feature “kid-friendly characters dealing with mature themes”. It reminded us that these videos are not easy to identify, because perpetrators tag them in ways that evade YouTube’s inbuilt child-safety algorithms, but reassured us that it would use technology to assist with identification and quickly escalate any potentially harmful videos to its ‘human moderators’ (reviewers).
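YouTube has never published how its child-safety screening actually works, so the following is purely a toy illustration of the general problem: any filter that relies only on a video’s metadata can be defeated by dishonest labelling. All names and word lists below are invented for the example.

```python
# Toy illustration only: a naive metadata-based screen of the kind that
# mislabelled videos evade. The blocklist and function are invented for this
# example; YouTube's real systems are unpublished and far more sophisticated.

BLOCKLIST = {"violence", "knife", "gore", "attack"}  # invented example terms

def naive_screen(title: str, tags: list[str]) -> str:
    """Pass or hold a video for human review based only on its metadata."""
    words = set(title.lower().split()) | {t.lower() for t in tags}
    return "hold for review" if words & BLOCKLIST else "pass"

# An honestly labelled violent video is caught...
print(naive_screen("Knife attack compilation", ["violence"]))  # hold for review

# ...but identical content disguised with child-friendly metadata sails
# straight through, which is why review of the video itself remains essential.
print(naive_screen("Peppa Pig Full Episodes", ["kids", "cartoon", "peppa"]))  # pass
```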

YouTube also reacted positively to the criticism that videos featuring extremist and other potentially illegal content were being placed alongside advertisements from major brands and, in some cases, the UK Government. The company said it would review its advertising processes and introduce new rules, mainly around increasing the number of subscribers required before a channel or vlogger could earn advertising revenue. Some would argue that it had no choice but to look at the problem once major brands like Mars, Sky and M&S started withdrawing their custom.

Another positive step was announced by YouTube’s CEO, Susan Wojcicki, who pledged that in 2018 YouTube would increase its number of human moderators to over 10,000. However, the fact that this many moderators are needed gives us a clue to the scale of the issues facing those who have responsibility for keeping children safe online.

The trouble with YouTube and the challenges for safeguarding professionals

YouTube carries over 20 million videos for kids, plus nearly 800 million music videos, which means it offers a huge draw and incentive for young people to visit and use the site. As a safeguarding professional, or a person responsible for caring for a child, it is important to realise that whilst YouTube undoubtedly has a legal and moral obligation to ensure it has enforceable procedures and policies in place to protect young people, realistically the real safeguarding has to be done by those responsible for the care of the individual child. You simply cannot guarantee that young people who access the platform will not be exposed to harmful content, for the real trouble with YouTube is the enormous volume of material on the site. With between 300 and 400 hours of content uploaded every minute, the scale of the platform is simply too huge to police effectively. The flagging system used by YouTube is unequal to the task and creaking under the volume.
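A rough back-of-the-envelope calculation of our own, using the upload figures above and an assumed (purely illustrative) eight-hour real-time reviewing shift, shows why even 10,000 moderators cannot simply watch everything:

```python
# Back-of-the-envelope estimate: how many people would it take to watch
# everything uploaded to YouTube? The upload rate is the upper estimate
# quoted above; the eight-hour shift is our own illustrative assumption.
hours_uploaded_per_minute = 400
minutes_per_day = 24 * 60
hours_uploaded_per_day = hours_uploaded_per_minute * minutes_per_day  # 576,000

review_hours_per_moderator_per_day = 8  # one full shift of real-time viewing
moderators_needed = hours_uploaded_per_day / review_hours_per_moderator_per_day

print(f"{moderators_needed:,.0f} moderators needed per day")  # 72,000
```

On these assumptions, watching everything would take roughly 72,000 full-time viewers – about seven times the moderator workforce YouTube pledged for 2018 – which is why the platform leans so heavily on algorithms and user flagging.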

The Flaggers – whilst any user can flag a video that violates YouTube’s Community Guidelines, the company relies heavily on its ‘Trusted Flaggers’, a volunteer network embedded in its ‘Contributors Programme’. The programme offers perks to volunteers who get involved in answering users’ questions in the help forum, subtitling and adding captions to content, and reporting videos that violate the guidelines.

An online investigation by the BBC in August last year detailed how the flagger system had buckled under the huge volume, reporting a large backlog of flagged reports, with YouTube only able to deal with a small proportion of complaints. The BBC had spoken to trusted flaggers who highlighted serious issues not only in their own system but also in the public reporting system. It was suggested that there could be ‘up to 100,000 active predatory accounts on the site’ because the system was flawed. The accusation from one of these flaggers was that YouTube had “systematically failed to allocate the necessary resources, technology and labour” to combat the problem, and that, effectively, YouTube was a portal for sexual predators. Positively, it appears that when the flaggers’ reports are acted on, YouTube’s moderators tend to agree with the trusted flaggers and remove the video or channel. The problem, however, remains the volume of flagged reports that these human reviewers have to contend with.

The Human Reviewers – ordinarily, YouTube employees review the flagged content and assess whether it breaks the platform’s guidelines. If it is found to contravene the rules, they act to remove the inappropriate content. The company also uses human moderators and algorithms to rate and analyse content. These ‘raters’ are not there to flag or review content, but according to an investigation by the online news site BuzzFeed News, late last year raters were diverted from their normal tasks to support the human reviewers, such was the size of the backlog of reports requiring examination.

The problem remains that despite the algorithms, public flagging systems, Community Guidelines, trusted flaggers, moderators and human reviewers, the site is still home to inappropriate hidden content. Imagine that a child you are responsible for is with their friends, clowning around with a smartphone. They type ‘what is a creampie’ into YouTube – a term which, for those unfamiliar with it, is associated with pornography. Finding blatantly pornographic videos on YouTube is extremely difficult, so something must be working right there. When we carried out this search, the first few videos related to cooking recipes for cream pies or pie-in-the-face fights. However, YouTube is not completely off the hook, for further down the page was a video that wasn’t pornographic but featured a teenage boy asking his attractive middle-aged ‘mother’ what a creampie is. The language she used and her explanation to him were clearly and overtly sexual in nature and, in our view, presented a real concern, given that the boy looked young enough to be 14. In our view this video was an extract from a much longer adult video, the giveaway being the originating website’s logo in the bottom corner – Lethalhardcore.com. The video had been viewed more than twelve thousand times, and some of the comments from users about the ‘mother’ were overtly sexual in nature.

Alarmingly, the next (self-loading) video featured Holly Body, a provocatively dressed female adult-movie performer. Holly is asked by a male voice (off camera) what a creampie is and replies, whilst playing seductively with her low-cut top, “it’s where I let someone cum in my pussy and up my ass”. So here we have two ‘hidden’ videos, both sexual in nature, one suggesting an incestuous relationship between mother and son, the other containing explicit language. Where, we ask ourselves, is the artistic content in that, and why are these videos available to be viewed?

Carrying out the same search on the Google search engine (or any other, for that matter) returned adult material straight away. No recipes or pie fights here – straight into graphic pornographic videos. None of the pornographic content originated from YouTube, and the first YouTube video returned by the search engine was a couple of pages in: “how to make fried ice cream”. This is at least an indicator that YouTube is not the go-to place for pornography. There is undoubtedly hidden pornographic content on YouTube if you search hard enough, but why bother, when any child with free and unsupervised access to the internet can reach this material from a normal search engine? This doesn’t only apply to pornography, but to anything that might endanger the safety of a child: extremism, hate, and disturbing images of torture, war and suicide, to name a few.

So, if a child can readily access adult and offensive content elsewhere, does this mean YouTube was harshly treated last year? We say it wasn’t, because the company has a responsibility and an obligation to ensure it has the best possible processes in place to prevent young people from accessing inappropriate content. The evidence suggests that it has failed to do that, so it could be argued that the media criticism was justified. However, YouTube is not the only media site struggling with predators using and abusing online platforms to prey on the vulnerable. Social media giant Facebook has faced the same problems, as have many others. YouTube came to the fore because it is one of the ‘giants’ and the stories involved some very real and current issues, namely sexual and extremist exploitation. But as we have seen, a young person with access to the internet can easily find adult-themed and extreme content without ever visiting the site.

So it raises the question: is the real issue YouTube, or for that matter any other media provider? Or is the major safeguarding challenge really the World Wide Web itself? Any access to the internet can expose a young person to the depraved predators who are forever present within it. The web is a wonderful resource, one I could never have dreamed of in my childhood, but we need to be knowledgeable about its dangers, vigilant, and as social-media savvy as the young people we try to protect.
