Welcome to our first quarterly online safeguarding news update. For the last three months we have monitored the latest online safeguarding news stories, and in this article we provide a bite-size round-up of the most relevant ones. We hope you enjoy reading.

Late December – Thousands of online abusers in internet chat rooms. Towards the end of the year Chief Constable Simon Bailey, the national police lead for child protection, warned that there were thousands of child abusers in internet chat rooms. He quoted figures of 4,000 men found to be using a single online chat room that year (2017), saying that across Britain the number of online abusers ran into “tens of thousands”. He also said a lack of resources meant police could not tackle all offenders, and that officers had to prioritise the most serious threats posed by perpetrators. He called on internet companies to do more to help stop access to sexual abuse images and videos on their platforms.

Twitch has become too sexual – There were calls for the live video streaming service ‘Twitch’ to put its house in order over the ‘In Real Life’ (IRL) section of its streaming site. Twitch is an Amazon-owned company that focuses on broadcasting people playing computer games, and it also allows its users to broadcast their daily activities. It is estimated to have 15 million users per day. The problem is that the IRL section has become increasingly flooded with young women streaming overtly sexualised behaviour – revealing clothing, sexualised dancing and so on. This has prompted complaints to Twitch, which has taken action against some of its streamers. A site not suitable for children.

4th January – ‘Life in Likes’. The Children’s Commissioner produced a report, ‘Life in Likes’, exploring the effects of social media on the wellbeing of 8-to-12-year-olds. One of the key findings was that “many Year 7 children are finding social media hard to manage and becoming over-dependent on ‘likes’ and ‘comments’ for social validation” and “they are also adapting their offline behaviour to fit an online image and becoming increasingly anxious about ‘keeping up appearances’ as they get older”.

You can read the full report here: Life in ‘likes’ – Children’s Commissioner report into social media use among 8-12 year olds.

4th January – Hacking children’s smart toys. Following the Christmas period there were several news stories warning parents about the dangers posed by smart toys – in effect, any toy that can connect to the internet. The warnings advised that poorly secured gifts could be vulnerable to hacking. In Germany, the telecoms regulator banned the sale of smartwatches to children, as well as the “My Friend Cayla” toy, claiming that some smart toys could allow a child’s location to be tracked. The UK consumer group Which? also highlighted a number of toys that might be at risk – Furby Connect, I-Que Intelligent Robot, Toy-fi Teddy and CloudPets. The advice to parents – check that the device has built-in safeguarding tools.

15th January – Google apps feature pornographic adverts to children. Google was forced to remove 60 apps from its Play Store after many featured pornographic adverts targeting children. The apps had been infected with malicious code called “AdultSwine”. As well as displaying inappropriate ads, the code attempted to trick users into installing fake security apps and then registering for expensive services. Google acted quickly to remove the malicious apps and shut down the accounts of the developers.

29th January – Social media giants urged to do more to crack down on grooming. The NSPCC called for the government to force social media companies to do more to “flag up potential abuse”. The charity attacked the existing algorithmic systems used by large social media platforms to identify inappropriate and illegal content, stating that existing frameworks didn’t go far enough. The NSPCC said it wanted to see a mandatory code put in place across social networks to tackle grooming, calling for this to be overseen by an independent regulator. It criticised the government for failing to implement proposals made in a government-commissioned report 10 years earlier, ‘Safer Children in a Digital World’. The NSPCC claimed that 11 of the 38 proposals were ignored, seven were partially implemented and four were now out of date.

30th January – Warning that Facebook is not a place for children. More than 100 child health experts lent their names to an open letter to Facebook founder Mark Zuckerberg urging him to withdraw the app ‘Messenger Kids’. Aimed at children under 13, it is an app that follows the principles of Facebook Messenger but with added security and a requirement for parental approval before use. The experts called the app “irresponsible” and said that it was designed to encourage young people to use Facebook. You can decide for yourself and read the open letter here. Messenger Kids is currently available in the US, but not yet released in the UK.

6th February – Apple bans Telegram app after accusations that it featured child abuse images. Apple temporarily withdrew the messaging app ‘Telegram’ from its App Store after the discovery that sexual predators were using the app to share images of child sexual abuse. Telegram is an encrypted messaging app with a secret chat function in which messages delete themselves after they are sent. The app has been criticised by the government amid fears that it is used by criminals and extremists. If a young person who you care for has this app, then ask yourself why. Within hours of its removal the app was back in the App Store after the company apparently made some fixes to prevent the illegal content. Despite what Apple says, what the company actually did was enhance its preventative measures, so in our view there is no guarantee that this won’t happen again.

6th February – A warning to be vigilant over hidden conversations in the online game Roblox. Roblox featured a few times on the online news sites over the first quarter of the year. The first story came from a mother whose 7-year-old daughter was engaged in conversation by a much older person posing as a boy, who asked for the child’s telephone number. Roblox claims to be the world’s largest games platform. It has 30 million active users and is an online platform that hosts multiple games made and created by its users. There are all manner of games, and users can build their own worlds and games using the platform. Players can choose which games they want to play but, unlike Minecraft, cannot play as a single player or restrict games to friends only. This story throws up several safeguarding issues – the appropriate age to be gaming online, unsupervised gaming and so on.

9th February – The ‘Ikea 24 Hour Challenge’. Every now and again social media throws up a dangerous challenge. Most, like ‘Game 72’, are fictional; a few are real. This one involved encouraging people to hide in large shops and warehouses overnight and then, the following morning, sneak out undetected. It was named after Ikea because that store was targeted in various countries, with ‘players’ hiding in wardrobes. It was described in the media as an “online craze” with a “spate” of incidents. It wasn’t quite that, and we are not aware of many incidents in the UK.

13th February – Experts warn that YouTube’s dark side could affect children’s mental health. Mental health experts from the American Academy of Pediatrics warned that fear-inducing videos could affect brain development in young children, and advised parents to limit screen time. The story made reference to ‘Elsagate’, the issue affecting YouTube where well-known children’s videos are hijacked and inappropriate content is added. Children’s favourites targeted include Peppa Pig, PAW Patrol, Spiderman and characters from Frozen – hence the name Elsagate. The experts warned that repeatedly viewing inappropriate content can have an adverse effect on the developing brain. Hardly rocket science, but always worthy of a warning. A warning from us: these inappropriate videos have also appeared on ‘YouTube Kids’, not just the main YouTube app/site. You can read more about the safeguarding issues affecting YouTube in our article ‘The trouble with YouTube’.

14th February – The drug Xanax available to young people on social media. A BBC investigation reported that the prescription drug Xanax is being sold illegally to children via social media sites. Xanax (alprazolam) is a minor tranquiliser used to treat anxiety and must be prescribed by a medical professional in the UK, although it is widely used and available in the US. It has a number of potential side effects and is highly addictive. The BBC quoted the charity Addaction, which said that children as young as 13 had bought it online. The BBC identified adverts for the drug on Instagram and Facebook, and reported that several young people in Sussex had needed hospital treatment after taking Xanax. The Home Office promised action.

21st February – A warning that young people in the UK lack cyber-security awareness. A survey produced by the government’s Cyber Aware campaign identified that 52% of Britons aged 18-25 use the same online password for various services. The average was six services per password, although the report cited as many as 21 services under a single password. The story came with a warning that this makes it easier for online hackers and fraudsters to operate.
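To make the figure concrete, here is a minimal sketch (our illustration, not the campaign’s methodology, and the account list is entirely hypothetical) of how a “services per password” metric like the one quoted could be computed:

```python
# Hypothetical illustration of the "services per password" reuse metric.
from collections import Counter

# Each entry is (service, password) - made-up data for illustration only.
accounts = [
    ("email", "hunter2"),
    ("banking", "hunter2"),
    ("social", "hunter2"),
    ("shopping", "pa55word"),
    ("streaming", "pa55word"),
    ("work", "unique-Xk9!"),
]

reuse = Counter(pwd for _, pwd in accounts)            # services per password
avg_services_per_password = len(accounts) / len(reuse)
reused = [p for p, n in reuse.items() if n > 1]        # passwords on 2+ services

print(avg_services_per_password)  # 2.0
print(sorted(reused))             # ['hunter2', 'pa55word']
```

The point the campaign makes follows directly: once one service leaks `hunter2`, every other service sharing it is exposed.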

26th February – Sarahah app dropped following bullying accusations. An Australian woman, Katrina Collins, created a Change.org petition calling for the removal of the app Sarahah from Apple’s App Store and Google’s Play Store after discovering anonymous bullying messages her 13-year-old daughter had received. Sarahah is a messaging app which allows its users to send anonymous messages; anybody who has a user’s link can send that user an anonymous message – no account needed. Katrina’s petition soon gained close to 470,000 supporters and, to Apple and Google’s credit, they removed the app from their stores. Is it still on the phone of a young person you are responsible for?

26th February – “Social media companies aren’t doing enough to tackle cyberbullying”. Still on the theme of cyberbullying, the Children’s Society accused social media firms of failing to tackle the problem, stating that this risks the mental health of young people. Releasing its Safety Net report, the charity revealed that almost half of the young people (11-to-25-year-olds) surveyed had experienced threatening or nasty social media messages, emails or texts. You can read “Safety net: The impact of cyberbullying on children and young people’s mental health” here.

5th March – Facebook criticised for survey on child grooming rules. The media giant was heavily criticised for running a survey in which it asked users whether the company should decide if adult men could use the site to solicit sexual pictures from children. There were a number of questions, including: “In thinking about an ideal world where you set Facebook’s policies, how would you handle the following – a private message in which an adult man asks a 14-year-old girl for sexual pictures”. It wasn’t so much the questions that caused offence as the lack of options in the answer fields provided.

6th March – A third of victims of online child sex abuse are boys. Interpol and the charity ECPAT produced figures revealing that boys account for nearly a third of online child sex abuse images, often featuring the worst types of abuse. The story focused on boys because the figure was higher than expected. The figures were drawn from one million online images and videos. The report revealed that 64.8% of the unidentified victims were girls and 31.1% were boys, with 4.1% of the images featuring both boys and girls.

10th March – Culture Secretary backs time limits for children using social media. Matt Hancock said that more must be done to safeguard young people and suggested introducing an age-verification system and imposing time limits for children using social media. He described how age-verification would place a legal requirement on social media companies to ensure users were over 13 years old. Mr Hancock could not expand further, saying that details were still being worked out.

13th March – “Human error” at YouTube. In front of the Home Affairs Committee, Google blamed human error for a failure to remove four propaganda videos posted on its sister site YouTube by the banned UK neo-Nazi group National Action. Google said that whilst the videos had been flagged up by its systems, its review team made an error in deciding not to remove the material. The videos showed National Action speeches and demonstrations. Google assured the MPs that in future any videos featuring National Action supporters would be sent to specialist reviewers trained to recognise the group’s slogans. Remember, young people can see anything on YouTube, including those preaching hate.

15th March – Omega Labyrinth Z anime game banned in the UK. The PlayStation game Omega Labyrinth Z was banned from sale in the UK after being accused of promoting the sexualisation of children. The Video Standards Council refused to give it an age rating, without which it is illegal to sell the game in the UK. Set in a girls’ school, Omega Labyrinth Z involves putting female characters through a series of challenges as they search for a holy artefact in caves under the academy. It was the first game to be banned in the UK since 2008. Watch out for young people buying the game online from companies outside the UK.

16th March – Facebook in trouble again. Facebook was forced to apologise after a fault in its search function suggested that users might be interested in child sexual abuse videos. Some users typing in the search field were given a predictive option suggesting that they might be looking for videos of girls performing sex acts. Facebook said that as soon as it became aware of the offensive predictions it removed them.

Mid-March onwards – Facebook under the spotlight. In early March the Cambridge Analytica scandal broke. It was revealed that data Facebook held on 50 million people had been improperly accessed by a political consultancy, which allegedly used it to target voters and influence the 2016 US presidential election. There were also accusations that Facebook hosted stolen identities and social security numbers. Facebook shares plummeted and there were calls for Mark Zuckerberg to appear before MPs. Whilst the scandal was not necessarily a safeguarding matter, it did throw up the issue of how much personal data Facebook holds on its users – information that it then passes on to third parties.

This is nothing new, for Facebook has been sharing your personal information for years. You almost certainly agreed to allow this when you ticked the terms and conditions box – after all, how many of us actually read the T&Cs page? If you put your life out there on Facebook, there is a fair chance it will be sold on to commercial companies who will use it to target and manipulate you commercially. While you may not be personally bothered about this, have a think about the children and young people you are responsible for caring for or safeguarding. Do you need to check their privacy settings? For details on how to protect your privacy, Money Saving Expert Martin Lewis’s website has produced handy guides for Facebook, Instagram and Twitter. You can find the guide for Facebook here and Twitter and Instagram here.

Later in the month Facebook said it would overhaul its privacy tools and make it easier for people to find and edit the personal information the company holds.

23rd March – Craigslist drops dating ads after new law. In the US, Craigslist, the classified ads website, decided to close down its dating ads section following the introduction of a new US law, the ‘Allow States and Victims to Fight Online Sex Trafficking Act’ (FOSTA). This law effectively permits the prosecution of websites that allow adverts that are a cover for prostitution, sex trafficking and child abuse. Craigslist has had a problem with this in the past, and the company decided it was too much of a risk to keep its dating section open. Shortly afterwards the social network Reddit followed suit and dropped its escorts message board. Warning: the UK has no equivalent law, and Craigslist is available in the Apple and Google Play stores here in the UK.

26th March – Police Scotland reveal that nearly 25% of their registered sex offenders committed their crimes online. Police Scotland launched a campaign warning abusers that they would be found and arrested. Part of the campaign targeted perpetrators who use social media to offend, using the slogan “You’re one click away from losing everything.”

29th March – Stalking tool on WhatsApp. The month of March ended with WhatsApp being accused of enabling a new ‘spying’ tool. We haven’t had time to research this fully, but it appears that a new app called ChatWatch claims to be able to use WhatsApp’s public online/offline status to monitor other users’ activity. For a small weekly fee, the app allows the user to monitor the numbers of two other contacts, e.g. a friend, family member, work colleague or associate. It alerts the user when the two numbers are active, monitors the users’ online timelines (not message content), compares the patterns and produces probability data as to when the two numbers are likely to talk, or have talked, to each other. For extra money you can apparently monitor up to 10 phones. We will let you decide who might use this app and for what purpose.
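To show why ‘online status’ alone can leak so much, here is a minimal sketch of the general technique such an app could use (this is our assumption about the approach, not ChatWatch’s actual method, and all timestamps are made up): record each contact’s online sessions, then measure how often the two sessions overlap.

```python
# Hypothetical illustration: correlating two contacts' public online status.
# Each session is (start, end) in minutes since midnight - invented data.
def overlap(a, b):
    """Minutes during which both intervals are simultaneously active."""
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

contact_a = [(600, 615), (780, 800), (1320, 1340)]  # contact A's online sessions
contact_b = [(605, 612), (900, 910), (1325, 1345)]  # contact B's online sessions

# Total simultaneous online time, and the share of A's online time it covers.
shared = sum(overlap(x, y) for x in contact_a for y in contact_b)
total_a = sum(end - start for start, end in contact_a)
print(shared, round(shared / total_a, 2))  # 22 0.4
```

A high overlap ratio across many days would be the crude ‘probability they are talking’ signal – which is exactly why metadata like online status deserves the same caution as message content.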

Thanks for reading


Safeguarding Hub

The Safeguarding Hub has been developed by Andy Passingham and Paul Maslin as a way of sharing information relating to safeguarding children and vulnerable adults. This website and the articles produced by Andy and Paul have been created in their own time outside of their current police roles.
