The Momo Challenge, a disturbing WhatsApp "suicide game" that recently went viral, is spreading alarm among parents.
There have been reports of children being invited to play the game online, with many lured via social media into carrying out dangerous and violent activities.
The game allegedly targets children and young people.
A UK children's charity, the National Society for the Prevention of Cruelty to Children, has accused YouTube of failing to remove clips depicting suicide from YouTube Kids. The charity also uncovered footage of a YouTuber demonstrating a suicide method in children's content.
Google, on behalf of its subsidiary YouTube, has said it is doing its best to censor such sensitive content, explaining that it relies on user flagging and smart-detection technology to identify it.
Parents have also been warned to be vigilant.
Recently, a children's cartoon on YouTube based on the game Splatoon was found to contain a clip by YouTube prankster Filthy Frank. About 4 minutes and 41 seconds into the video, a young man in glasses appears, telling children to 'end it'.
The video has since been taken down after being flagged by media outlets, and YouTube has been urged to scrutinize uploaded videos more closely in the future.
This is not the first such occurrence.
In 2017, episodes of the popular British cartoon Peppa Pig were edited to show distressing scenes involving s*x and violence.
Pedophiles have also been using the platform to prey on children by leaving suggestive comments on videos of children doing innocent physical activities.
Meanwhile, illegal images of child s****l abuse and stolen credit card data are being traded on encrypted apps such as Telegram.
YouTube Kids is an app from the video streaming service aimed at children, with parental controls and video filters.