A United Kingdom charity says Facebook-owned apps were used in more than 5,000 child grooming crimes in the country. Records show that UK police have logged more than 24 grooming crimes a week since 2017 in which the offenders used Facebook-owned platforms.
UK Charity Says Facebook Apps Are Used In Over 5,000 Child Grooming Crimes
A charity in the UK is pressuring the social media giant Facebook to disclose the contents of its internal research on incidents leading to child abuse. The demand comes after the charity found that UK police have recorded more than 5,000 (5,120, precisely) child grooming crimes on Facebook-owned sites and platforms since 2017.
The NSPCC, a children’s charity, on Monday published research showing that 53% of all online grooming crimes took place on Facebook-owned apps and websites, including WhatsApp and Instagram. The research also noted that the incidents amounted to 24 per week.
How the Information Was Obtained By the NSPCC
The NSPCC obtained the figures through freedom of information requests to police forces across England and Wales regarding the offense of sexual communication with a child, which has been defined in UK law since 2017.
Given the number of grooming cases that go unreported in the region, the charity said in a press release that it believes the figures are “just the tip of the iceberg.” The publication adds to the scrutiny Facebook is already facing: whistleblower Frances Haugen was due to give evidence to a UK parliamentary committee examining the nation’s online safety bill on the same day the research was released.
What Is the Online Safety Bill?
The online safety bill was previously known as the online harms bill. It is a major piece of legislation that would put the UK media watchdog Ofcom in charge of regulating social media platforms in the country in a bid to keep users safe. Ofcom would have the power to fine tech companies $25.3 million or 10% of their revenue if they fail to remove harmful and illegal content from their platforms.
Ofcom would also have the power to block websites and services, and senior managers at tech companies could face criminal charges if their companies fail to comply with the bill’s obligations. The publication of the NSPCC research also coincided with reports on a cache of leaked internal Facebook documents known as the Facebook Papers.
The NSPCC’s letter to CEO Mark Zuckerberg
Given Facebook’s popularity, the company is almost inevitably drawn into discussions about the misuse of digital platforms. The NSPCC and nearly 60 other global child protection organizations wrote last week to Facebook CEO Mark Zuckerberg. In the letter, they urged him to publish internal research on the number of abusers who may be using the company’s platforms to harm kids and teens.
NSPCC chief executive Peter Wanless said in a statement: “instead of scribing defensive blogs and setting their PR machine on journalists, (Facebook executive) Nick Clegg and Mark Zuckerberg must now publish all their research into how their platforms contribute to harm and sexual abuse and step up their efforts to fix their sites so they are safe for children.”
Facebook’s Response to the NSPCC
A Facebook spokeswoman said the company is committed to keeping people on its platforms safe. She also noted that the company has spent $13 billion in recent years on building safety tools. In a statement she said: “we’ve shared more information with researchers and academics than any other platform and we will find ways to allow external researchers more access to our data in a way that respects people’s privacy.”