Facebook's Encryption Plan Will Hide Online Child Sexual Exploitation
As the world’s largest social media company – and the largest source of reported child sex abuse online – Facebook’s actions have a major impact on global child safety. A resubmitted shareholder resolution asks Facebook to report on the risk of increased child sexual exploitation should it implement its plan to offer end-to-end encryption on its platforms.
Online child sexual exploitation and child sexual abuse material (CSAM) are an escalating threat to children worldwide. The exponential growth of CSAM is directly tied to the growth of social media and the increasing number of children online. In 2020, there were more than 21.7 million reports of CSAM containing 65.4 million images and videos. More than 20.3 million of those reports – 94 percent – stemmed from Facebook and its platforms, including Messenger and Instagram. This represents an increase of 28 percent over Facebook’s nearly 17 million reports in 2019.
Facebook’s plan to apply end-to-end encryption to these platforms has set off a storm of controversy and criticism. Government agencies, law enforcement, and child protection organizations worldwide warn that it will cloak the actions of child predators, make children more vulnerable, and leave millions of CSAM incidents unreported. Law enforcement would be able to locate neither the victims appearing online nor the perpetrators. The National Center for Missing and Exploited Children (NCMEC) estimates that Facebook’s encryption plans could effectively make invisible 70 percent of the CSAM cases currently being detected and reported.
Monika Bickert, Facebook’s head of global policy management, testified at a recent hearing in the British House of Commons. In response to a question about how many CSAM cases would “disappear” if the company implements end-to-end encryption, she said, “I don’t know the answer to that. I would expect the numbers to go down. If content is being shared and we don’t have access to that content, if it’s content we cannot see then it’s content we cannot report.”
The proponents of the shareholder resolution are not opposed to encryption; rather, they believe Facebook should apply the technology in a way that does not expose children to additional threats from sexual predators. Everyone recognizes that privacy is important, but it should not come at the expense of unleashing a torrent of virtually undetectable CSAM on Facebook.
Facebook touts its leadership on this issue, yet its tools, content moderators, and AI have not kept child sex abuse imagery, live streaming, and videos off its unencrypted platforms. One can only imagine how much worse it will be when those channels “go blind” and mask the content from the company’s eyes.
Facebook highlights its work with law enforcement and NGOs but fails to acknowledge that law enforcement and NGOs are among its fiercest critics on how it has responded to this crisis. Facebook has also lobbied for the defeat of numerous bills that sought or currently seek to protect children from sexual abuse online.
If Facebook really wants to protect privacy, it can start by protecting the privacy of the most vulnerable – children.
Michael Passoff
CEO, Proxy Impact