A European Commission proposal could force tech companies to scan private messages for child sexual abuse material (CSAM) and evidence of grooming, even when those messages are supposed to be protected by end-to-end encryption.
Online services that receive “detection orders” under the pending European Union legislation would have “obligations concerning the detection, reporting, removal and blocking of known and new child sexual abuse material, as well as solicitation of children, regardless of the technology used in the online exchanges,” the proposal says. The plan calls end-to-end encryption an important security tool but essentially orders companies to break it by whatever technological means necessary:
In order to ensure the effectiveness of those measures, allow for tailored solutions, remain technologically neutral, and avoid circumvention of the detection obligations, those measures should be taken regardless of the technologies used by the providers concerned in connection to the provision of their services. Therefore, this Regulation leaves to the provider concerned the choice of the technologies to be operated to comply effectively with detection orders and should not be understood as incentivising or disincentivising the use of any given technology, provided that the technologies and accompanying measures meet the requirements of this Regulation.
That includes the use of end-to-end encryption technology, which is an important tool to guarantee the security and confidentiality of the communications of users, including those of children. When executing the detection order, providers should take all available safeguard measures to ensure that the technologies employed by them cannot be used by them or their employees for purposes other than compliance with this Regulation, nor by third parties, and thus to avoid undermining the security and confidentiality of the communications of users.
A questions-and-answers document describing the plan emphasizes the importance of scanning end-to-end encrypted messages. “NCMEC [National Center for Missing and Exploited Children] estimates that more than half of its CyberTipline reports will vanish with end-to-end encryption, leaving abuse undetected, unless providers take measures to protect children and their privacy also on end-to-end encrypted services,” it says.
“Do the impossible, you get to decide how”
“It really looks like the European Commission wants to cancel encryption,” said a post by Bits of Freedom, a Dutch digital rights foundation. The proposal “will force companies to monitor what people share with each other via chat apps like WhatsApp and platforms like Instagram,” Bits of Freedom policy adviser Rejo Zenger wrote. “If deemed necessary, platforms will be forced to delete information or report it to the authorities. Internet service providers can also be ordered to monitor their customers’ Internet traffic. But the Commission omits, quite cleverly, depending on where you’re standing, just how they should do so. Effectively [the] message for companies is: ‘Do the impossible, you get to decide how.'”
An EC announcement said the problem of CSAM has gotten out of hand and that the current “voluntary” system isn’t enough. “With 85 million pictures and videos depicting child sexual abuse reported worldwide in 2021 alone, and many more going unreported, child sexual abuse is pervasive,” the announcement said. “The COVID-19 pandemic has exacerbated the issue, with the Internet Watch Foundation noting a 64 percent increase in reports of confirmed child sexual abuse in 2021 compared to the previous year. The current system based on voluntary detection and reporting by companies has proven to be insufficient to adequately protect children.”
The proposal’s detection orders would be “issued by courts or independent national authorities,” the announcement said. A detection order would be “limited in time, targeting a specific type of content on a specific service,” and instruct the company receiving the order to scan “for known or new child sexual abuse material or grooming.” Grooming means “solicitation of children,” the announcement said.
Other parts of the proposal “require app stores to ensure that children cannot download apps that may expose them to a high risk of solicitation of children.” Additionally, “providers that have detected online child sexual abuse will have to report it to the EU Centre,” and “national authorities can issue removal orders if the child sexual abuse material is not swiftly taken down. Internet access providers will also be required to disable access to images and videos that cannot be taken down, e.g., because they are hosted outside the EU in non-cooperative jurisdictions.”
“War upon end-to-end encryption”
Scanning the content of private messages shouldn’t be possible with encryption that is truly end to end. As Proton Mail explains, “E2EE [end-to-end encryption] eliminates this possibility [of the provider decrypting user messages] because the service provider does not actually possess the decryption key. Because of this, E2EE is much stronger than standard encryption.”
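The property Proton Mail describes can be sketched in a few lines of Python. This is a toy illustration, not a real E2EE protocol: it uses a simple SHA-256-based XOR stream cipher (assumed here purely for demonstration) to show that a relay server, which only ever handles ciphertext and never the key, has nothing it can scan.

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy stream cipher: XOR data against a SHA-256-derived keystream.
    Illustrative only -- not a production encryption scheme."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        out.extend(block)
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

# The sender and recipient share a secret key; the server never sees it.
shared_key = secrets.token_bytes(32)

plaintext = b"meet at noon"
ciphertext = keystream_xor(shared_key, plaintext)

# The server can store and forward the ciphertext, but without the key
# it cannot recover the message content to scan it.
# The recipient, who does hold the key, decrypts normally:
recovered = keystream_xor(shared_key, ciphertext)
assert recovered == plaintext
```

The point of the sketch is structural: because decryption requires `shared_key`, and that key lives only on the endpoints, any server-side scanning obligation forces the provider to change this architecture rather than simply "look harder."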
The European proposal was criticized by security experts including Alec Muffett, a network security researcher who—among other things—led the team that added end-to-end encryption to Facebook Messenger. “In case you missed it, today is the day that the European Union declares war upon end-to-end encryption, and demands access to every person’s private messages on any platform in the name of protecting children,” Muffett wrote.
In 2018, Facebook explained “that end-to-end encryption is used in all WhatsApp conversations and can be opted into in Messenger. End-to-end encrypted messages are secured with a lock, and only the sender and recipient have the special key needed to unlock and read them. For added protection, every message you send has its own unique lock and key. No one can intercept the communications.”
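Facebook’s “unique lock and key” for every message can be sketched by deriving a fresh key per message from a long-term shared secret and a message counter. This is a simplified stand-in (using HMAC-SHA256 as the derivation function, an assumption for illustration) for the key-ratcheting that real messaging protocols use:

```python
import hashlib
import hmac

def per_message_key(shared_secret: bytes, counter: int) -> bytes:
    """Derive a distinct key for each message from a shared secret and a
    message counter, using HMAC-SHA256 as a simple key-derivation function.
    Real protocols add ratcheting so old keys can also be deleted."""
    return hmac.new(shared_secret, counter.to_bytes(8, "big"),
                    hashlib.sha256).digest()

# Stand-in for a secret the two endpoints have already negotiated.
secret = b"\x01" * 32

k0 = per_message_key(secret, 0)
k1 = per_message_key(secret, 1)

# Every message gets its own key; learning one message's key does not
# reveal the keys protecting other messages.
assert k0 != k1
```

Because each key is unique to one message, even a party who somehow obtains a single message key gains nothing against the rest of the conversation, which is part of what makes blanket scanning of such traffic incompatible with the design.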