The fight against the dissemination of recordings of abused children must be stepped up, as the horrific cases in Bergisch Gladbach and Lügde, among others, have shown. Investigators are overwhelmed by the sheer volume of photos and videos. But because hardly any crime is as reviled as child abuse, anyone who criticizes individual measures against it makes themselves unpopular. It is time for the European Parliament and civil society to make themselves unpopular anyway and to oppose the European Commission's new plans. For the Commission is misusing the goal of catching perpetrators to cast a new surveillance net over everyone's computers and cell phones.
The new regulations would oblige technology companies to scan their users' private communication in emails and on chat platforms and apps such as Facebook, WhatsApp or Signal. They are supposed to identify incriminating images and report them to a new EU body; if they fail to do so, they face severe penalties. Tech companies already operate a surveillance apparatus that collects people's data trails in order to show them relevant ads. Now politicians want to bolt a state wiretapping system onto this commercial one.
Matthew Green, one of the world's leading encryption researchers, describes the plans as "the most sophisticated mass surveillance machine ever installed outside of China and the USSR." Even allowing for polemic, the project certainly goes beyond some of Snowden's 2013 revelations about the NSA's spying via tech companies. Back then, companies had to forward messages from specific email addresses to the intelligence services. In the EU, however, every device would now be turned into a machine for scanning everyone's messages.
Of course, automated, adaptive systems are to be used for this. The technique is in principle suited to finding images of abuse, but the consequences are unpredictable. Compulsory scanning would run up against the end-to-end encryption that secures WhatsApp chats, for example. The regulations would therefore force companies to undermine the best form of encryption available to date. That would not merely weaken it on perpetrators' phones; it would break it for all users. Absurdly, it is the EU itself that obliges companies, through rules like the General Data Protection Regulation, to protect their users' privacy. In any case, messengers and email are not the perpetrators' tools of choice: the scene uploads files to file-hosting sites or operates directly on the hard-to-access dark web.
In addition, the technology of recognizing images by their digital fingerprint is error-prone. Because it has no sense of an image's context and detail, it will produce hundreds of thousands of false positives. Before anyone notices, private and perfectly legal photos end up in the hands of an EU body. And who guarantees that the software can tell whether the person in a photo is still a minor or has just turned 18, and whether a crime is depicted at all? The new system would also give the agency a direct line into the intimate lives of young people.
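To see why fingerprint matching is error-prone, it helps to look at how such fingerprints work in principle. The following is a minimal sketch using a simple "average hash"; real systems such as PhotoDNA are far more elaborate, but they share the basic idea of reducing an image to a short bit pattern and comparing patterns by Hamming distance. The synthetic images and the hash itself are illustrative assumptions, not the Commission's actual technology.

```python
# Minimal sketch of an average-hash style "digital fingerprint".
# Assumption: images are 8x8 grayscale grids (real systems first
# downscale real photos to something similar before hashing).

def average_hash(pixels):
    """Reduce an 8x8 grayscale image to a 64-bit fingerprint:
    bit = 1 where the pixel is brighter than the image's mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(h1, h2):
    """Number of differing bits between two fingerprints."""
    return sum(a != b for a, b in zip(h1, h2))

# Two clearly different synthetic images: a soft gradient (0..70)
# and a harsh one (0..210)...
img_a = [[10 * r] * 8 for r in range(8)]
img_b = [[30 * r] * 8 for r in range(8)]

# ...which nevertheless yield the identical fingerprint, because only
# the above/below-mean *pattern* survives the reduction. Context and
# detail are exactly what the fingerprint throws away.
print(hamming(average_hash(img_a), average_hash(img_b)))  # prints 0
```

The point of the sketch: two images that look nothing alike can collapse to the same fingerprint, which is precisely the mechanism behind false positives at scale.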
The planned use of automated systems becomes even more absurd when it comes to so-called grooming, the way perpetrators approach future victims in chat. Companies are supposed to detect this automatically as well. Technology that recognizes such complex patterns in language without ensnaring innocent people simply does not exist. The EU is prescribing science fiction that will soon collide with a reality full of false suspicions.
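A back-of-the-envelope calculation shows why even a very good classifier would drown investigators in false suspicions. All the numbers below (message volume, prevalence, error rates) are illustrative assumptions, not figures from the proposal; the base-rate effect they demonstrate holds for any realistic values.

```python
# Base-rate sketch: when almost all scanned chats are innocent,
# even a 99.9%-specific classifier produces mostly false alarms.
# All figures are assumptions chosen for illustration.

daily_messages = 10_000_000_000  # assumed EU-wide daily message volume
grooming_rate = 1e-7             # assumed prevalence: 1 in 10 million messages
false_positive_rate = 0.001      # assumed: 0.1% of innocent messages flagged
true_positive_rate = 0.9         # assumed: 90% of actual cases detected

true_hits = daily_messages * grooming_rate * true_positive_rate
false_alarms = daily_messages * (1 - grooming_rate) * false_positive_rate
innocent_share = false_alarms / (false_alarms + true_hits)

print(f"true hits per day:    {true_hits:,.0f}")       # 900
print(f"false alarms per day: {false_alarms:,.0f}")    # ~10 million
print(f"share of flags that are innocent: {innocent_share:.2%}")
```

Under these assumptions, roughly ten million innocent conversations would be flagged for every nine hundred genuine cases; tightening the error rates shrinks the absolute numbers but not the overwhelming ratio.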
Automation by software will certainly change society, but when politicians try to automate away social problems, it fails. Apparently the Commission has let itself be convinced that software can substitute for investment in well-trained investigators supported by psychologists. Money for such authorities, plus technology that targets individual suspects, would be more effective than turning controversial corporations into deputy sheriffs.