BRUSSELS (dpa-AFX) – In the fight against child sexual abuse, the Internet could in future be subjected to far more screening. According to the bill presented on Wednesday, providers such as Google and Facebook could be required to use software to search their services for relevant material. In addition, an EU centre is to be established which, among other things, would provide the appropriate technology. “We will find you,” said EU Commissioner for Home Affairs Ylva Johansson.
The Internet is currently flooded with such material, and the problem is growing. According to the European Commission, 85 million photos and videos depicting child sexual abuse were reported worldwide in 2021; the number of unreported cases is thought to be far higher. The Internet Watch Foundation recorded a 64% increase in reports of confirmed child sexual abuse in 2021 compared with the previous year.
According to Johansson, the perpetrators are often people the child trusts. “And these crimes very often remain secret until the perpetrator publishes them on the Internet.” Often it is only the photos and videos that make criminal prosecution possible. The fact that images of serious child sexual abuse increasingly find their way onto the Internet is also due to a culture of exchange among criminals: in order to obtain child abuse material from other perpetrators, one condition may be to live-stream the rape of a child oneself.
Such extreme examples are just the tip of the iceberg. Johansson pointed to a Swedish study in which 80 percent of the 10-to-13-year-old girls surveyed said they had already received unwanted nude photos from unknown adults. “I think I have the vast majority of citizens on my side,” said the Swede, referring to her bill.
Specifically, this means that companies must analyse the risk that their services are used to disseminate abuse images or for so-called grooming, i.e. adults contacting minors with the intention of abusing them. If a serious risk is found, national authorities or courts can order that software automatically scan content and detect criminal material.
The technology used for this purpose should not be able to extract any information other than indications of the dissemination of abusive material, the bill says. The same applies to grooming. The software should also be designed to intrude on users’ privacy as little as possible.
The bill does not specify which technology is to be used. It is therefore also unclear how the scanning of web content would be implemented technically and whether, for example, message encryption could be circumvented. However, providers must in particular ensure that children cannot download applications that pose an increased grooming risk, and that abusive images are removed or blocked. Providers should also know whether an account belongs to a minor or an adult.
The European Parliament and the EU member states must now discuss the proposal and agree on a final version, so changes are still possible.
The first reactions were mixed. German Federal Interior Minister Nancy Faeser (SPD) welcomed the proposal. “With a clear legal basis, binding reporting channels and a new EU centre, we can significantly strengthen prevention and prosecution across the EU,” she said. “The fact that in future we will require companies to recognise and report child sexual abuse is an important and long overdue step in the fight against child abuse,” said Lena Düpont, home affairs policy spokeswoman for the CDU/CSU group in the European Parliament.
FDP MEP Moritz Körner, in turn, spoke of “Stasi 2.0”, fearing interference in citizens’ private sphere. Konstantin von Notz of the Greens criticised the fact that private companies could be required to systematically scan private text, image and video content: “There are serious doubts as to whether this is compatible with existing European and German fundamental rights and the case law of the ECJ.” SPD MEP Tiemo Wölken described the software intended to detect web content as a “horror filter”. The regulation, he said, tries to pretend that privacy and data protection are guaranteed. “The text is also impenetrable and confusing,” Wölken wrote on Twitter./mjm/DP/jha