The Home Affairs Committee today, Monday 7 December 2020, voted in favour of the temporary continuation of certain scanning technologies for detecting online child sexual abuse. This vote was necessary because these activities could soon no longer be carried out without the consent of users, due to a change in the so-called Electronic Communications Code, which will apply from December. After the Commission presented its proposal for the temporary solution in September 2020, I, as the European Parliament’s rapporteur for the dossier, led the preparations for today’s vote. The next steps are now the so-called trilogues, the joint negotiations with the Commission and the Council. My statement:
Child sexual abuse is a horrible crime, and we have to get better at preventing it, at prosecuting the offenders and at assisting the survivors. This includes better preventing the dissemination of online child sexual abuse material. As lawmakers, it is our duty to assess and scrutinize all the facts and to respect the law when legislating for the most vulnerable in our societies. Unfortunately, the Commission failed to provide these facts when it presented, at the last minute, its proposal for a continuation of certain scanning technologies for detecting online child sexual abuse.
The Commission claims that this proposal is about the continuation of existing scanning technologies. This is certainly true for the use of so-called hashing technology: this technology has been in use for several years and allows for the identification of child sexual abuse material by comparing videos or pictures depicting children with a pre-defined set of digital signatures, or so-called hashes, identified mostly by a US-based private entity as child sexual abuse material. Once the US side has identified suspicious material, it forwards this to EU law enforcement authorities, which eventually have to decide whether the material constitutes child sexual abuse material. Numerous questions remain around this technology. These include what legal basis companies use when sending personal data – including the pictures of minors – to the USA, how long these data are stored there, who gets access to them, and what happens to a picture in the USA that is not child sexual abuse material but, for example, an innocent family picture. Yet the European Parliament does not wish to interrupt this practice, but urges instead that additional safeguards be provided for the use of this technology.
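The hash-matching described above can be sketched in highly simplified form. This is a minimal illustration only: real deployments use perceptual hashing (such as Microsoft’s PhotoDNA, which also matches slightly altered copies) rather than the exact cryptographic hash used here, and the `known_hashes` set below is a made-up placeholder, not real data.

```python
import hashlib

# Hypothetical placeholder for the pre-defined set of digital signatures
# ("hashes") supplied by a third party. The entry below is simply the
# SHA-256 digest of the bytes b"test", used here for illustration.
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def matches_known_material(data: bytes) -> bool:
    """Return True if the file's digest appears in the known-hash set."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in known_hashes
```

Note that a match is only a lead: as described above, a human reviewer at a law enforcement authority still has to decide whether the forwarded material actually constitutes child sexual abuse material.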
The more difficult part of the Commission proposal is certainly the idea of allowing for the use of so-called anti-grooming technology. Grooming, the solicitation of children for sexual purposes, is without question horrendous. However, in contrast to hashing technology, anti-grooming technologies read the full content of every message and every email of every user in order to detect suspicious patterns of behaviour that is not necessarily illegal under the law in all Member States. When asked, the Commission itself could not explain where in the EU the content of every message and every email of all users is read by companies, or how this would be legal under the EU Charter of Fundamental Rights. The protection of privacy is a fundamental right and a prerequisite for the protection of children as well, for example when it comes to communication between a victim of abuse and their doctor or their lawyer.
The Commission will have to do much better when presenting its proposal for a permanent legislative framework for the use of these technologies in the second quarter of 2021, starting with a thorough impact assessment, as it was already obliged to do for this interim solution. There has been enough time over the past two years to do this.
As for the interim solution, we now need to start trilogues as soon as possible in order to clarify open questions, for example on the indiscriminate reading of emails in the EU. We need to find a legally sound solution that is also capable of withstanding possible review by a court.