The Finnish Presidency has recently amended the proposed draft of the ePR with regard to the processing of data for the purpose of detecting child abuse imagery. The amendment responds to strong disagreement with the previous proposal, voiced in a letter sent jointly by over 50 children's rights NGOs directly to Jean-Claude Juncker.

The previous proposal contained in Article 6 was written in a way that would de facto have prevented specialized software tools that track potential child abuse from operating. These tools are used extensively around the EU; sources claim that Microsoft's PhotoDNA, for example, has already helped to identify over 8.7 million illegal images containing child nudity. The tools rely on hashing technologies that allow content to be matched quickly against databases of known illegal material.
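The matching principle can be illustrated with a minimal sketch. Note the assumptions: real systems such as PhotoDNA use proprietary perceptual hashes that tolerate resizing and re-encoding, whereas this example substitutes an ordinary cryptographic hash purely to show the fast database-lookup step; the hash database and function names are hypothetical.

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# (PhotoDNA uses proprietary perceptual hashing; SHA-256 is a
# simplified stand-in used here only to illustrate the lookup.)
KNOWN_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def image_hash(image_bytes: bytes) -> str:
    """Return a hex digest used as the lookup key into the database."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_image(image_bytes: bytes) -> bool:
    """Check uploaded content against the hash database in O(1) time."""
    return image_hash(image_bytes) in KNOWN_HASHES
```

Because the check is a set lookup rather than a scan of the images themselves, a provider can screen every upload proactively at negligible cost, which is exactly the capability the previous Article 6 wording would have foreclosed.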

Had the previous wording been adopted, the providers of such detection tools would no longer have been able to proactively detect child sexual abuse material and take corresponding measures. That would result in a situation where the images remain displayed on websites until a user happens to view them and decides to notify the administrator or the competent authorities.

The new proposal allows providers to continue processing electronic communications data for the purpose of locating and deleting material constituting child pornography, under the conditions set out in Article 29(3) of the ePR. What apparently went unnoticed is that Article 29(3) applies only to providers that are already detecting child pornography, and only for a limited (as yet undefined) period of time. The rationale behind these additional conditions, which one might rightfully frown upon, remains unexplained by the Finnish Presidency.

Contributor: Natalie Tumova

©2018 eprivacy tracker. Powered by PIERSTONE.
