Working with Hany Farid, an American university professor who specialises in the analysis of digital images, Microsoft announced PhotoDNA in 2009. It’s a smart tool. If still pictures of child sexual abuse material are uploaded or stored on your internet platform, PhotoDNA lets you find them quickly without human eyes having to look at them again. Companies can move swiftly to delete the images and report them to the police, starting a search for the child victim and the perpetrators.
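The core idea is simple to illustrate. PhotoDNA itself uses a proprietary, robust perceptual hash that survives resizing and re-encoding; the sketch below is a hypothetical stand-in using an exact cryptographic hash, just to show the principle: a platform keeps fingerprints of known illegal images and compares the fingerprint of each upload against them, without ever interpreting the content itself.

```python
import hashlib

# Hypothetical database of fingerprints of known illegal images.
# (Real systems use robust perceptual hashes, not cryptographic ones,
# so that altered copies of an image still match.)
known_hashes = {
    hashlib.sha256(b"known-abuse-image-bytes").hexdigest(),
}

def check_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known image.

    The content is never read, analysed or stored; only its
    fingerprint is compared. No match, no action.
    """
    fingerprint = hashlib.sha256(image_bytes).hexdigest()
    return fingerprint in known_hashes

# A harmless upload produces no match and leaves no trace.
check_upload(b"holiday-photo-bytes")        # no match
# A copy of a known image is flagged for referral.
check_upload(b"known-abuse-image-bytes")    # match
```

The names and byte strings above are illustrative only; the point is that the comparison is mechanical, which is why matching poses no threat to the privacy of ordinary users.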
Later, similar tools emerged which could do the same for videos. “Classifiers” also became available: these could spot stills or videos which had not yet been classified as illegal but were likely to be, and flag them for human review. Different companies also developed tools which could spot children being groomed for sex, being bullied, or threatening to commit an act of self-harm.
The beauty of all of the above is that online businesses worked out ways of setting these tools to work proactively. Last year NCMEC received reports of nearly 70 million pictures or videos of child sexual abuse. Last week, for the first time, NCMEC released new numbers showing how many of these were being uploaded from EU Member States: 3 million last year, and 2.3 million so far this year. A further 1,020 reports of grooming were also received. These led to action to save one or more children in Belgium, France, Germany, Hungary, the Netherlands and Poland.
Why does this matter? Because, unbelievably, there is a risk that on 20th December 2020 the EU may make all the tools illegal. How this happened is a story of bureaucratic incompetence, oversights and mistakes made by several different EU institutions.
A proposal has been tabled to preserve the status quo until the mess can be sorted out. Remarkably, this is being opposed by some Members of the European Parliament who see the tools as a threat to privacy.
A leading privacy advocate in this argument managed to discuss the issue without once mentioning the right to privacy of the children pictured in the images. Neither did he mention a child’s right to human dignity, which in this case entails getting images of their hurt and humiliation removed from public view as quickly and as fully as possible. Privacy laws were never meant to make it difficult to locate and delete pictures of a child being raped. They were not meant to make life easier for an offender grooming a child for sexual purposes. And the tools that are being used pose no threat at all to anybody’s privacy. They do not read, analyse, record or comprehend any of the messages. They scan looking for signs of criminal content or behaviour. No sign, no action: nothing happens. Think about sniffer dogs at an airport!
If the tools do detect an illegal image, it is referred to the appropriate authorities for further action. If they detect grooming behaviour or a potential new image, it is flagged to a human moderator to make sure the context of the message or image has been properly interpreted.
We have to stop the tools from being outlawed. To do that you have to let politicians know how you feel. Get as many people as possible to sign the petition shown below. And if you live in an EU Member State, please write to your Members of Parliament and your national Government. Important votes will take place in early December. Please act now.
Senior Advisor, ECPAT International