The US, UK, Canada, Australia and New Zealand have released guidelines for tech companies that want to step up their efforts in fighting child sexual abuse and exploitation online. The guidelines, or voluntary principles, were developed in consultation with leading companies such as Google, Microsoft, Facebook, Twitter, Snapchat and Roblox.
The amount of child sexual abuse imagery online is growing rapidly. In 2019, 69.1 million online photos and videos of child sexual abuse were reported in the US. Though some tech companies have created policies and procedures to stop these abusive materials from spreading on their platforms, the numbers show there is still room for improvement.
Tech companies could do much more to stop the millions of online photos and videos of children from being uploaded and shared on their platforms.
To stop the spread, tech companies and governments have released a set of voluntary international standards on stopping the distribution of child sexual abuse material and live-streamed abuse, and on better detecting cases of grooming. In the US, the Tech Coalition, a private-sector partnership dedicated to stopping online child sexual exploitation, supports the principles and will use them to raise awareness among its members. Domestic legislation called the EARN IT Act has also been introduced in the US; if adopted, it would remove the legal protections of tech companies that fail to act effectively to detect and take down online child sexual abuse material.
The guidelines say there is still work to be done
Though some of the larger companies already have systems in place for tracing and flagging existing images or videos that violate their policies, the new guidelines call on companies to become more efficient and increase their efforts.
Tech companies should increase their efforts to prevent child sexual abuse, such as the live-streaming of abuse, on their platforms and services.
The guidelines also ask companies to report all content related to sexual crimes against children to the authorities, something currently done unevenly across companies. In 2019, Facebook made more than 15 million reports to the US National CyberTipline, while Google made just 449,283, Dropbox 5,113, Grindr 13, and Craigslist 11.
Tech companies and encryption
A major topic the guidelines don't openly address is end-to-end encryption, a feature many social media and messaging apps already use to protect user privacy by encrypting messages so that only the sender and recipient can read them. On the one hand, encryption helps prevent hackers and criminals from illegally accessing user data. On the other, end-to-end encrypted environments create safe havens that allow offenders to sexually exploit children while strongly limiting the possibilities of detection.
Companies should take a more child-focused approach when they design functions to protect users’ privacy and prioritise the detection of illegal content, such as child sexual abuse material.
What do the guidelines say?
There are 11 principles in total. Recommended actions are grouped into themes that tackle different aspects of the problem across different platforms. Here is what a few of the guidelines encourage tech companies to do:
- Remove all known child sexual abuse material available on their platforms;
- Improve tools to find more child sexual abuse material and report those to authorities;
- Crack down on advertising that recruits or solicits children for exploitation and abuse on their platforms;
- Seek, identify and stop offenders from using live-streaming services to sexually exploit children;
- Take a global approach by sharing insights, data and best practices with other companies.
Fighting online child sexual exploitation is complex, but with strong collaboration between tech companies and authorities, a bigger impact can be made. The release of these voluntary principles is a step in the right direction.