New tech will ‘revolutionise’ international approach to tackling online child sexual abuse
There has been no progress in agreeing a universal legal standard for classifying abuse, so developers at the IWF took a different approach.
The Internet Watch Foundation’s new IntelliGrade tool will “revolutionise” the international approach to tracking down and tackling child sexual abuse material on the internet. Online child sexual abuse is a global threat, but those fighting it have to deal with a web of different legal frameworks across the world. When an analyst in the UK classifies illegal material, they grade the content’s severity according to the UK’s Category A, B, or C system, with Category A being the most severe. The categories correspond to UK law and sentencing guidelines for child sexual abuse.
The images are classified and then “hashed” into a digital fingerprint, allowing them to be identified, blocked, and removed by the IWF’s partners around the world. Analysts in other countries do the same, but must grade the images according to the different legal standards of the country they are working in.
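The hashing-and-matching workflow described above can be sketched in a few lines. This is a simplified illustration only: it uses a cryptographic hash (SHA-256) as a stand-in fingerprint, whereas real systems like the IWF's typically use perceptual hashes (such as PhotoDNA) that still match after resizing or re-encoding; the function names are hypothetical.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Return a hex digest acting as the image's digital fingerprint.

    Illustrative stand-in: production systems use perceptual hashes
    that survive resizing and re-encoding, not cryptographic ones.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Partners receive a list of known hashes and block by membership.
known_hashes = {fingerprint(b"example-flagged-image")}

def should_block(image_bytes: bytes) -> bool:
    """True if this image's fingerprint is on the shared block list."""
    return fingerprint(image_bytes) in known_hashes
```

Sharing the hash list, rather than the images themselves, is what lets partners block known material without ever receiving or viewing it.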
Images and videos classified as criminal in one country may not meet another's thresholds, or may fall under a different classification altogether. This creates obstacles and wastes valuable time, as illegal content already identified cannot be easily passed on. While there have been calls to unify laws in different countries to make the process smoother, there has, so far, been no progress in agreeing a universal legal standard for classifying abuse. So developers at the IWF took a different approach.
The Internet Watch Foundation's breakthrough hash grading technology, dubbed IntelliGrade, now makes it possible for an image graded and classified in one country to fit seamlessly into another country's system, filling these gaps in legal harmonisation. The technology, conceived, developed, and built by the IWF, allows agencies in other countries to take hashes from the UK and apply them directly in their own systems, enabling swift and appropriate action without the need for additional assessment.
IntelliGrade makes use of augmented metadata entered by analysts, who record granular details of the abuse depicted in the material, as well as information about the victims and perpetrators. This allows it to automatically match images and videos to the rules and laws of Australia, Canada, New Zealand, the US, and the UK, known as the Five Eyes countries.
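In outline, this means one granular assessment can be projected onto several jurisdictions' classification schemes at once. The sketch below is a heavily simplified, hypothetical version: the field names, the mapping rules, and the second jurisdiction's label are all invented for illustration and do not reflect IntelliGrade's actual schema or any country's real legal criteria.

```python
from dataclasses import dataclass

# Hypothetical, heavily simplified metadata record; the real
# IntelliGrade schema captures far more granular detail.
@dataclass(frozen=True)
class AbuseMetadata:
    penetrative: bool
    sexual_activity: bool

def uk_category(m: AbuseMetadata) -> str:
    # Illustrative mapping to the UK's A/B/C severity scale;
    # not actual legal or sentencing guidance.
    if m.penetrative:
        return "A"
    if m.sexual_activity:
        return "B"
    return "C"

def second_jurisdiction(m: AbuseMetadata) -> str:
    # A hypothetical jurisdiction that uses a single statutory label.
    return "objectionable"

def classify_for_all(m: AbuseMetadata) -> dict[str, str]:
    """One analyst assessment, one label per partner jurisdiction."""
    return {"UK": uk_category(m), "XX": second_jurisdiction(m)}
```

The design point is that the granular metadata, not the legal label, is the shared record: each country's rules are just a function applied to it, so a hash graded once travels with the right classification everywhere.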
“This has the potential to have a huge impact on child safety. It means material which is known and classified in one country can instantly be known and classified for other countries, meaning more can be removed from the internet, more quickly. It will revolutionise the way this is approached.” - Susie Hargreaves OBE, Chief Executive of the IWF
The public is given this advice when making a report:
- Do report images and videos of child sexual abuse to the IWF to be removed. Reports to the IWF are anonymous.
- Do provide the exact URL where child sexual abuse images are located.
- Don’t report other harmful content – you can find details of other agencies to report to on the IWF’s website.
- Do report to the police if you are concerned about a child’s welfare.
- Do report only once for each web address – or URL. Repeat reporting of the same URL isn’t needed and wastes analysts’ time.
- Do report non-photographic visual depictions of the sexual abuse of children, such as computer-generated images. If such material is hosted in the UK, the IWF can get it removed.