31 May 2013

How to curb the spread of indecent images online

In the wake of two horrific cases of child murder, pressure is growing on internet companies such as Google to do more to crack down on indecent images online.

Mark Bridger, the killer of April Jones, was discovered to have child pornography on his computer. Stuart Hazell, who murdered Tia Sharp, also stockpiled images of child sexual abuse.

There is a sense that the internet’s chaotic and exponential growth makes preventing such behaviour almost impossible, and that it shortens the timeline between unhealthy interest and commission of actual crime.

In fact, child sexual abuse is one of the few things about which there’s almost universal condemnation online. Even in the moral vacuum of the computer hacker community, paedophilia is looked on with disdain.

Internet companies already work to stop the spread of these images. But the problem is that the system for doing so might have unintended consequences.

Take Facebook, for example: how on earth does a website which sees 2.5bn photos uploaded every month spot the tiny proportion that are images of child sexual abuse?

Under the current system, bodies such as the Internet Watch Foundation maintain databases of known abuse images. Websites such as Facebook can then automatically check each uploaded image against such a database for a match.
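
For readers curious about the mechanics, the matching boils down to comparing a "fingerprint" of each uploaded image against a list of fingerprints of known material. The sketch below is a simplified illustration in Python, not a description of how any particular company does it: it uses an exact cryptographic hash, whereas production systems such as Microsoft's PhotoDNA (which Facebook has used) rely on perceptual fingerprints that still match after an image has been resized or recompressed. The hash value and function names here are placeholders for illustration only.

    # Minimal sketch of matching uploads against a database of known images.
    # Real deployments use perceptual fingerprints that tolerate resizing and
    # recompression; an exact SHA-256 match, as here, only illustrates the idea.

    import hashlib

    # In practice this set would come from a body such as the Internet Watch
    # Foundation and hold fingerprints of known abuse images. Placeholder value.
    KNOWN_IMAGE_FINGERPRINTS = {
        "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
    }

    def fingerprint(image_bytes: bytes) -> str:
        """Return a hex digest acting as the image's fingerprint."""
        return hashlib.sha256(image_bytes).hexdigest()

    def is_known_image(image_bytes: bytes) -> bool:
        """Check an uploaded image against the database of known fingerprints."""
        return fingerprint(image_bytes) in KNOWN_IMAGE_FINGERPRINTS

    # At upload time, a site would run something along these lines:
    # if is_known_image(uploaded_file.read()):
    #     block_the_upload_and_report_it()

The weakness of an exact-hash approach, and the reason real systems prefer perceptual fingerprints, is that changing a single pixel produces an entirely different hash, so anything not already fingerprinted slips straight through.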

The immediate problem with this is the speed with which the internet grows – 150,000 new websites per day, each of which is a potential new hiding place for paedophile material.

But the deeper problem is that this system risks driving up the premium on the creation of new images of child sexual abuse. If a paedophile has access to an image that’s not on the database and therefore won’t be flagged, that image commands a higher value.

The hope is that, as systems for finding and logging child sexual abuse images get faster and more effective, the community attempting to share those images will shrink, countering this unintended inflationary effect.

There is another point to consider, one which spreads the responsibility beyond technology companies and policymakers.

During Bridger’s trial it emerged that he had also gathered images of local children from social networks such as Facebook. This compounded the community’s sense of violation, and it carries important lessons for all internet users.

The web is, at its heart, a publishing tool. By going online, you are going public. Any attempt to lock down content thereafter is simply tinkering at the edges.

Despite your best efforts to limit who can see a picture of your child, all it takes is a “right-click-save-picture” and that image is no longer under your control.

The mundane truth is that, just as internet companies need to work harder at preventing the spread of images of child sexual abuse, internet users must change their behaviour too.

Read the terms and conditions, understand and make use of websites’ privacy settings, and think twice before hitting the upload button.

Follow @GeoffWhite247 on Twitter