The industry-funded watchdog for child sex images online will now have the power to seek out such content, rather than only following up leads from the public who report it.

The Internet Watch Foundation told Channel 4 News on Tuesday that this is the measure it wanted most – enabling it to take the fight against child abuse to those who upload the images themselves.

At the moment, it’s up to the police to seek out offenders proactively. In future, the IWF will be able to do the same, but it will still rely on the police, via the Child Exploitation and Online Protection Centre (Ceop), for permission to get content taken down.

Google, Facebook and Microsoft agreed the new approach for tackling online child abuse images, after a meeting with Culture Secretary Maria Miller, and Google has pledged more money to help the IWF.

However, with millions of abusive images now circulating on the internet, the risk is that today’s move will be seen as trying to stop a tidal wave by building sea defences a metre higher.

There are a number of problems the IWF’s new powers will not solve:

  • With only 20 staff currently, how much can the IWF realistically do?
  • The IWF cannot tackle so-called “peer to peer” sharing of images, by which owners of child sex abuse content share it directly. (That’s why, when you search for words related to child sex abuse, you see listings for file-sharing websites.)
  • Many offenders use the so-called “darkweb” (sites accessible only using special software and not indexed by search engines like Google). Child abuse images hidden on these sites are likely to be beyond the IWF’s grasp even with these new powers.
  • There is still no compulsion for companies like Facebook to prevent such images being uploaded. (The company does vet images currently, using Microsoft’s PhotoDNA software, but that’s a voluntary move.)
  • There is no compulsion for Google to strip from its search results sites which offer child sex abuse images. (Google has announced software similar to PhotoDNA, which will allow firms to share information on abusive images, but again, this is a voluntary move.)
  • None of this tackles children’s access to legal pornography, which is as easy to find now as it was before today’s meeting.

Follow @GeoffWhite247 on Twitter