Google and Microsoft have agreed to censor internet search results to prevent users finding images of child sexual abuse – but on closer inspection this looks like a tweak to existing measures rather than a radical overhaul.

The search engine giants have compiled a list of up to 100,000 phrases for which results will be vetted to make sure none link to illegal images.

Users will instead get legal results (academic papers and counselling services, for example) and a message warning them about trying to find illegal content.

But there’s a logical quirk here: in order to work out whether to remove a link to a site, Google and Microsoft will have to look at the content of that site to see if it’s illegal.

But if they find illegal content, they should be shutting the site down anyway (for example, making a request via the Internet Watch Foundation to the company which is hosting the content on its computer servers).

If they do that, the site will not just disappear from search results; it will be removed from the internet entirely.

Any action to combat the spread of child sexual abuse images online is welcome, but this amendment to search is not a great leap forward.

There is also the issue of the so-called dark web – sites hidden from search engines and accessible only with special software.

Yet the FBI has had some success in cracking down on paedophile networks that use these sites, and the demise of the online drug market Silk Road has seriously dented confidence in their anonymity.

Follow Geoff White on Twitter @geoffwhite247