Blocks just move child porn under the counter

Are search engines really at the front line in the fight against child pornography? Image: GoodNCrazy

Google and Microsoft have agreed to install filters on their search engines to prevent them being used to search for child abuse images. Some queries on Google and Bing will be blocked, while others will produce a warning message alongside filtered results.

It is not as if people need to be told. Child porn has been illegal for a long time in most of the world. The question to ask is why it still hasn’t been eradicated. The ease with which digital media can be stored, transmitted and copied is only part of the problem. Beyond technology, social deprivation, poverty, sex tourism and child trafficking all play a part and require larger, more complex solutions.

Hurdle, not barrier

In terms of effectiveness, the introduction of filters by search engines is similar to moving porn under the counter at a newsagent’s. It creates a useful hurdle to prevent people accidentally or gradually entering the territory but it will not stop a determined person from accessing the material if they really want to. In principle, it will sharpen the boundary between innocent and criminal behaviour but serious criminal behaviour will not be affected. Most of this activity takes place in parts of the internet that are not visible to search engines anyway.

There is an echo here of what happened with Part III of the Regulation of Investigatory Powers Act in the late 2000s. This UK law requires people to hand over encryption keys on demand and allows them to be jailed for up to two years if they fail to comply. Security experts noted at the time that it would have little impact on serious criminals using encryption intelligently. Indeed, the first person convicted under it was not a hardened criminal but someone with mental health issues.

To date, only a single conviction is known to have been made under this part of the act, against a man who refused to hand over the password for his encrypted computer to police investigating child porn. Experts suggest most serious online child porn activity takes place on the heavily encrypted and obscured “dark net”, so David Cameron’s announced intention to address this area next is very welcome. We can overlook his somewhat hyperbolic assertion that the work of tracking online child porn is comparable to that of the codebreakers of World War II.

The trade-off

The basic technological ideas behind these internet search filters are well known. Given the complexity and evolution of natural language, any such filter can only make an educated guess at whether a query is looking for child abuse. Sometimes it will wrongly place a search query in that category (a “false positive”), and sometimes it will fail to identify one (a “false negative”).

Any filtering technique involves a trade-off between these two kinds of error: reducing one increases the other. Too many false positives lead to inappropriate censorship; too many false negatives make the filter ineffectual. The middle category used by these child abuse filters, with warning messages and selected search results, alleviates this to a limited extent, although the search engine still has to choose between three categories instead of two.
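The trade-off can be made concrete with a toy sketch. Assume some upstream scorer has already rated each query between 0 (clearly innocent) and 1 (clearly abusive); the scorer, the example queries and the thresholds below are all invented for illustration. A filter is then just a choice of threshold, and moving that threshold trades one kind of error for the other:

```python
# Toy illustration only: the scores, labels and thresholds are invented.
# Each pair is (score from a hypothetical scorer, truly_abusive).
QUERIES = [
    (0.95, True), (0.80, True), (0.55, True), (0.40, True),     # abusive
    (0.60, False), (0.35, False), (0.20, False), (0.05, False)  # innocent
]

def error_counts(threshold):
    """False positives (innocent queries blocked) and false negatives
    (abusive queries missed) when blocking every score >= threshold."""
    fp = sum(1 for s, abusive in QUERIES if s >= threshold and not abusive)
    fn = sum(1 for s, abusive in QUERIES if s < threshold and abusive)
    return fp, fn

# A strict threshold misses nothing but over-blocks; a lenient one
# blocks nothing innocent but lets abusive queries through.
print(error_counts(0.30))  # strict:  2 false positives, 0 false negatives
print(error_counts(0.70))  # lenient: 0 false positives, 2 false negatives

def classify(score, block_at=0.70, warn_at=0.40):
    """A middle category: two thresholds instead of one, with a
    warning band between them."""
    if score >= block_at:
        return "blocked"
    if score >= warn_at:
        return "warning"
    return "shown"
```

The warning band softens the binary choice, but as the numbers above show, it does not remove it: the filter still has to draw lines somewhere, and every placement of those lines misclassifies some queries.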

David Cameron’s comments that the search engine providers had so far been “unable” to implement these kinds of filters are rather surprising, though. Last century’s search engines started by just looking for bits of text in web pages, but their business model these days relies crucially on being able to decide the relevance of a given web page to a search query.

Both Google and Microsoft have long had filtering technology in place to comply with the Chinese government’s censorship of internet search. Google operated compliantly in the Chinese market until 2010; Microsoft’s Bing has collaborated with the main Chinese search engine Baidu since 2011. They have clearly been reluctant to deploy it elsewhere, perhaps because blocking search terms, even for causes as laudable as tackling child abuse, raises questions for the future.

If this is indeed the first time such technologies are being rolled out in the UK, it is a landmark moment for internet freedom. Cameron may not be ready to acknowledge it, but after the Snowden revelations, many people will not trust the UK government not to try to extend censorship into other areas.