The thing no one should forget: most of Cameron’s new proposals to police the web for porn won’t work. They certainly won’t do much to catch the hard-core offenders.
Filters don’t work: they block too many harmless pages and miss too many targeted ones. The net is cast too wide for blanket bans to cover all porn, conflating the seedy-but-legal with the vile-and-illegal.
Pro-active choice conditions – where adults have to confirm whether they want to deactivate parental controls blocking adult content – are easily circumvented. Who knows better how to work the web than kids anyway? For them, half the thrill of the internet is in earning the tech savvy to use cool tools to transgress borders of musical taste, privacy and public-private naughtiness.
Worse, default censorship encourages parents to stop thinking about the problem, and gives Cameron an easy way to claim that “something is being done”.
True, Cameron’s pop-up ‘flash warnings’ might deter a few of the uncommittedly curious from accidental encounters with porn. They make some sense as TV-style “viewer discretion advised” notices – especially as the internet continues to morph into a broadcast service – but they will hardly deter the committed pervert.
Most ultra-illegal porn is exchanged person-to-person in the closed corners of the net, the ‘dark net’, where millions of other illegal transactions are made online.
Its denizens know what they are doing is illegal, can take technically simple steps to evade observation and certainly don’t need to search for it on the open web. So beating up on the search engines won’t work either.
Search companies and ISPs don’t want to take responsibility for the war on child porn, because it’s expensive, difficult and means potential legal liability if they fail.
They don’t expect results either. How effective was the ‘for sale to 18s only’ rule on Playboy magazine at keeping it out of the hands of pre-internet-era 13-year-olds? So they prefer to hold on to the idea of innocent until proven guilty.
For the rest of us, the basic principle is whether we should allow search engines to decide what we should and shouldn’t see. It is better that a court decides what is illegal online.
Search companies like Google already remove search-suggestion options for possibly illegal content, without automatically banning the search itself, but they will be expected to do more. Google is developing tools to tag illegal images so they can be spotted more quickly when reposted.
They can also expect more legal requests to take down links. But those requests need to come more strategically from the police and the Child Exploitation and Online Protection centre – targeting the pornographers, not the porn itself.
Cameron plans to give the police more power to investigate the ‘dark web’ – the closed parts of the internet where paedophiles share illegal images among themselves, peer-to-peer.
And technical fixes such as setting up a national database of child abuse images for the police and child protection agencies to use would not be a bad idea.
It’s potentially more practically useful, even if it won’t play as well with the Daily Mail as active censorship would. But the aim should be to follow the trail back to the initiators and, not least, the real victims. We need more effort to police the crime, not to police the internet.