We've covered the ridiculousness of the UK's "voluntary" web filters. UK officials have been pushing such things for years and finally got them through by focusing on stopping "pornography" (for the children, of course). While it quickly came out that the filters were blocking tons of legitimate content (as filters always do), the UK government wasted no time in talking about ways to expand what the filters covered.
The pattern is not hard to recognize, because it happens over and over again. Government officials find some absolute horror -- the kind of thing that no one will stand up for -- to push for some form of censorship. Few fight back because no one wants to be seen as standing up for something absolutely horrific online, or be seen as being against "family values." But, then, once the filters are in place, it becomes so easy both to ignore the fact that the filters don't work (and censor lots of legitimate content) and to constantly expand and expand and expand them. And people will have much less of a leg to stand on, because they didn't fight back at the beginning.
That appears to be happening at an astonishingly fast pace in the UK. Index on Censorship has a fantastic article discussing how a UK government official has already admitted to plans to expand the filter to cover "unsavoury" content rather than just "illegal" content.
James Brokenshire was giving an interview to the Financial Times last month about his role in the government’s online counter-extremism programme. Ministers are trying to figure out how to block content that’s illegal in the UK but hosted overseas. For a while the interview stayed on course. There was “more work to do” negotiating with internet service providers (ISPs), he said. And then, quite suddenly, he let the cat out of the bag. The internet firms would have to deal with “material that may not be illegal but certainly is unsavoury”, he said.

It goes on, in fairly great detail, to describe just how quickly the UK is sliding down that slippery slope of censorship. It highlights how these filters were kicked off as an "anti-porn" effort, where the details were left intentionally vague.
And there it was. The sneaking suspicion of free thinkers was confirmed. The government was no longer restricting itself to censoring web content which was illegal. It was going to start censoring content which it simply didn’t like.
But David Cameron positioned himself differently, by starting up an anti-porn crusade. It was an extremely effective manoeuvre. ISPs now suddenly faced the prospect of being made to look like apologists for the sexualisation of childhood.

And, of course, the fact that the filters go too far is never seen as a serious problem.
Or at least, that’s how it was sold. By the time Cameron had done a couple of breakfast shows, the precise subject of discussion was becoming difficult to establish. Was this about child abuse content? Or rape porn? Or ‘normal’ porn? It was increasingly hard to tell.
The filters went well beyond what Cameron had been talking about. Suddenly, sexual health sites had been blocked, as had domestic violence support sites, gay and lesbian sites, eating disorder sites, alcohol and smoking sites, ‘web forums’ and, most baffling of all, ‘esoteric material’. Childline, Refuge, Stonewall and the Samaritans were blocked, as was the site of Claire Perry, the Tory MP who led the call for the opt-in filtering. The software was unable to distinguish between her description of what children should be protected from and the things themselves.

But, of course, no one in the UK government seems to care. In fact, they're looking to expand the program. Because it was never about actually stopping porn. It was always about having a tool for mass censorship.
At the same time, the filtering software was failing to get at the sites it was supposed to be targeting. Under-blocking was at somewhere between 5% and 35%.
Children who were supposed to be protected from pornography were now being denied advice about sexual health. People trying to escape abuse were prevented from accessing websites which could offer support.
And something else curious was happening too: A reactionary view of human sexuality was taking over. Websites which dealt with breast feeding or fine art were being blocked. The male eye was winning: impressing the sense that the only function for the naked female body was sexual.
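To see how filters end up blocking breastfeeding advice, support charities or fine art while still letting actual porn through, it helps to picture how crude keyword matching behaves. The sketch below is purely hypothetical -- the term list, the function name and the matching logic are invented for illustration, and this is not how Cleanfeed or any ISP's actual filtering software is implemented -- but it shows why matching on surface terms, with no notion of context, treats a support or education page the same as the material the filter is aimed at, while missing material that avoids the listed terms entirely.

# Hypothetical sketch of a naive keyword filter (illustration only,
# not the actual software used by Cleanfeed or any UK ISP).
BLOCKED_TERMS = {"porn", "breast", "rape"}  # invented, illustrative term list

def is_blocked(page_text: str) -> bool:
    # Block any page whose text contains a listed term as a substring,
    # with no sense of context, intent, or who the page is meant to help.
    words = page_text.lower().split()
    return any(term in word for term in BLOCKED_TERMS for word in words)

print(is_blocked("Advice on breastfeeding your newborn"))             # True: over-blocked
print(is_blocked("Support for survivors of rape and abuse"))          # True: over-blocked
print(is_blocked("Explicit material that avoids every listed term"))  # False: under-blocked

Commercial filters are more sophisticated than this, but the examples quoted above -- Childline, Refuge, breastfeeding sites, Claire Perry's own campaign page -- suggest the same underlying failure: deciding on surface features rather than meaning.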
The list was supposed to be a collection of child abuse sites, which were automatically blocked via a system called Cleanfeed. But soon, criminally obscene material was added to it – a famously difficult benchmark to demonstrate in law. Then, in 2011, the Motion Picture Association started court proceedings to add a site indexing downloads of copyrighted material.

And it just keeps going on and on. As the report notes, "the possibilities for mission creep are extensive." You don't say. They also note that technologically clueless politicians love this because they can claim they're solving a hard problem when they're really doing no such thing (and really are just creating other problems at the same time):
There are no safeguards to stop the list being extended to include other types of sites.
This is not an ideal system. For a start, it involves blocking material which has not been found illegal in a court of law. The Crown Prosecution Service is tasked with saying whether a site reaches the criminal threshold. This is like coming to a ruling before the start of a trial. The CPS is not an arbiter of whether something is illegal. It is an arbiter, and not always a very good one, of whether there is a realistic chance of conviction.
As the IWF admits on its website, it is looking for potentially criminal activity – content can only be confirmed to be criminal by a court of law. This is the hinterland of legality, the grey area where momentum and secrecy count for more than a judge’s ruling.
There may have been court supervision in putting in place the blocking process itself but it is not present for individual cases. Record companies are requesting sites be taken down and it is happening. The sites are only being notified afterwards, are only able to make representations afterwards. The traditional course of justice has been turned on its head.
MPs like filtering software because it seems like a simple solution to a complex problem. It is simple. So simple it does not exist.

Of course, if you recognize that the continued expansion of such filters was likely the plan from the beginning, then everything is going according to plan. The fact that it doesn't solve any problems the public are dealing with is meaningless. It solves a problem that the politicians are dealing with: how to be able to say they've "done something" to "protect the children" while at the same time building up the tools and powers of the government to stifle any speech they don't like. To those folks, the system is working perfectly.