From suggesting that the BlackBerry Messenger service be shut down during the 2011 riots, to proposing that the Food Standards Agency could monitor your home internet usage, it's clear that Cameron, or those around him, have a limited appreciation of how the internet works.
Yesterday Cameron announced another clanger – default internet filtering for every household in the country, so that unless you tell your ISP otherwise, pornographic content would be blocked – or at least some porn, as Cameron was today embarrassingly forced to admit that the Sun's Page 3 topless models wouldn't be blocked.
So isn't it a good idea to at least try to do something? Well yes – parents can educate themselves about how to manage their children's net usage. But children often know more about technology and the internet than their parents, and would simply end up using a proxy or a VPN to bypass the filters (many children already do this to access Facebook in school). In the words of Graham Jones of the British Psychological Society: "The result will be that parents think their children are being protected when those technically proficient youngsters will have found a way around the filters".
How do we know this? We just have to look at how easily UK internet users can bypass the blocks to the Pirate Bay.
And we need to understand that this filtering isn't done by rooms full of diligent people carefully selecting which sites are good and bad – it's done by a series of algorithms which have to deal with billions of web pages, each with differing content. We know that the current filters in place on mobile networks block perfectly innocent websites – which can be damaging to businesses, and the blocks are hard to remove once in place – not to mention websites discussing issues around sex education or sexuality.
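To see why algorithmic filtering over-blocks, here is a deliberately simplified sketch. The keyword list is invented for illustration – no real ISP filter is this crude – but the failure mode is the same: a keyword-based filter cannot tell a sexual-health page, or even the word "Sussex", apart from the content it is meant to block.

```python
# A deliberately naive keyword-based content filter, of the kind the
# article describes. The keyword list is invented for illustration.
BLOCKED_KEYWORDS = {"porn", "sex", "breast"}

def is_blocked(page_text: str) -> bool:
    """Block any page whose text contains a forbidden keyword."""
    text = page_text.lower()
    return any(word in text for word in BLOCKED_KEYWORDS)

# Innocent pages are caught alongside the intended targets:
print(is_blocked("NHS advice on breast cancer screening"))  # True (over-blocked)
print(is_blocked("Visit Sussex this summer"))               # True ("sex" hides inside "Sussex")
print(is_blocked("Local gardening club newsletter"))        # False
```

Scale that toy logic up to billions of pages and the false positives – the blocked sexual-health charity, the blocked business with an unlucky name – become inevitable.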
Also tacked onto this is another sadly misguided policy to block certain searches related to images of child abuse. While on the surface this may sound like a good idea, in reality it makes absolutely no sense.
One of the first places that paedophile images were exchanged on the internet was on newsgroups and forums, where keywords would be used to identify these images – just as Cameron says is done on search engines. The moderators of these newsgroups and forums then blocked those terms, only for the paedophiles to start using other arbitrary words to tag and find their depraved content – keeping them one step ahead. We would simply see the same thing happen again on search engines. Australia had similar filters in place for five years before scrapping them in favour of a Cleanfeed-type system like the one we already have in the UK.
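The cat-and-mouse dynamic described above can be sketched in a few lines (the terms here are invented placeholders, not real search terms): the moment a term is blacklisted, users agree on a new arbitrary code word, and the filter is blind again.

```python
# Illustrative only: a search-term blacklist and the trivial evasion
# that defeats it. "badterm" and "sunflower37" are invented placeholders.
BLACKLIST = {"badterm"}

def search_allowed(query: str) -> bool:
    """Reject any query containing a blacklisted term."""
    return not any(term in query.lower() for term in BLACKLIST)

print(search_allowed("badterm pictures"))      # False - blocked as intended
print(search_allowed("sunflower37 pictures"))  # True - same content, new code word
```

The blacklist can only ever chase terms that are already known, which is exactly why the newsgroup moderators lost this game two decades ago.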
Moreover, restricting searches for the rest of the law-abiding population sets a very dangerous precedent. CEOP will soon cease to exist as an independent body, becoming part of the National Crime Agency, which will then own and administer the proposed filtering and blocking list. Could we next see searches related to D-notices blocked using this system? And there is clear precedent for scope creep: Cleanfeed was originally designed to block images of child abuse, but has been expanded to block copyright-infringing websites such as the Pirate Bay.
Even if this change gets through, it is unlikely to reduce the amount of online paedophilia; instead it would make these offenders harder to identify, by driving them to places where search engines cannot reach – such as peer-to-peer networks, where there are an estimated 50,000 offenders. Jim Gamble, the former head of CEOP, today said that Cameron's proposals would be "laughed at" by paedophiles.
What Cameron is offering is a false panacea, not backed by evidence, which will simply serve to divert attention (and resources) away from measures that would actually make a difference. I don't think these proposed changes will make children any safer – they may even do the opposite.
What is needed here is better education for parents, for households to have the choice to opt into filtering (not on by default), more resources for CEOP to catch the paedophiles they know about (last year only 192 offenders were caught using CEOP intelligence) and for Cameron to learn how the internet works.
* Paul Thompson is a London-based internet television & film specialist who has worked for several large media multinationals. He has also contributed to several areas of Liberal Democrat IT policy, in particular the Digital Economy Act and digital intellectual property issues, and has served as the digital intellectual property lead for their IT & Intellectual Property Policy Working Group.
15 Comments
Cameron is only after the easy headline. His advisers have probably told him he is wasting his breath, but why should he worry if he gets a good headline for sounding tough?
Just imagine you want to buy some nice underwear for your wife/girlfriend's birthday present, so you do an online search (why shouldn't you?) because M&S wasn't quite what you had in mind… just wait and see what you are offered! As Paul says, it's an impossible task to try to control porn.
Is Cameron really offering a “panacea”, or are you simply creating a straw man? Where exactly is Cameron, or anyone else, saying that the proposed filter will always work and is the only solution required?
Your argument is a bit like saying we should do away with pedestrian crossings because they give people a false sense of security when crossing the road.
This should not be an either/or case of filters vs education. There is a place for both. Children (and parents) need education, but parents need practical tools as well, and so long as those tools are voluntary, the civil liberty implications are nil. If you believe in the power of education, then you should see no problem in educating parents that they cannot simply rely on a porn filter to do their entire job for them.
I doubt anybody believes that a porn filter can be totally effective, but that’s not the point. Would such a filter significantly reduce the number of children routinely viewing hard core pornography? I think it would. It’s a limited tool, but it would still be a useful one to have.
Stuart, what Cameron is doing is offering a raft of measures which will do very little to keep children safe, but will restrict the rights of the many. He’s completely ignoring the elephant in the room of educating parents and more funding for CEOP, so I think describing what he’s doing as a false panacea isn’t that wide of the mark.
You see, the civil liberty implications aren't nil – that's why every civil liberties group has come out very strongly against this. Of course filters have their place, but offering them on a default basis and then saying this is some sort of cure-all is clearly wrong. It's like having a pedestrian crossing that only works for some people. I didn't say do away with filters; I said that they should not be offered on a default-on basis – they should be opt-in, so people don't walk blindly into a filtered world, and people should be aware that they can be easily bypassed by any net-savvy kid.
Well said Paul. The key here is education; non-biased education for kids /teenagers (and some adults apparently) on pornography and education for parents on how to control access to certain content in the home environment.
@Paul Thompson
“Stuart, what Cameron is doing is offering a raft of measures which will do very little to keep children safe, but will restrict the rights of the many. ”
Have you actually read the proposals? Nobody is having any of their rights diminished. They are still free to watch vulnerable adults being exploited for their own gratification at their will, all they have to do is opt-in.
“You see the civil liberty implications aren’t nil –”
Yes they are. Nil, Nil, Nil. Name a civil liberty organisation that’s had something negative to say about it then. I’ve just had a look at Liberty’s website and can’t see anything.
“for households to have the choice to opt into filtering”
You seem to have completely destroyed your own case by arguing that filtering is dangerous and ineffective, and then suggesting people should be able to opt in to this dangerous and ineffective filtering.
One of the most outrageously silly arguments surrounding this is that we shouldn’t bother trying because some kids will find a way around it. Perhaps we should apply the same logic to burglaries. Burglars will always find a way so therefore we shouldn’t bother with locks on our doors, alarms, anti-climb paint, etc – it just gives a false sense of security! Of course there needs to be a rigorous cost/benefit analysis of these proposals, but the idea that we should give up fighting against people who flout society’s conventions, whether it’s the right to personal property or the ability to prevent our kids witnessing violent material, is absurd.
@Julian – just before that you’ll see I called for better education for parents – that includes knowing that your kids can get round these filters if they want to. Some parents will want to have filters for their peace of mind, but they have to understand that they are not a panacea, and they won’t block all porn. The solution proposed by Cameron is worse because it would block porn for all members of the household, meaning that if the adults want to watch something “adult” they have to disable the filter for the whole household.
@Steve, we know that filters (such as those offered by ISPs and mobile operators right now) block things they shouldn't, such as sexual health sites and those supporting LGBT teenagers – here's a report you might find interesting – http://www.openrightsgroup.org/assets/files/pdfs/MobileCensorship-webwl.pdf
Businesses would get blocked for having the wrong set of keywords, and it would become almost impossible to regulate because of the billions of pages and search combinations involved. Our right to freely access information would be damaged.
As for civil liberties groups, it's still quite early, but some that I'm aware of are:
Big Brother Watch – Nick Pickles, the director, has criticised this, saying "Website blocking will end up getting very messy in court. Legit biz gets blocked, who pays for loss of earnings/reputational harm?"
The Open Rights Group have heavily criticised this – https://www.openrightsgroup.org/blog/
Padraig Reidy, of the Index on Censorship has said “Families should be able to choose if they want to opt in to censorship. If a filter is set up as a default then it can really restrict what people can see legitimately. Sites about sexual health, about sexuality and so on, will get caught up in the same filters as pornography. It will really restrict people’s experience on the web, including children’s.”
@Paul Thompson
Thanks for the list, however, I’m struggling to take any of them seriously.
Big Brother Watch – an organisation founded by the same chap who founded the TaxPayers' Alliance, and which currently has one of its leading articles defending a former member of the SAS who stole a pistol and 500 rounds of ammunition and then proceeded to keep them under his bed.
The Open Rights Group is, according to their website, "the UK's leading voice defending freedom of expression, privacy, innovation, creativity and consumer rights on the Internet". I'm hardly surprised that a group campaigning for a lack of regulation of the internet wants a lack of regulation of the internet.
Padraig Reidy doesn't make logical sense to me. He starts by arguing that people should have to opt in to the filtering, not opt out. However, his second sentence argues against the use of any filters, which has no relevance to whether people need to opt in or opt out. The phrase 'opt in to censorship' is spin – censorship, by definition, means that an authority is preventing you from seeing something. You cannot opt in to censorship. All those people denied access to Facebook in China aren't electing for it to be censored. Opting in or opting out are both a question of individual choice. Individual choice is not censorship.
@Steve
I think we'll have to agree to disagree about the civil liberty organisations – you can discredit pretty much any organisation if you want to take that angle. Although I will ask Liberty if they have a line on this. Given their opposition to the snoopers' charter, I wouldn't be surprised if they have something to say.
I think a key point you've made is "Opting-in or opting-out are both a question of individual choice. Individual choice is not censorship." – which would be quite true, if that were the case.
Have you ever installed any computer applications? Or have you ever watched someone technically illiterate install an application, or be taken through a "wizard"-style process? Most people just click "next" rapidly to get through the process as quickly as possible; others may not understand what is going on and simply click "next" regardless. For example, lots of free software now bundles a browser search bar as part of the installation process: you'll get a screen asking if you want to install this (pretty pointless) search bar, with the option ticked by default – and many people end up with said search bar because they clicked next too rapidly or didn't understand what was going on. It's for the latter reason that my father's computer has about three pointless search bars installed that he doesn't know where they came from or how to get rid of. I've even been caught out by this myself.
In the proposed user flow (which I've seen), options to filter out adult content are ticked by default. Just by clicking next you'll end up with filtering. Not everyone will bother to read what they are clicking next to, and not everyone will understand. Not everyone will understand why some pages aren't appearing, or what to do to get them to appear – therefore you have censorship that people didn't knowingly choose to be a part of, and this will affect more people than you think.
I cannot think of any previous occasion on which I have agreed with Paul Staines, however if it is true that he is suing Claire Perry for defamation, then it might be a first. I hope he drives her out of politics altogether. The woman is patronising, bossy, technically clueless and very, very dangerous.
How about a compromise? ISPs should have default-on controls for families with children. They get loads of info when they sign you up. They could just have a box that asks you if you have children. That would get most of the gain without annoying loads of adults.
I'm not convinced that educating parents really helps that much relative to an effective filter. Or do people that oppose filters think that only kids of poorly educated parents end up being sucked into this? I'm pretty happy to let my young kids use the internet as a tool, but I can't watch them every moment of the day. I'm sure that filters would have a negative impact on the revenues of Google et al, but other than that, what is the big issue when consenting adults can easily switch the filter off?
@Alistair
Yes, if only there was such a thing as an effective filter that was able to operate on the fly, sensitively and contextually identifying problematic material with zero false positives, seamlessly enabling access to material for those who need it whilst gently discouraging access for those who would probably just end up having nightmares. That’d be brilliant. Makes you wonder why we didn’t think of deploying such wonder-technologies instead of using clunky blacklists to police against abusive stuff in the first place. Makes you wonder why, when we have these brilliant technologies just waiting in the wings, we don’t throw out Google and use our magic filter instead for web search and content recommendation, because it sounds much better than any real-life technology currently available.
In real life, filters are basically just a list of no-go zones. Make them too extensive and intrusive or block stuff that reasonable individuals see as inoffensive and the result will be that the responsible adult switches the filter off. On a home PC, it is technically possible to give each family member tailored filters (Andrew is 3 years old, Bob is 7, Rachel is 13, granny is 73), but this is a tall order for the default monolithic blacklist on the ISP end. Technically, therefore, this sort of thing is what’s known as ‘not a very good solution’. It assumes that technology can do stuff it can’t; it ignores the many much more user-friendly alternative approaches; therefore, it would result in filters that will either be toothless and hence ignorable or over-eager and hence quickly disabled.
Governments know nothing about technology and less about requirements analysis and engineering. For some reason many apparently believe that despite these minor setbacks they should be placed in control of tailoring the user experience for 21 million households across the UK. Those who can’t… legislate.
Paul Thompson:
“Not everyone will understand why some pages aren’t appearing, or what to do to get them to appear therefore you have censorship that people didn’t knowingly choose to be a part of”
Nice try but that still does not amount to censorship. Besides, one could equally point out that under the current system, people who sign up to broadband accounts are unknowingly signing themselves and their families up to a limitless hard-core pornography library, which ought to offend you just as much. At least now it looks like users will be offered some kind of choice, and I thought liberals were all in favour of choice. (If the industry had ever bothered to offer this kind of choice itself through an opt-in system, then Cameron’s proposals would be much more difficult to get off the ground.)
daft h’a’porth:
You have a very low expectation of what is possible with technology. Every new product must be a constant source of wonder to you. There is no reason why filtering technology could not be massively improved if only the industry would apply a little more effort and resources to it.
Cameron is accused of offering a false panacea, but he hasn’t. He knows there are limitations and he has acknowledged that explicitly. The only people talking about panaceas are critics like yourself who would have us believe that unless the new filter works 100% accurately 100% of the time for 100% of users, then it “doesn’t work”. This is an unrealistically high threshold of success you are applying. Virtually nothing would work by those standards. (To compare and contrast with an unrelated example, note how every liberal in the country will tell you that community justice “works”, even though it only cuts reoffending rates by a tiny handful of percent, and even that only for certain types of offender.)
Governments may know nothing about technology, but on the other hand, many of the government’s critics within the technology industry seem to know nothing about such matters as parenting and ethics. That’s why government and the industry need to work together on this.