The price of pseudonymity

Paul Bernal and Lawrence Serewicz both left such thought-provoking comments on my previous post that I thought it worth a quick follow-up blog, just to keep the topic ‘live’ for a little longer.

Paul pointed out the balance between the benefits that can come from allowing anonymity/pseudonymity, and the harm that can result from making both impossible. Lawrence examined some of the broader implications that balance can have for the relationship between the individual and the state.

Language is a delicate thing, and it occurred to me that politicians will often draw a very sharp line between something which they are prepared to say, and something almost identical which they are not. For instance, I’m pretty sure I have heard a politician say something along these lines:

“A certain level of anti-social activity online is the price we pay for living in a free society”

But I can’t remember ever having heard the following:

“A certain level of anti-social activity online is the price we pay for living in a free society, and it’s a price worth paying”

Politically, it seems to be unacceptable to acknowledge that any level of bad behaviour (or crime, come to that) is a price worth paying for the social benefit of not living in a police state… and yet no-one could plausibly say it’s not the case.

I wonder why that is?

Privacy, anonymity and perverse consequences

I very much doubt Andy Smith, of the UK Cabinet Office, thought that his remarks yesterday about disclosing fake personal details would generate quite the flurry of comment they did. But that’s the problem with the whole topic of online privacy and anonymity: it is fraught with unexpected and sometimes perverse consequences.

I won’t re-hash the background here; you can get a far better write-up from Alec Muffett’s excellent piece of re-contextualisation and analysis, here, and Joanna Geary’s article in the Guardian, here.

What I will do, though, is quote a participant at last week’s OECD Working Party on Information Security and Privacy:

“It’s not a matter of designing for privacy *or* security: the proper goal is to optimise for both”

That’s not to say, by any means, that it’s an easy problem. Far from it. Let me give a couple of examples of how challenging it can be to do this well:

1 – Perverse consequences

If you oblige people to prove that they are, for instance, under 18 in order to use a ‘safe online chat environment’ for young people, one foreseeable consequence is that you create a market for the malicious use of valid credentials. Now put yourself in the position of a young person with valid “under 18” credentials; are you really any safer now that a bully or child abuser has a strong incentive to bribe, threaten or cajole you into lending them your credentials?

2 – Is it OK to lie about your age?

Helen Goodman MP may, like Andy Smith, be wondering if she could perhaps have expressed herself better yesterday. Her line of reasoning appears to have been as follows:

“If you let people lie about their personal details online, you make it possible for them to lead potential victims towards abuse. Therefore it should be made illegal to give false personal details online.”

Unfortunately, whether or not the premise is true, the proposed solution doesn’t work. If you make it illegal to give false personal details online (and if, for the sake of argument, you are able to enforce such a law), you also expose individuals to risks of identity theft, fraud, reputational damage and, in some cases, physical harm – unless you can also create a legal and regulatory environment in which it is impossible for that data to be subsequently abused, either accidentally or on purpose. Since Helen Goodman must know that this, in turn, is also impossible, her line of reasoning looks increasingly foolish.

And let’s not forget the basics: a ‘date of birth’ disclosure, to a third party who has no means of verifying it, is an essentially unreliable self-asserted attribute. As a security mechanism, it is only made weaker if the information in question is known to multiple third parties.
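
To make the point concrete, here is a minimal, purely illustrative sketch (in Python, with made-up names – not any real service’s code or API) of why a ‘verify by date of birth’ check proves so little: it simply compares the claimed value against one the user asserted themselves, so anyone who has learned that detail elsewhere passes it just as easily as the genuine account holder.

```python
# Purely illustrative sketch: hypothetical names, not any real service's code.
from datetime import date

# The value "on file" was self-asserted at registration; the service has no
# authoritative source against which to verify it.
DOB_ON_FILE = date(1980, 1, 1)

def knowledge_check(claimed_dob: date) -> bool:
    """Passes for anyone who knows (or guesses) the stored date of birth."""
    return claimed_dob == DOB_ON_FILE

# The genuine user and an impersonator who has picked up the same detail from
# another site, a data breach or social media are indistinguishable here.
print(knowledge_check(date(1980, 1, 1)))  # True - but proves nothing about identity
```

The more parties hold the same detail, the more places it can leak from – which is why a widely shared ‘secret’ like a date of birth gets weaker, not stronger, with every additional disclosure.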

Unfortunately, the bottom line is that optimising for privacy and security is tough, and tough problems are never solved by soundbite. In this instance, it needs (at least) realism about policy objectives, clarity about what ‘authentication’ really is, and an understanding of what the internet means for personal privacy. Then, perhaps, we can start to close the reality gap between policy aspirations and social reality.

The UK and CCTV

Having just got back from a holiday, I looked through the various news feeds, Twitter traffic and so on, to see what bloggable stuff had been happening recently. There was no shortage. I could have written something about the winner of NIST’s 5-year competition to find a successor to the SHA-2 hashing algorithm – but that would have turned into a longish piece on risk analysis and mitigation… so perhaps another day. Or I could have picked out Facebook’s inexorable post-IPO slide towards selling your personal data – but you already know how I feel about Facebook… so perhaps another day.

No, for today I eventually settled for a quick comment on the UK’s new Protection of Freedoms Act 2012. Although this seems to have hit the news only insofar as it restricts the use of wheel-clamps on private land, the Act is a multi-headed beast including changes to the law on retention of biometric data, regulation of surveillance cameras, safeguarding of vulnerable groups, and provision for “certain convictions for buggery” to be struck from the record once spent. Yep, seriously. And no, I’m not going to make any reference to the legislative priorities of the coalition government and the anecdotal proclivities of Conservative or Liberal politicians. That would just be cheap and tawdry. So, perhaps another day…

On the face of it, I am glad to see this Act enter the statute books. I am in no doubt that our Freedoms are in desperate need of some legal Protection – the last government’s illiberal and disproportionate policies had a serious impact on civil liberties, and a swing in the opposite direction is long overdue. The coalition’s execution of its commitment to ‘roll back the database state’ has been lacklustre at best, and the threats to civil liberties have continued to grow apace (just think of all the measures that surrounded London’s hosting of the 2012 Olympics). The question is – is the Protection of Freedoms Act 2012 the answer?

The section of the Act I want to focus on is Part 2, Chapter 1 – Regulation of CCTV and other surveillance technology. Section 29 describes the government’s obligation to draw up a code of practice for the development and use of such systems. And about time too, you might think – as the only guidance up to now on this topic has been the Information Commissioner’s advice on how the Data Protection Act applies to CCTV systems. The ICO guidance applies – broadly speaking – to any deployer of CCTV. And that raises an interesting question about the Protection of Freedoms Act 2012: why does the Act seek only to regulate public sector bodies, and completely ignore the question of governance of commercially or privately operated surveillance systems? (After all, if it can legislate on private wheel-clamping, why not on private CCTV?).

The Act raises other questions, too. It calls for the appointment of a Surveillance Camera Commissioner, who is to be consulted by the Secretary of State when the latter is drawing up the code of practice. The Secretary of State must also consult the Information Commissioner (fair enough) and the Chief Surveillance Commissioner.

The who?

You could be forgiven for not knowing that the UK has a Chief Surveillance Commissioner. Certainly, the impact of that office on the proliferation and regulation of CCTV systems across the UK appears to be negligible. As it turns out, the Commissioner publishes an annual report, and has done so for the last five years; the report for 2010-2011 is online here. However, as you can see from the CSC’s website, the office is primarily concerned with the governance of covert surveillance (including informants and undercover officers)… so one can expect the CSC to have minimal effect on the commercial or non-law-enforcement deployment of CCTV systems. The Protection of Freedoms Act does not appear to change the CSC’s remit in any way – so it’s not clear to me how the roles of CSC and Surveillance Camera Commissioner will interact, if at all.

It’s also not clear to me whether the Protection of Freedoms Act covers the deployment of airport body-scanners. As the last few years have shown, these “nudi-scan” machines remain a contentious topic, with public concern over their safety and their practicality. If any public sector surveillance technology was ripe for a more open regime of supervision and accountability, it is surely these – and yet, they don’t appear in the Act at all, as far as I can see.

All in all, the Protection of Freedoms Act addresses some extremely important topics. I hope it is a step in the right direction, but I think it would be a big mistake for the legislators to sit back and congratulate themselves on the completion of a job well done, because the Act is neither flawless nor complete.