UK DNA retention policy

Back in November, I blogged about the grudging way in which the current government appears to have reacted to the European Court of Human Rights’ unanimous verdict, over a year ago, on the retention of DNA samples of those who are arrested but not subsequently charged or found guilty.

The government would doubtless argue that this is a complex issue, in which the ECHR’s verdict must be balanced against the needs of law enforcement; that they have judiciously carried out a consultation exercise; that their proposals create a proportionate retention policy… and so on.

Unfortunately, as I noted in November, the ECHR already disagrees – based on the information which is in the public domain – and in March 2010 it will review the UK’s progress towards compliance with the judgement issued in November 2008.

Legally, then, the situation we have is this: the current policy (of indefinite retention at the discretion of Chief Constables) is reinforced by guidance from the Association of Chief Police Officers (ACPO) that such discretion should be exercised only under “exceptional circumstances”. In other words, the policy position is that the DNA of innocent people should normally be retained indefinitely. That, of course, is what the law still provides for, pending the passing of any new legislation which takes the ECHR’s 2008 ruling into account.

You might think that that leaves police forces in a clear position: the law is unchanged from before the ECHR ruling; ACPO guidance is that the default should be indefinite retention; until new legislation is introduced, the police have to enforce the current law.

Except that that doesn’t fit the observable facts – at least, according to the figures published in this BBC piece today. What it shows is that police forces across the country are responding to DNA deletion requests in ways which vary from “never” (0% of requests granted) to “almost always” (over 80% of requests granted). Of course, one should be wary of reading too much into as bare a set of statistics as those published in the article… For instance, most of the forces which have refused all requests have also had the lowest number of requests for deletion.

However, when I see forces with comparable volumes of requests reacting in widely different ways, the simplest interpretation is that some forces have a default policy of refusal (for instance, in the case of Nottingham and Sussex: 0/16 and 1/28 requests granted, respectively) and others have a default policy of granting (for instance, Cleveland and Cumbria, with 12/17 and 15/19 requests granted, respectively).

So, what conclusion would I draw, as we look forward to 2010 and the March ECHR review of UK policy in this area?

Well – the law as it stands is clear, and has been ruled to be disproportionate. Despite its clarity, it is equally obviously being applied in radically different ways by different police forces across the country. The Home Secretary’s proposals introduce – in the name of proportionality – a wider range of retention periods, depending on the offence committed (or not committed… the DNA of innocent people will still be retained under his proposals).

I can see no prospect that that will result in a more consistent or more uniform application of the law across the country. If anything, it seems bound to worsen the arbitrary inconsistencies which the current statistics appear to demonstrate.

An end-of-year update

As Future Identity completes its first year, it’s the obvious opportunity to do a quick round-up of some of the year’s highlights.

First, “collateral”… on the website (under Portfolio/Resources) you can find three white papers and a couple of slide decks, all on the theme of digital identity, privacy and related topics.

The most recent white paper is one I wrote for the ISTR (Information Security Technical Report); it looks at the difficulty Privacy Enhancing Technologies seem to have had in taking off, and suggests a “maturity model” for working out what the inhibitors might be. The version on the website is a pre-print copy; for the published one, you need to go to Elsevier (who hold the copyright of the version which went to print).

I’ve also got a book chapter coming out in “Financial Cryptography” early next year, on the more general topic of identity management. I’ll post again when that is published.

Second, lots of people kindly comment that I need to get out more ;^) so I have done my best to keep visible and spread the word. It’s ironic how visible you have to be if you want to be a privacy advocate.

It’s a little invidious to pick out specific events for mention, but I’m going to do so anyway. The ones which leap to mind from the last 12 months are:

  • the GENI workshop at UC Davis, California: many thanks to Chip Elliott and Matt Bishop for making it possible for me to attend that so early in Future Identity’s existence;
  • the last Liberty plenaries, in Santa Clara… but also the first Kantara plenaries in Las Vegas;
  • the Burton Group Catalyst conference in San Diego: thanks to Gerry, Bob and Ian for all their help and support with that – here’s to Catalyst EU 2010, in Prague in April;
  • the NetID conference in Berlin;
  • the EU e-Government conference in Malmö;
  • and the TERENA and JISC events in Rome and Cardiff… for being such a fun bunch of people to work with…

And third, the people. It’s been fascinating moving from the corporate environment (where, by and large, your colleagues have to work with you) to the consulting world, where people have to want to work with you. With that in mind, I’d just like to thank some of the many people who have been so important to Future Identity in its first year of life:

Toby Stevens, Brett McDowell, Jim Purves, Gus Hosein, Edgar Whitley, Dervla O’Reilly, Britta Glade, Ian Glazer, Bob Blakley, Dave Birch, Trent Adams, Lucy Lynch, William Heath, Adriana Lukas, Sverre Bauck, Lizzie Coles-Kemp, Iain Henderson, Nicole Harris, Alan Stevens… and more others than I can sensibly mention.

Thank you to all of you – and here’s to 2010.

A bitter cup … proffered to us year by year

There’s a piece in today’s Guardian Online by Michael Wills MP, Minister of State at MiniJust. It’s a very reasonable, well-argued article in favour of a balanced dialogue between the government and other stakeholders about public sector retention of personal data.

Reasonable and well-argued, that is, if you haven’t really taken any notice of this topic over the last seven years or so, and have been too busy chewing small pieces of the Daily Mail and shoving them into your ears.

On the other hand, if you have been watching this topic for a while, the article is more likely to come across as a rather sententious, smug bit of policy-laundering. Mr Wills calls for rational, respectful discourse, and accuses critics of the government’s data retention and data sharing policies of resorting to rhetoric instead of looking at the evidence.

I hope he won’t find me too rhetorical or disrespectful if I offer a counter-example.

On the National Identity Scheme, academics and researchers dug for all the evidence it was possible to uncover (while the government did its utmost to prevent its costings of the scheme from becoming known), published their findings in a dispassionate and constructive way, and for doing so, were personally vilified in parliament by the then Home Secretary, Charles Clarke.

On the National DNA Database, the government resisted attempts to persuade it that its collection and retention policies were disproportionate, and continues to drag its feet towards any grudging change of policy, despite an unequivocal, unanimous and scathing judgement against it by the European Court of Human Rights over a year ago.

On ContactPoint, the government has been able to offer no rational explanation of the risk assessment which leads it to conclude that the interests of vulnerable children are best served by centralising data about them and making it accessible to a population of some 330,000 users – the overwhelming majority of whom will have no reason to access the records of any given child.

Mr Wills – when there is so much evidence in the public domain that the government will not engage in constructive debate about its policies on personal data, why should we believe your promises of a new, rational and respectful dialogue?

On “nothing to hide”…

I wasn’t going to write another blog post today – but some things really wind me up, and a particular trivialisation of the privacy debate comes very high on the list.

While I was still at Sun, and had some responsibility for online identity and privacy, I spent years dealing with the fall-out from Scott McNealy’s observation that “you have zero privacy… get over it”. Now, I wouldn’t wish the same fate on those Google employees who I know, because it doesn’t take long to get tired of the smug smirk on the faces of those who throw your chief exec’s remark back at you when you’re trying to argue for better privacy.

That said, I do think Google CEO Eric Schmidt’s recently-quoted remarks on privacy deserve a good deal of push-back. Whatever the full context – and I’m not assuming that that context is reflected in the press coverage – here’s the bottom line; no-one’s privacy interests are served by feeding the media with a sound-bite like this:

“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place”

And he goes on to hide behind the petticoats of the Patriot Act, casually sliding past the notion that any of Google’s users might live in regulatory regimes with non-US privacy norms, and abdicating the kind of responsibility one might feel entitled to expect from a global corporation.

The real issue with Mr Schmidt’s remark is the way in which it trivialises the concept of privacy – thus ensuring that the issues just won’t get a serious airing. The point about privacy is not that it concerns those things you don’t want anyone to know: that is somewhere else on the scale… somewhere between secrecy and paranoia.

No: the point about privacy is that it’s about the things which you want to be known to some people, but not to others. If Eric doesn’t understand that, then Google deserves a far rougher ride on privacy issues than it has been given to date.

The broader problem – as other articles have observed – is that Mr Schmidt’s statement is basically a re-hash of the old chestnut that “if you’ve got nothing to hide, you’ve got nothing to fear”. Again, no-one’s privacy interests are served by saying things which give even a grain of credibility to that ridiculous expression – unless you like your life to be run on the philosophical principles of the average Christmas cracker motto.

More specifically, here’s why that particular saw irritates me so much. “Having something to hide” expresses a relationship, not a state; you have something to hide from someone. If you’ve got nothing to hide from anyone, you probably don’t live what the rest of us would consider a normal life. Similarly, if you have “something to fear”, you have something to fear from someone.

Now – to spell it out for Mr Schmidt and the other “nothing to hiders”: I have nothing to hide from my bank about my bank account, and how I access it, and what I do with it. I have plenty to hide from a fraudster about my bank account, and how I access it, and what I do with it… and plenty to fear from a mugger who takes me to an ATM at knifepoint and demands that I withdraw cash. A relationship of healthy disclosure from me to my bank is not the same as a relationship of fear, coercion and exploitation with a mugger.

There’s a more insidious influence at work here, too. The idea that “if you have something to hide, you have something to fear” is founded on a presumption that “something to hide” is “something illegal”. It’s something you “shouldn’t be doing in the first place”; others have either a right to stop you doing it, or no moral responsibility to prevent your behaviour from being publicised. That seems to me to elide, quite dangerously, the distinction between what is illegal and what is merely shameful. At its worst, that attitude is intolerant, and culturally insular to the point of arrogance. If that’s the set of social norms Mr Schmidt wants to live by, he’s welcome to it… just as long as I retain the option to be elsewhere.

Proportionality, privacy and pubs

There’s another of those news stories today which makes me wonder if I’ve read it right. It concerns Lancashire Police, who are apparently exercising their power (under licensing laws introduced in 2005) to refuse a licence to pubs which, in their view, have “inadequate security camera coverage”. A PC from the licensing department of Preston police is quoted as saying:

“It’s for public safety and their own safety to detect crime. The pub had minimal CCTV – it wasn’t recording. If an incident had happened and we needed to get evidence and locate an offender, we couldn’t have from there. Even the staff aren’t safe in those conditions.”

A little research reveals that this isn’t the first time something similar has been done. This blog post from Canadian privacy lawyer David Fraser describes a similar case in Halifax (Nova Scotia, not West Yorkshire), and also refers to one in the Borough of Islington.

To my mind, these cases raise several questions, perhaps the most important of which is this:

– First, if the lack of CCTV can be cited (on grounds of public safety and the detection of crime) as a reason for closing down a pub, what locations could not, logically, be similarly covered? Anywhere I go – whether indoors or outdoors, whether in a public place or in someone’s house – I could presumably fall victim to some kind of theft or assault. Why single out pubs?

– Second, there’s the question of evidence. It may well be that the Lancashire police insisted on CCTV in this particular pub because of some history of criminal activity there… but as the piece on the BigBrotherWatch site points out, there are also cases where the presence of CCTV footage has failed to deliver the promised evidence in support of effective law enforcement. Where are the criteria and facts which would support the decision to insist on CCTV in one place and not another?

And that brings us to the third point: given that the police have been granted power over the licensing conditions, the power to insist on disclosure of the personal data collected, and the authority to use that data to arrest people, why do they not have a corresponding responsibility to ensure that such systems are operated correctly and accountably? As a simple example: why don’t the police, as part of their licensing responsibilities, have a duty to ensure that any CCTV installation is appropriately labelled with the identity and contact details of the owner/operator, and the purpose for which the cameras are used?

With any other form of data collection, the data controller would be legally obliged to issue a fair processing notice, be identifiable, and be accessible to Subject Access Requests. With any other form of data collection, the notion of informed consent would be applicable. Why is CCTV treated differently?

When it comes down to it, the balance of the police’s powers in this instance is simply wrong: as things stand, they can simply compel a landlord to install CCTV or be shut down. Surely it would be more appropriate for the police, as a condition of refusing a licence, to have to show that this is a proper case for exercising the powers and duties available to them under RIPA. This would impose a requirement to show that the surveillance is proportionate and accurately reflects a need to act on specific risks. It would also introduce an obligation to consider the balancing imperatives of the Human Rights Act.

Whether or not you feel that CCTV in and around pubs is a sensible law enforcement mechanism, there is surely a gross imbalance here between what is currently deemed proportionate in terms of surveillance, and what is considered necessary in terms of accountability and subject access.

Reding [sic] the tea-leaves

In Jose Manuel Barroso’s recent reshuffle of the European Commission, there were a couple of moves which bear some further inspection, from a privacy/identity perspective.

The former Commissioner for Information Society, Viviane Reding, is promoted to one of the Vice Presidents of the Commission, and given a new portfolio as Commissioner for Justice, Fundamental Rights and Citizenship. She has also been given the task of overhauling the Data Protection Directive (now 15 years old…).

Her former role passes to Neelie Kroes, who was previously Competition Commissioner (and oversaw, for instance, some of the Commission’s fiercest battles with Microsoft – on media player bundling, IE/Windows bundling, publication of technical interoperability documentation, Microsoft Office “Open” XML, and so on, and so forth…).

She has a reputation for being able to dive into the detailed technicalities of a brief, and for being extremely tenacious in pushing towards her intended goal.

There’s no doubt in my mind that, had the task of reviewing and revising the Data Protection Directive been left on the Commissioner’s desk at DG InfoSoc, Dr Kroes could have taken it on with competence and determination… which leads me to wonder what the implications are of Commissioner Reding taking it with her to her new role.

With the background of her four years heading DG InfoSoc, Commissioner Reding should have all the subject-matter expertise needed to make a proficient job of revising the Directive. However, what is perhaps more significant is the departmental context in which she will now undertake that work.

Instead of doing it from within DG InfoSoc, she will now do it in the DG responsible for programmes such as this one: the development of a framework for a European society based on notions of fundamental rights and rights derived from EU citizenship.

That suggests to me that, if anything, the revised DP Directive will be founded on even stronger links to notions of fundamental human rights and the social/citizenship context.

I foresee some lively discussions of principle between the EU and its partners, particularly where those partners either take a different view of what are fundamental rights, or of how great a role they should play in determining policy on the processing of personal data.

If Commissioner Reding wished to live in interesting times, I think her wish may have been granted.