A Database State: Where is the data?

A report commissioned by the Joseph Rowntree Reform Trust, entitled ‘Database State’ (Anderson et al 2009), has captured significant media coverage. Reuters headlined its coverage of the report with ‘Quarter of state databases “should be scrapped”’. The claims are indeed alarming, and it is not surprising that they have gained media attention. Moreover, the report has been written by respected academics, including colleagues of mine, who have real expertise in security issues.

In this case, however, I question my colleagues’ findings. As the Reuters story notes: ‘The Ministry of Justice said the report had no real evidence to base its assessment’ (Castle 2009). On this one, I must side with the Ministry of Justice, until the authors can convince me otherwise.

The report does not explain its methodology or the nature of the evidence on which the authors draw their conclusions. The report notes that the UK public sector has ‘an enormous number of databases’ (p. 11). One agency alone, the Serious and Organised Crime Agency, is said to have over 500 databases (p. 11). However, the report focuses on 46 databases across the major departments of the entire UK government (p. 4), and provides no sense of how these 46 were chosen. So there is a serious sampling issue. Because journalists are drawing conclusions that suggest this sample is somehow representative of all databases, it is very important to spell this out. Even the Chair of the Rowntree Trust writes of only six being given a ‘green light’ (p. 2), as if this were a representative sample.

Even if we disregard the sampling methodology, there are other issues of measurement. The 46 datasets were graded by a ‘traffic light system’, with red indicating that a system is not compliant with the European Convention on Human Rights (ECHR) and that its design is such that it could not be made compliant without substantial redesign. The reader should not be put in the position of having to trust the judgment of the authors on the strength of their authority alone, yet no evidence is provided to substantiate these ratings. Nor is a methodology spelled out for operationally defining or applying this rating. Could someone replicate it?

In fact, from the references, it is not clear whether the authors went beyond desk or screen-based research. The acknowledgments indicate that various colleagues fed them ‘market intelligence’, but that is a problematic source for a systematic study. There do not appear to have been personal interviews or field visits to meet with those managing these datasets or to examine the systems. The risk is that we are reading weak journalistic coverage of research that is itself based on journalism and the input of pundits with similar views.

Their study is on a key topic at an important time. It seeks to build on a long-term debate over the impact of computerization on privacy and data protection. Alan Westin and Michael Baker’s (1972) study was seminal in this area, but it was based on major field studies and survey research, indicating the scale of work required to generate evidence. Undoubtedly, available resources limited the rigour of the present study, but the limits of the study need to be clearly spelled out.

Debate over privacy and data protection is critical, but it could be undermined unless we know something more authoritative about the problem. When the Guardian (Travis 2009) reports that the ‘Right to privacy broken by a quarter of UK’s public databases, says report’, it is important to know more about the evidence on which these recommendations are based, and not rely on the authority of the authors.


Anderson, R., Brown, I., Dowty, T., Heath, W., and Sasse, A. (2009), Database State: A Report Commissioned by the Joseph Rowntree Reform Trust Ltd. York, UK: The Joseph Rowntree Reform Trust Ltd. http://www.jrrt.org.uk/uploads/Database%20State.pdf

Castle, T. (2009), ‘Quarter of State Databases “should be scrapped”’. Reuters, 23 March at: http://uk.reuters.com/article/domesticNews/idUKTRE52M04N20090323

Travis, A. (2009), ‘Right to Privacy Broken by a Quarter of UK’s Public Databases, Says Report’, Guardian, 23 March: http://www.guardian.co.uk/politics/2009/mar/23/dna-database-idcards-children-index

Westin, A. and Baker, M. A. (1972), Databanks in a Free Society: Computers, Record-Keeping and Privacy. New York: Quadrangle Books.

9 thoughts on “A Database State: Where is the data?”

  1. Posted for Barry Blundell, with permission:

    Dear Professor Dutton,

    I’m currently at Auckland University of Technology in New Zealand (School of Computing and Mathematical Sciences), and am writing a book entitled ‘Digital Shadows’. This discusses a broad range of biometric, algorithmic surveillance and data mining technologies along with so-called ubiquitous systems which are used directly or indirectly to record data pertaining to the individual.

    Earlier this week I downloaded the ‘Database State’ report and after studying this I e-mailed the chairman, Professor Ross Anderson, with several queries relating to the report. From what I had previously heard, I was hoping that this report would represent an in-depth analysis of the state of play as the title suggested to me that this would be a wide-ranging document.

    Earlier today I read the discussion that you initiated on the OII under the heading ‘A Database State: Where is the Data?’ and was most interested to read your comments, and also the responses that these elicited. The comments you made reflect my own concerns about this report and I feel that whilst it may well attract media headlines, the authors have not clarified numerous points which they have made.

    Although I was most appreciative that Professor Anderson responded promptly to my e-mail, I cannot help but feel that his comments reinforced my concerns. By way of example, in the executive summary various red, amber and green databases were listed. Whilst the red category is listed fully, the amber and green categories are listed only partially. I asked if there were criteria for listing these; perhaps the authors felt that the databases listed were indicative. However, Professor Anderson’s response to this query was [paraphrasing] that it was a matter of space, of fitting the leaflet.

    I had clearly failed to make an obvious deduction and had been assuming that there was a more in-depth reason for this selection. As for obtaining any further information, Professor Anderson simply states: “If you want further detail then you must chase up the references, there are over 200 citations and they span all our important research sources”.

    Whilst I am endeavouring to develop a scholarly book, it is intended for a wide audience and my reason for undertaking this work stems from a real concern over the way in which technologies are currently being used. In addition, I am endeavouring to look to the future and consider, for example, the ramifications of technology convergence and technology escalation. I am doing this on the basis of having been involved in computing since the mid-1970s (although my background is in physics). In this respect I don’t feel that the ‘Database State’ report is helpful (although it does provide some good summary information). On the other hand, there is the issue of focusing upon legality as opposed to focusing on far more important issues – particularly in terms of considering, in a holistic manner, the very large number of databases which have spawned within the UK – within the context of the ability to access these databases remotely, both nationally and internationally. However, a further clue was provided by Professor Anderson: that the report was limited to 64 pages. This could possibly explain why important detail is missing, and why more overarching issues have been neglected…

    Overall, it is also unclear to me whether this research went beyond screen-based research, which is a great pity. … By focusing upon vague legal issues in the ‘Database State’, within the context of a traffic lights scheme, the authors of the report have been able to gain attention in the popular press, and I think this is a pity given the seriousness of the issues under discussion.

    Kind regards.

    Yours sincerely,


  2. dear bill –

    (as explained in an earlier posting, i was not one of the authors of the JRRT/FIPR report but advised on the european standards to be applied.)

    two points if i may:

    1. you cannot criticise the JRRT/FIPR report because “journalists are drawing conclusions that suggest this sample is somehow representative of all databases”, when that claim is not made anywhere in the report itself, nor suggested or implied. quite the contrary: the report is quite clear that it deals with a large number of major databases, ie the ones that have the biggest human rights impact. that was the target, the aim was not to provide some kind of overall measure of all public-sector databases. your criticism in this respect should be directed at the journalists, not the research team.

    2. on the legal basis for the assessments, as i said in an earlier posting, the relevant standards have been set out in detail in earlier FIPR publications, to which the report refers. there is a significant section on these standards in chapter 3, section 3.1, on “privacy and human rights”.

    the standards are also repeatedly briefly re-capped in the text of the report where relevant to a particular issue or database. the following examples relating to the first few databases to be marked “red” may suffice (references are to the pages in the hard-copy version of the report; the pdf version may be a page out):

    * on p. 14, about the NHS secondary uses services:

    “European law requires that systems which store sensitive personal information such as medical records either have the free and informed consent of the data subject, or have specific legal provisions that are sufficiently narrow to make their effect predictable; such provisions must also be proportionate and necessary in a democratic society. This law is grounded in the European Convention on Human Rights and is codified in the Data Protection Directive; the EU’s Article 29 Working Party provides guidance in the case of medical records. It has also recently been elucidated by a judgement of the European Court of Justice, according to which health care staff not involved in the care of a patient must be unable to access that patient’s electronic medical record: ‘What is required in this connection is practical and effective protection to exclude any possibility of unauthorised access occurring in the first place.’

    For these reasons, the use of SUS in research without an effective opt-out appears to contravene the European Convention on Human Rights and European data-protection law. It is also considered morally unacceptable by millions of UK citizens. For these reasons alone, and quite apart from any privacy concerns about the use of SUS data in administration, we have no choice but to assess this system as Privacy impact: red.”

    (the passage contains no less than three footnote references to further legal detail, here omitted)

    * on p. 16, about the NHS detailed care record, which is criticised for a “wikipedia model of uncontrolled collective ownership” in which there is no clearly identifiable controller (which everyone with some data protection knowledge knows is a basic data protection requirement):

    “Given that the whole data protection system hinges on the duties of the controller, and that patients mostly trust their doctor but distrust ministers and officials, any move to make the Secretary of State the data controller rather than the doctor undermines both legal protection and trust. … the DCR must be assessed as … red”

    * on p. 17, about ContactPoint, it is noted that there are privacy concerns over too many people having access to often highly sensitive data (such as the use of sensitive services), concerns about data security identified by Deloitte, and the absence of an effective “opt-out”. all three issues raise important data protection questions and justify the conclusion that the system is likely to fall foul of european human rights and data protection law.

    * on p. 20, about ONSET, a system in which children who have not been convicted of any crime are identified as “potential offenders”, it is said that this makes it more likely that such children will be treated as suspects rather than as victims or witnesses, and that therefore “we [the authors] believe that ONSET contravenes the ECHR.” i am happy to defend that conclusion, even if there is no specific case in strasbourg to that effect as yet (mostly because other european countries realise that it is highly objectionable to set up such profiling systems, especially on children).

    * on p. 23, about the national DNA database, the report noted that this was already found by the european court of human rights to violate the ECHR. the assessment of “red” is here indisputably correct. indeed, since that assessment elsewhere only indicates that a system is “highly likely” to breach european standards, it is here valid a fortiori.

    * on p. 25, about ID cards and (more importantly) the database behind it, the report says:

    “While the Register will not contain other sensitive government-related information, a National Identity Number will make it easier to link together information held on individuals across other public-sector databases. This is worrying because in the UK, unlike in other EU States with strong constitutional protection, there are few safeguards against excessive data exchanges. Indeed, the Government appears to be bent on removing such safeguards as do exist.”

    i hope this may suffice. as i said in my earlier posting, i (and the authors, i am sure) would be happy to discuss the methodology and standards applied further. but it would help if your criticism focussed on their work rather than mistakes by journalists, and if it was made on the basis of some demonstrable knowledge and understanding of european human rights and data protection law.

    yours sincerely,

    douwe korff

  3. I am posting the following text sent from David Erdos, Katzenbach Research Fellow at the Centre for Socio-Legal Studies at Oxford. He had difficulty posting his comment on WordPress. Please let me know if others experience difficulties.

    David wrote:

    “I read with great interest not only the Database State report itself but also this very engaging discussion. It seems to me that Bill’s original criticisms are valid and should be taken seriously by the report’s authors and those linked to its sponsors, the Joseph Rowntree Reform Trust (JRRT) and the Foundation for Information Policy Research (FIPR). One problem Bill mentions is that the sample used in the report is unrepresentative and even perhaps arbitrary. This seems correct. An even greater problem, however, is that the allegedly legal analysis carried out by the authors of the report is both opaque and tendentious. In the first place, the legal standard under which the databases are being judged is never unambiguously spelt out – are we talking about compatibility with the Data Protection Act (1998), the Human Rights Act (1998), the European Convention on Human Rights or the EU’s Data Protection Directive (1995)? Secondly, in much of the report there is little careful analysis of the specific clauses of, or judgements made under, any of these instruments. Thus, the discussion allegedly on the legality of ID cards – clearly a very important issue – has no reference to any law or case law; therefore, the judgment (p. 25) that it is “almost certainly illegal” appears based on little more than the authors’ own personal antipathies (whether well-founded or not).

    In any case, even when law/legal opinions are mentioned, the authors generally favour the most extreme findings (usually from the EU Working Party 29 Group (which has no jurisdiction to issue binding opinions even on EU law)) of the most extreme instrument (the EU Data Protection Directive – which is not directly applicable in UK law) as against a proper analysis of legal judgments made under UK law. So, there is no mention as far as I can see of the vitally important case of Durant (2003) where the Court of Appeal made the best of very badly drafted law by significantly limiting the meaning of “personal data” under the Data Protection Act (1998) (see http://www.hmcourts-service.gov.uk/judgmentsfiles/j2136/durant-v-fsa.htm). Nor, despite Douwe Korff’s claim that “oddly, the Information Commissioner feels it not his duty to apply [the EU Data Protection Directive]”, has anyone mentioned that the Information Tribunal (the first legal body which can correct the Commissioner) actually labelled his understanding of “personal data” as “absurd” precisely because it has followed, to far too great an extent, the highly problematic, over-broad and impracticable definitions devised in Brussels and Strasbourg as opposed to the more careful and restricted ones crafted via case law in our own courts (http://www.informationtribunal.gov.uk/Documents/decisions/harcupFinalDecision_050208.pdf).

    Even the Information Commissioner himself has belatedly recognized that the EU Directive is “not fit for purpose” and is “increasingly seen as out of date, bureaucratic and excessively prescriptive” (http://www.vnunet.com/vnunet/news/2220913/european-directive-fit-purpose). It needs radical reform and, as it stands, poses a significant threat to legitimate activities including those which engage fundamental rights such as freedom of expression and information. For example, according to the European Court of Justice’s decision in the Lindqvist case (2003) the Directive applies in full force even to private and non-commercial activity carried out on the internet (see http://curia.europa.eu/jurisp/cgi-bin/gettext.pl?lang=en&num=79968893C19010101&doc=T&ouvert=T&seance=ARRET). This draws into question the compatibility with the Directive of much social networking and blogging activity such as the publication of photographs. In this context, it is interesting to note that at least one of the authors of the Database Report has himself published a range of photographs including of street scenes etc. on Flickr (see http://www.flickr.com/photos/guppiefish/). How does all this relate to a fine-tooth combed compatibility analysis with the Directive especially when no Data Controller notification has been lodged and maintained with the Information Commissioner?

    When UK law is properly analyzed it is clear that there is, in fact, no proper legal problem with many of the issues noted as “almost certainly illegal” in the report. For example, even though both the Summary Care Record and the Secondary Uses Services may be processing sensitive personal data without express consent there is a clear saving for this in Schedule 3 of the Data Protection Act in relation to both healthcare management and medical research. Thus, Clause 8 (2) of this schedule provides that such processing may take place without any consent for the “purposes of preventative medicine, medical diagnosis, medical research, the provision of care and treatment and the management of healthcare services”. Moreover, especially given its origins in the Directive, the Data Protection Act (1998) (and certainly its current interpretation) may in some cases be too restrictive and, therefore, open to ECHR challenge on the basis of freedom of information and expression (Article 10) and even possibly the right to life (Article 2). This unsurprisingly is not even mooted in the report. However, for an interesting take on this as it relates to restrictions on medical cancer research (which could cost many lives) see “Top cancer expert, 91, ‘I’ll go to jail for science’” at http://www.timeshighereducation.co.uk/story.asp?storyCode=187013&sectioncode=26.

    My comments here should not be taken as suggesting that there is no problem with any of the Government’s initiatives/proposals or that none of them pose a disproportionate intrusion into privacy which should be both critiqued and opposed. What should be clear, however, is that enactments such as the Data Protection Act and the EU Data Protection Directive as currently drafted cannot be used as the yardstick with which to judge this issue. Instead, we desperately need a much more careful and finely-grained analysis which takes seriously the various other values and interests at stake within this debate.”

    David Erdos, Katzenbach Research Fellow, Centre for Socio-Legal Studies, Oxford

  4. Dear Douwe Korff

    Thank you for going to so much trouble to provide more detail on the sampling and the legal standards applied.

    You and Martyn seem to know the basis on which the databases were sampled, but I don’t believe it was spelled out in the report. This led reporters to over-generalize the results. From your perspective, this does not matter, because you believe the authors looked at some of the major databases. I disagree. I would be okay with any sample, as long as I knew exactly what that sample is and how it was chosen, so that I as a reader or academic or reporter could judge the merit of any generalization based on that sample.

    Nor were the criteria for judging whether a database was “almost certainly illegal under human rights or data protection law” spelled out (of course this category needs work: it is ambiguous and double-barreled). In my opinion, you should not have to refer me to a number of previous reports published by fipr in order to discover the criteria applied in the report. Shouldn’t the criteria for rating the sites be spelled out in the report that rates them?

    In any case, the criteria you spell out appear to relate to law (“too vague”) or behaviors (“sharing of sensitive personal data”), not to databases. Do I not need to know how the data is used in particular circumstances? I think the Data Protection Directive notes a number of exceptions to general restrictions on the processing of personal data, for example.

    I appreciate the complexity of the topic, data protection law, the application of human rights law to particular cases, etc, which is why I felt that this report should have been more detailed in explaining what was done, and that the authors should have done far more to clarify the limitations of the conclusions.

    Thanks again. Sorry not to address every point you make. Perhaps I am wrong. Perhaps I am not a careful reader. But I always try not to blame the reader if I fail to communicate key points. In any case, I am not being critical of you, as one of the advisers, or the co-authors. I am being critical of the report.


  5. I advised the report’s authors on european human rights and data protection standards, which were used to rank the various databases as likely to breach those standards, possibly violating them, or broadly legal.

    first a preliminary point. you criticised the claim that “a quarter of all government databases are illegal”. but that was not what the report said. it said that “a quarter of the public-sector databases reviewed are almost certainly illegal under human rights or data protection law” (executive summary, p. 4 of the hard-copy version of the report). so the discussion about whether this is a sample that can be extrapolated is beside the point. the databases that were reviewed were not chosen because they necessarily were representative of all government databases but because they were the most important, major databases, affecting many millions of people in the uk. that seems to me to be a perfectly good reason for looking at them. and the conclusion that so many of these major databases are likely to breach european law (that should be complied with by the uk) is valid, and important.

    you haven’t commented on that conclusion in respect of those databases as such, but i may still say something about the legal standards that were applied. as the report notes (on p. 41), these standards were set out in some detail in an earlier FIPR report, for which i was a co-author:

    * children’s databases – safety and privacy, information commissioner’s office, november 2006.

    they had also been discussed in detail in another fipr report, also for the ICO:

    * privacy and law enforcement – information commissioner’s office, september 2004 (my co-author was your colleague at OII, dr ian brown).

    they are also briefly summarised in a more recent article i wrote:

    * the need to apply UK data protection law in accordance with European law, Data Protection Law & Practice, May 2008.

    i hope i may refer to these for details of the legal standards concerned. briefly, under the european convention on human rights, the collecting, storing and sharing of (often highly sensitive) personal information by the state constitutes an “interference” with the rights of the individuals concerned to “respect of their private life”. that right in the convention (art. 8) is increasingly interpreted in line with more specific standards relating to the processing of such data, contained in special data protection instruments, notably the council of europe convention on data protection (convention no. 108) and the ec data protection directives (in particular the main dp directive, directive 95/46/ec). between them, the echr and the COE convention and the EC directive lay down very clear, strict standards that are well-known to data protection specialists (and the government) – although, oddly, the information commissioner feels it is not his duty to apply them, and the government rides roughshod over them.

    the first requirement is that all such collecting and sharing etc. of personal data by the state must be based on “law” – and that term is interpreted as requiring detailed, clear and precise legal rules that are foreseeable in their application. by contrast, in the uk, the government often relies on vague, catch-all phrases (known as “vires”- or “gateway”-clauses) to justify data sharing. in the echr case copland v the uk, the strasbourg court made clear that such clauses cannot be relied upon. in another strasbourg case, i v finland, the court held that medical databases must be structured in such a way as to prevent access to patient data by people not involved in the patient’s treatment. the uk NHS databases do not ensure this. also under the echr, to the extent that a (strict and clear) “law” allows data processing and -sharing, the actual processing and sharing must still, in each individual instance, be “necessary” and “proportionate” in relation to a “legitimate aim”. this is not always ensured. in marper v uk, the european court of human rights held that the uk rules on retention of dna data breached the convention.

    under ec data protection law, data subjects must be adequately informed, and in principle consent to the processing of their data (unless a “law” overrides this because there is a special, overriding, important public interest which “necessitates” the processing of the data without consent). the most authoritative eu body on matters of data protection, the so-called “article 29 working party” or WP29, has held that secondary uses of patient data for research purposes require consent. eu law is very demanding on what constitutes free and informed consent. the uk data protection act manifestly breaches the ec directive by including “research” in the concept of “medical purposes” when this is not so in the directive, and the rules on the use of patient data for research clearly violate the directive.

    the use of “profiling” to identify children (and soon, adults) who will “probably” become criminals is also likely to be unlawful. there are many such examples.

    i hope the above shows that the assessments that were made were not made lightly.

    i (and the main authors of the report) would of course be more than happy to discuss our findings, the standards that were applied, and our conclusions, with anyone willing to read the report carefully.

    yours sincerely –

    douwe korff


  6. Bill – perhaps you could comment on the report, rather than the headline of an article about the report?

  7. Martyn,

    Even if I were to accept that every database claimed to be in violation of privacy (by design) is indeed in violation, I still see no evidence to justify claims made about these representing a ‘quarter of UK databases’. You say you have some conception of the population of databases from which the 46 are drawn, but I did not see that population described in any detail in the report. I think we have to keep claims anchored to the evidence, and avoid or distance ourselves from claims that go beyond the evidence. My apologies if my comments appear to casually dismiss the report. I take the report very seriously, as it has generated such media coverage. Because I am concerned about privacy and data protection, I am concerned when the credibility of these concerns is put at risk, such as when we over-generalize from a selective set of cases.


    Similarly, I am not dismissing the likelihood that civil liberties issues abound in particular areas, and I hope that attention can be focused on the areas of greatest concern and risk. Claiming that a quarter of UK databases are violating our privacy could well be counter-productive to efforts at addressing the most pressing ones.

    Thanks for your comments. I realize these are hot button issues.


  8. Databases described in the Rowntree report detail the types of case history and ‘professional judgment’ one might find in highly confidential mental health casenotes, but which have been copied and recopied and distributed to governors, Inclusion officers, Panel members, administration workers, and those who shred documents after Panel meetings. The language in the form explicitly invites this type of comment on ‘behaviour’, ‘attitudes’ and ‘self-esteem’, three of the ‘extended body’ ‘attributes’ I have described as vulnerable to drawing descriptive attention.

    Whilst it might feel sensible to assess and target-fund the needs of young people who have been identified as suffering ‘from emotional problems’ or to re-educate young people who display ‘discriminatory attitudes towards others’, the ONSET form, for example, does represent significant civil liberties issues. For example, would an adult be happy for a referral form to be completed stating that he or she ‘does not use spare time constructively’ and therefore needs to be referred for diversionary activities? The procedure looks like something George Orwell (1949) might have written about in 1984, or close to the activities of the ‘Precrime’ team on those who have featured in predictive dreams of violent attack in the Spielberg (2002) film interpretation of a Philip K Dick short story, Minority Report. Fundamental to the problems with this type of information-gathering exercise is the fact that the people entering the data are not necessarily qualified to make the judgments they are recording, and this fact is not commensurate with the validating power of an ‘opinion’ once it has been captured as ‘fact’ on an official form.

  9. Bill,

    Although I am a member of the FIPR Advisory Council and know most of the report’s authors (as you do), I had nothing to do with the report before publication, so I am commenting from the same position as you.

    Your criticisms of the report don’t seem to justify your agreement with the MoJ. Indeed, I find them startlingly complacent.

    You question how the 46 databases were selected. The report says (p4 – add 2 to get pdf page numbers) “This report surveys the main government databases that keep information on all of us, or at least on a very substantial minority of us, and assesses them using a simple traffic-light system.” and (p11) “So the first problem is one of scope – what is the ‘database state’? A narrow view would be to consider only those systems that hold information on most citizens (tax, NHS records, driver licensing, …). We have taken the broader view that we will cover those systems that will at some time or another hold identifiable personal information on at least a significant minority of citizens. We therefore include children’s databases and pensions. We include criminal justice, as about a third of men will acquire a criminal record at some time in their lives.13 We also cover systems that have been announced but not yet built, such as the National Identity Register and the proposed ‘Interception Modernisation Programme’ communications database.” Can you identify any public-sector databases that meet this definition and that were omitted? If so, I expect the authors would willingly add further assessments. If not, what is the basis for your criticism?

    You say “The report does not explain its methodology or the nature of the evidence on which the authors draw their conclusions.”. The report contains 222 references to show where they obtained the information on which they based their conclusions. To what degree is this insufficient? The report also defines the traffic-light classification scheme that the authors used; for example, RED is defined (p5) by “Red means that a database is almost certainly illegal under human rights or data protection law and should be scrapped or substantially redesigned. The collection and sharing of sensitive personal data may be disproportionate, or done without our consent, or without a proper legal basis; or there may be other major privacy or operational problems.” The problems with each database are then explained in the body of the report. The report could not prove that all the Red databases were illegal (that is a matter for the courts) but one of them (the DNA database) has been found to be illegal by the highest court so it seems to me to be reasonable to argue by analogy.

    In my opinion, the report makes a strong case that many of the databases are at risk of being found illegal and that many compromise privacy disproportionately. I am surprised that you chose to dismiss the report so casually, when the consequences could be serious.

    Martyn Thomas CBE FREng

Comments are most welcome