Open health is the intersection between health care and information communications technologies. We’ve previously talked about what it is and why people should care about it, the problems that may occur, and examined some of those problems in detail. Here we will consider the oversight of open data systems.
The 2012 U.K. Open Data White Paper closely examined the need to maintain anonymity within and between data sets. Chief among the issues identified was the “mosaic” or “jigsaw” effect, where separate data sets can be combined to re-identify groups and individuals. This is obviously problematic for medical information, because information about populations and genetics has previously been used to target particular groups.
It’s important to identify potential vulnerabilities in open data projects, including in open health, early and to review them frequently. The UK response has been a series of Privacy Impact Assessments to examine this and other potential abuses of open health data.
Once linked, data can be very difficult to [unlink](http://www.smh.com.au/technology/technology-news/googles-privacy-policy-under-fire-20121012-27hgj.html). Indeed, data protection and securing anonymity lie in tension with a central driver of the open health movement: innovation. Ultimately, protecting anonymity costs money and time, and this can slow the distribution of data to interested parties who could use it for practical ends. But any large-scale project pursued in the public interest should be cautious about the lure of constant innovation – more innovation is not the only thing that matters.
An important step in crafting good data oversight will be assessing the trade-off between oversight and innovation. Open data projects are often driven by the practitioners who most want the data – and it stands to reason that these will be the people who campaign for access. But we need to ensure that the important voice of the innovator doesn’t drown out the reason we innovate in the first place, which is arguably to improve lives.
Oversight requires observation – officials to monitor projects, implement policies, and conduct assessments such as privacy impact assessments. A common refrain when dealing with such officials is “who watches the watchers?” We think this is an important question, but it should be supplemented with another: “who pays the watchers?”
Open data in the United Kingdom, including open health, is [intended to be cost neutral](http://www.dh.gov.uk/en/Consultations/Liveconsultations/DH_134221) for the government. This sounds fantastic in theory but is ludicrous in practice. The shift to eHealth records in the UK has so far cost £2.7 billion and as of 2010 was of no demonstrable use. The Victorian eHealth program HealthSmart was cancelled after it ran over budget and over time.
Open health is unlikely to be cost neutral in the short term, if ever. And where its financial support comes from is a cause for concern. The UK’s model is explicitly concerned with generating returns through private industry. On the face of it, this is not a bad idea – we should ensure that businesses are required to maintain the system they rely on to generate their profits. Essential funding oversight and quality service provision should be paid, in part, from profits made by the private sector.
The tension arises because open health is purported to be in the public interest. It’s important to ensure that even as they maintain open health systems, powerful private actors don’t undermine the central goals of those systems in the process of pursuing their own agendas.
Examined in this light, bringing industry into the realm of oversight seems problematic at best. Money is already a problem in health care and medicine – pharmaceutical companies sponsoring trials have been accused of wrongdoing leading to the deaths of vulnerable people, and the oversight of medical research has been corrupted time and time again. So we need to be cautious about the ways that we allow industry participation in oversight.
Oversight is necessary to prevent a range of harms from the practice of open health. But oversight carries its own problems – bad implementation, mismanagement, and corruption are all risks.
Careful and honest reflection on the tension between the values that drive open health is required before the projects start. On the one hand, human health is vitally important, but we only have so many resources to go around. On the other, innovation will help with this resource problem. In between are the lives and personal information of whole societies, released into the open on the premise that doing so benefits everyone.
Making sure that open health really does benefit people, and the right people, is a task for the next few years. Open health is either here, or coming to a country near you. We should all inform ourselves and each other about how these policies are implemented, and what that means.
This is the final part of Open Health, an intermittent series on the role of open health in shaping the future of Australian health care. Previous articles from the series are linked below.