User agreements are often long, complex and inaccessible texts that don’t help users understand what exactly is being done with their information. Shutterstock

Plain language about health data is essential for transparency and trust

Ask any group of people if they have read and understood the last terms and conditions they agreed to, and you’ll be lucky if anyone puts up their hand. This is not surprising given the estimate that it would take 76 work days for someone to read the privacy policies they encounter in one year.

Somehow we have arrived at a situation where terms and conditions for apps, websites and online services have become so long and complicated that many people have given up hope of understanding what is happening with their personal data. Meanwhile, news stories from around the world, including the United States, United Kingdom, Denmark and Australia, make it clear that people care about how their data are used, particularly health data, which are deeply personal and sensitive.

It doesn’t have to be this way. In our publication “Notches on the dial: a call to action to develop plain language communication with the public about users and uses of health data” in the International Journal of Population Data Science, we present a plan to work with the public on simple communications about health data.

What people care about

We’re not starting from scratch. Qualitative research studies, performed by our team and others, provide information about what people care about when it comes to health data. For example, it’s important that privacy is protected, that there is a public benefit and that the findings from health data research don’t disadvantage any groups.

There are also published frameworks about how to provide access to sensitive health data responsibly. The Five Safes framework prompts people to think carefully about:

Safe Projects: Is there scientific merit? Is there public value?

Safe People: Who is using the data? What training do they have?

Safe Data: How potentially identifiable are the data? Is there consent? Is there legal authority for use?

Safe Settings: Where will the data be analyzed? How will they be managed?

Safe Outputs: Is there any potential disclosure of individuals, families or communities?

The WHY? WHO? WHAT? HOW? questions of the One-Way Mirror Report — prepared for the Wellcome Trust, a charitable health research foundation based in the U.K. — focus on what members of the public care about when health data are used by companies.

Improving communication

Bringing all of this information together, our team has started to develop simple text for communicating about health data. We want to work with the public to create some standardized text that helps people understand what is happening with their health data. We also want the text to distinguish between different uses of health data to ensure that members of the public do not confuse, or group together, commercial revenue-generating uses with academic health research.

Plain language texts help members of the public understand what their health data are being used for. Shutterstock

Organizations would still need to provide detailed text like Google’s 27-page privacy policy or Apple’s transparency reports, but complementary simple text would be helpful for people who may not have the time or interest to read all the details.

Also, noting that longer isn’t always better, the short standardized text could include specific information about third-party uses instead of generic references to the possibility that other organizations might use the data.

For example, imagine how much easier it would be for people to understand what is happening with their data if a fictional commercial organization called ABC had text like:

At ABC we use your data to improve our products and services.

Fewer than 100 of ABC’s 3,000 staff have access to identifying information such as your name and address; other staff at ABC work with pseudo-anonymized datasets that don’t include names or other identifying information.

We earn five to 10 per cent of our annual revenue from the data we hold. In some cases, we provide other companies with identified data that includes your name and contact information; most of the time, we perform analytic services for other companies and provide them with summary statistics only. We invest approximately half of the revenue we earn from data in maintaining our databases and ensuring the privacy and security of data holdings.

University researchers with Research Ethics Board approval also have access to pseudo-anonymized data, held in a data trust, that don’t include identifying information.

For information about which uses of data you can opt out of, and how to opt out, click here.

Our full privacy policy is available here.

Transparency and trust

Many governments are emphasizing the importance of trust and consent for data use. Canada’s Digital Charter and the U.K.’s application of the European Union’s General Data Protection Regulation are two examples.

But it is our view that you can’t have trust without transparency, and you can’t have transparency and informed consent without plain language. Our goal is to work with members of the public to co-develop simple plain language text that helps people make informed decisions about how their health data are used. This can be an important step toward deeper involvement of the public in health data rules and policies.
