The US Food and Drug Administration (FDA) recently revised the safety labels for statins, widely used cholesterol-lowering drugs. The agency mandated warnings that statins may increase the risk of type 2 diabetes and memory loss. Hysteria followed.
Almost immediately, the blogosphere and intertubes were filled with worries about the drugs. But how worried should we be, and how do we adequately convey the risks involved?
To put things in perspective, what went almost unreported was that a warning was downgraded. The chance of getting liver failure from statins is now considered to be extremely rare, so if anything (even with the new warnings) statins are safer than we previously thought.
But why have the warnings changed? Surely drug testing should have picked up safety issues long before now?
Long road to approval
Before any drug is approved, it has to undergo extensive testing for both effectiveness and safety: first in “test tube” and animal models, then in human trials. Typically, thousands of patients are exposed to the drug in the trials leading up to approval. Low-frequency side effects that escaped earlier safety trials or animal experiments should emerge at this stage.
But what if the side effects are very rare, less than one in 100,000, say? Rare side effects can’t be picked up in any practical-sized human trials (there may not even be 100,000 people with the condition in your country), so we have to rely on looking at the drug carefully after approval to catch them.
As our experience with a drug increases, we can refine our estimates of the risks involved. In the case of liver damage with statins, long-term surveillance suggests the risk is much lower than we thought: roughly two cases per million patients.
In the case of the risk of diabetes, there was conflicting data. Some trials suggested a small decrease in diabetes, while others suggested a small increase. To try to resolve this, a meta-analysis of 13 trials was performed. In a meta-analysis, data from trials with similar methodologies are pooled to give higher statistical confidence than the individual trials can achieve.
Overall, it seems that there’s a small increase in diabetes. But it is small – of 91,104 people taking part in these trials 2.3% developed diabetes in the absence of statins, while 2.5% developed diabetes when taking statins.
This is sometimes reported as a relative risk increase of 9%; that is, the risk of diabetes is increased by 9%. If the risk of diabetes in the general population is 2.3%, then a 9% increase means the risk for statin users is about 2.5% (2.3 × 1.09 ≈ 2.5).
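The arithmetic behind the relative-versus-absolute distinction can be sketched in a few lines, using the trial figures above (a 2.3% baseline incidence and a 9% relative increase):

```python
# Relative vs absolute risk, using the figures quoted from the meta-analysis.

baseline_risk = 0.023          # diabetes incidence without statins (2.3%)
relative_increase = 0.09       # 9% relative risk increase

# Absolute risk for statin users: baseline plus 9% of baseline
statin_risk = baseline_risk * (1 + relative_increase)
print(f"Risk on statins: {statin_risk:.1%}")          # about 2.5%

# Absolute risk difference: extra cases per 1,000 people treated
extra_per_1000 = (statin_risk - baseline_risk) * 1000
print(f"Extra cases per 1,000 treated: {extra_per_1000:.1f}")  # about 2.1
```

The same 9% figure translates into only about two extra diabetes cases per thousand people treated, which is why a relative risk quoted on its own can sound far more alarming than the absolute numbers warrant.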
And the association may still be a statistical fluke. Or it may be a consequence of more people surviving (if you live longer, your chance of developing type 2 diabetes rises, since risk increases with age) rather than statin-induced damage. More studies are needed to distinguish between these possibilities.
Still, diabetes is serious, and it’s better to alert people just in case the association is real. While the benefits of statins are well established, with a 25% risk reduction for death or serious heart damage, people need to be informed of potential risks when therapies are prescribed. Importantly, people with diabetes are at higher risk of heart disease, and statins reduce that risk.
The perils of communicating risk
The problem in this case was one of communicating risk. Humans are very poor at estimating risk and tend to overestimate it, especially when judging it requires balancing probabilities.
The way risk is reported often doesn’t help. Media reports gave very little guide to how much risk there was, and often gave the impression the increase in diabetes was substantial – “Thousands at Risk” is a headline guaranteed to cause worry.
The FDA announcement wasn’t much more helpful. Despite reassurances that the benefits outweighed the risks, you had to navigate deeper into the FDA site to get some idea of the real magnitude of the risk, and the information was still not presented in a way that non-medical people could easily grasp.
This is no easy task. Which description better conveys the risk of liver damage to a patient taking statins: saying it’s a one-in-500,000 chance, or saying it’s very rare? The first is a nice, precise number, but people are very bad at evaluating these sorts of numbers.
We tend to significantly overestimate risks, especially if we have no control over them. Would it be better to say that for every person who develops liver failure, 9,804 people avoid death or a crippling heart attack?
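One way to arrive at a figure like the 9,804 above is to set the one-in-500,000 liver failure risk against the number of people who must be treated to prevent one death or serious heart attack (the "number needed to treat", or NNT). The NNT of roughly 51 used below is an illustrative assumption chosen to reproduce the article's figure, not a value stated in the text:

```python
# Hedged sketch of the benefit-vs-harm framing. The NNT of ~51 is an
# assumption for illustration, not a figure given in the article.

liver_failure_risk = 1 / 500_000   # "very rare": about 2 cases per million
nnt = 51                           # assumed: treat 51 people to prevent 1 event

# In a cohort of 500,000 treated people we expect about one liver failure...
cohort = 500_000
# ...while roughly cohort / nnt deaths or serious heart attacks are prevented.
events_prevented = cohort / nnt
print(f"Events prevented per expected liver failure: {events_prevented:.0f}")
```

Framing the same data as a ratio of people helped to people harmed is one of the techniques the risk-communication literature suggests lay readers find easiest to grasp.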
There are many studies on risk perception and risk communication. Their findings should certainly have been used in this case. What was needed was a simple, plain language statement that the risk of liver failure was lower, and the risk of diabetes was slightly increased, with links to clear, plain language explanations of how those risks come about.
It would also help if the media could be more careful with their headlines.