Antibiotic resistance is one of the greatest challenges facing mankind. We risk a future in which common infections and minor injuries once again prove fatal – plus longer hospital stays and higher medical costs. Some infections are already no longer treatable with current drugs. Around 700,000 people worldwide die each year as a result, and some studies predict 10 million deaths a year by 2050 – more than currently die from cancer.
To avoid this “antibiotic apocalypse”, everyone acknowledges we need to limit the quantities of antibiotics people are taking. One key strategy to achieve this is antimicrobial stewardship – putting systems in place in hospitals and doctors’ surgeries that restrict antibiotic prescriptions by paying more attention to the type, timing, dosage and duration of courses of treatment.
With the UK currently close to completing a five-year implementation plan across the health service, and various other countries also at different stages of development, stewardship is undoubtedly proving effective. There is growing evidence that interventions by managers improve best practice and reduce the length of time that patients spend on antibiotics, without increasing mortality rates.
When we analysed the data, however, it became clear that there are also important lessons that need to be learned. The wider effects of stewardship are not well enough understood. The majority of studies into the effectiveness of tighter antibiotic restrictions have only focused on their intended outcome – cutting the quantities of drugs being prescribed.
Few studies have looked at other consequences, and sometimes these are not easy to predict. Even interventions that reduce the use of antibiotics can lead to unwelcome effects elsewhere in the system.
Knowns and unknowns
Since many consequences from tighter antibiotic restrictions are predictable, it’s important we start monitoring them from the outset. Measures commonly involve, for example, requiring frontline medics to get prior permission from a more senior colleague to make sure they’re prescribing the right antibiotic.
Another example is introducing stop orders, which end a course of treatment on a particular date if the clinician hasn’t specified one from the outset. Steps like these can interrupt or delay treatments, but we know little about the extent to which they do.
Some restrictions would simply be too risky to justify. When patients are showing symptoms of infectious pneumonia, for instance, it is common practice to start them on antibiotics before the diagnosis has been confirmed. People who turn out not to be infected will sometimes end up taking unnecessary antibiotics. But since the risks of withholding treatment outweigh the costs of unnecessary prescribing in a potentially life-threatening condition like this, such over-prescribing is difficult to avoid.
But if this kind of problem is foreseeable and needs to be exempted from any stewardship system, our research has also thrown up consequences that couldn’t have been anticipated. In 2009, for example, the Scottish government aimed to cut rates of the Clostridium difficile bug, which causes stomach pains, sickness and diarrhoea, by 30% over two years. This effort involved changing the type of antibiotic normally given to patients prior to various types of surgery to protect them from post-surgical infections.
One result was that more orthopaedic patients went on to develop acute kidney injuries – ten more cases per month in one hospital. The managers setting up the stewardship system did not realise that changing the antibiotic could harm these patients’ kidneys. The patients ended up staying longer in hospital and needing more clinical interventions as a result.
Unexpected consequences can also be positive sometimes. One example is a study of over 10,000 babies thought to be at risk of sepsis, a potentially deadly infection in the blood. The study asked whether dispensing with the routine diagnostic blood test on these babies and relying only on other clinical examinations would delay the point at which infected babies could be started on a course of antibiotics.
If so, it would have meant those babies needing more antibiotics for longer, and the blood test would therefore have been a necessary means of keeping prescriptions down. Instead, however, the study found that dropping the test did not delay treatment – if anything, the infants could be given antibiotics earlier, so reducing the need for prescriptions.
Pause for reflection
This hopefully gives a glimpse of the complexity in this area, and the limitations of simply looking at cause and effect. As part of our research, we have worked with practitioners around Scotland to understand how to monitor and predict consequences more effectively.
We’ve now produced a framework to help managers to identify risks from the outset. It promotes the idea of an “improvement pause” to review the new system after a few months and make any necessary adjustments – hopefully making all the professionals involved more confident that the changes are benefiting patients and families. Unpleasant surprises in particular need to be carefully evaluated to see if any harm being caused is enough to stop or adapt the intervention.
The point is that to protect patients, all outcomes associated with changes to antibiotic prescriptions need to be monitored carefully. We’re not seeing nearly enough of this happening after systems are put in place. While interventions are vital to protect us all from an antibiotic apocalypse, they still need to be balanced against the needs of patients today.