
Better use of data would pull intensive care out of the 80s


If you’ve ever been to an intensive care unit you may have noticed that it is full of monitors. Each one is critical in assessing a patient’s health, from the electrical conduction of their heart to the oxygen concentration of their blood. What is rarely asked, and very much should be, is: what happens to all the data produced by these monitors?

In the past, there was nothing to do with it. Massive hard drives didn’t exist. The cloud didn’t exist. But now that technology allows us to store huge amounts of data, this information should no longer be thoughtlessly discarded. Yet, for the most part, it still is. Wasted data is a wasted opportunity. Proper use of this information could teach monitoring systems the dynamics associated with patient deterioration, alerting the attending physician when deterioration is detected. That would represent a considerable coup in the fight to free up the time of hospital staff while maintaining services for patients.

More data means better care

As far back as 1981, physicians and data scientists at George Washington University published the Acute Physiology Score. This was a severity of illness score aimed at the general intensive care unit population. The hope was that this score could act as a triage metric of sorts. Upon arriving in an intensive care unit, clinicians could be directed to those with a higher score first, ensuring care is delivered more quickly to those who need it the most. However, data was sparse and difficult to store, and so a compromise between model complexity and efficacy had to be made. What resulted was a system that was not sensitive enough to be used on an individual patient.
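To make the idea of a triage score concrete, here is a deliberately simplified sketch. The real 1981 Acute Physiology Score used validated variables and weights; the vital signs, normal ranges and point values below are invented purely for illustration.

```python
# Hypothetical, simplified severity-of-illness score for illustration only.
# Real scores (such as the 1981 Acute Physiology Score) use validated
# weights; the variables and ranges here are invented assumptions.

NORMAL_RANGES = {
    "heart_rate":  (60, 100),    # beats per minute
    "systolic_bp": (90, 140),    # mmHg
    "temperature": (36.0, 38.0), # degrees Celsius
}

def severity_score(vitals):
    """Award one point for each vital sign outside its normal range."""
    score = 0
    for name, value in vitals.items():
        low, high = NORMAL_RANGES[name]
        if value < low or value > high:
            score += 1
    return score

# Triage: direct clinicians to higher-scoring patients first.
patients = {
    "A": {"heart_rate": 72,  "systolic_bp": 120, "temperature": 36.8},
    "B": {"heart_rate": 130, "systolic_bp": 80,  "temperature": 39.1},
}
order = sorted(patients, key=lambda p: severity_score(patients[p]),
               reverse=True)
```

The compromise the article describes is visible even here: a coarse score like this separates populations on average, but is far too blunt to say anything reliable about one individual patient.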

Since then, the idea that algorithms could never be used to assess the health of an individual patient has stuck. Technology has evolved, as have methods of assessing physiological measurements from an individual patient, yet the methods used in practice have scarcely changed since 1981.

Modelling the severity of illness using algorithms can help doctors and families make decisions about end-of-life care. If a clinician is able to tell a patient’s family with confidence that, unfortunately, it is extremely unlikely that a patient will survive even with radical treatment, it eases the discussion about palliative care. This side of healthcare is an emotive subject and there is, understandably, a sense that human doctors are best placed to deal with distressed families. But doctors regularly base their assessments on past experiences and the best data available to them on a given condition. Providing them with detailed information based on algorithms therefore doesn’t have to replace the human touch; it can simply make for more informed decision making.

What really makes a good hospital?

This type of monitoring can also make assessments of hospital performance much more realistic, which was in fact the primary use of severity scores in the past. Intuitively, finding the hospital that provides the “best” care was thought to be as simple as finding the one with the lowest patient mortality. However, this neglects the demographics of the patients admitted to the hospital, often referred to as the case-mix.

While one hospital may cater mainly for the terminally ill, another could be a hotspot for trauma accidents from a nearby highway. The mortality rates from these hospitals will differ wildly, but this is independent of the level of care provided at that institution. Risk-adjusting is the practice of estimating, given the patients admitted, how well a hospital should perform on average, and performance is often measured as the percent mortality or length of stay of patients admitted. If the hospital performs better than expected, you could then look for beneficial policies which exist in that hospital and disseminate these to other care providers.
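One common way to express this comparison is a standardised mortality ratio: observed deaths divided by the deaths expected from the case-mix. The sketch below assumes the predicted risks come from some fitted risk model; the numbers themselves are invented for illustration.

```python
# Sketch of risk adjustment via a standardised mortality ratio (SMR).
# Predicted risks would normally come from a model fitted to case-mix
# variables (age, diagnosis, severity score); these numbers are invented.

def smr(observed_deaths, predicted_risks):
    """Observed deaths divided by the deaths expected from the case-mix."""
    expected = sum(predicted_risks)
    return observed_deaths / expected

# Hospital X admits much sicker patients (higher predicted risks)...
risks_x = [0.9, 0.8, 0.7, 0.6]   # expected deaths = 3.0
# ...than hospital Y.
risks_y = [0.1, 0.1, 0.2, 0.2]   # expected deaths = 0.6

# Same raw mortality (2 deaths each), very different adjusted performance:
smr_x = smr(2, risks_x)  # below 1: better than expected
smr_y = smr(2, risks_y)  # above 1: worse than expected
```

An SMR below one suggests the hospital is outperforming expectations for its case-mix, which is exactly the signal one would use to hunt for beneficial policies worth disseminating.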

Embracing the black box

For all these reasons, care providers need to think more about the opportunities presented by the automated monitoring of patient health. Though doctors have traditionally not been comfortable using “black box” methods in care, more and more studies are being undertaken to evaluate the improvement of care possible when leveraging the volumes of data recorded in the hospital. These studies range from early alerting of physicians to patients requiring rapid response therapy to prediction of mechanical ventilation weaning time. Ensuring that these studies transition from the exceptions that they currently are to common practice requires the NHS and its employees to embrace the fact that optimal clinical care for the complex patient requires complex algorithms. Only then will we realise the true potential of data-driven health care.
