Soapbox

American Health Care Has a Diagnostics Problem

By Andrew Lacy
The first priority of any healthcare system should be the well-being of patients. At present, the U.S. healthcare system is failing to target low-hanging fruit that could drive better outcomes for patients and, by catching illnesses earlier, ultimately lower costs.

As everyone knows, America has among the world’s highest healthcare costs, spending $4.1 trillion per year—about $12,500 per person, or three times the OECD average. But another aspect of our health system gets far less attention: our focus on reactive treatment instead of proactive diagnostic and preventive care.

The truth is that our health system is currently designed to treat existing conditions, rather than to minimize or prevent future health problems. People who lack insurance often rely on emergency services, which by definition treat immediate health problems. Even people with excellent insurance, though, often find it’s far easier to get treatment than preventive care.

That’s a problem, because the innovations that promise to have the biggest impact on health outcomes in coming years largely focus on diagnostics or depend upon early detection to turn serious or terminal illnesses into manageable conditions. To put U.S. healthcare back on the right track, we need to focus not just on lowering costs and ensuring access to care, but also on changing the kinds of care that people have access to.

The Economics of Diagnostics

There are more than 4,000 tests on the market today that can detect illnesses early, before they become life threatening. From blood tests that detect fragments of cancer DNA to advanced imaging that spots early-stage Alzheimer’s through brain-volume changes and retinal scans that can reveal Parkinson’s, we now have a wealth of tools at our disposal to improve health outcomes.

Unfortunately, systemic barriers are holding back adoption of these new technologies. One is simple economics: insurers are profit-driven, and there is little financial incentive for them to spend money now to prevent health problems that may only emerge a year or a decade from now. After all, with 55% of Americans considering leaving their jobs, a patient may well have switched insurers by the time a health problem appears.

The result is that diagnostic care tends to be overlooked and underfunded. Once you spot an illness, you have to treat it, potentially at enormous cost over years or decades. Allow a chronic condition to go undetected until the symptoms become obvious, and you might only have to pay for a few months of relatively cheap palliative care.

Sound cynical, or even shocking? Welcome to the U.S. health system. It took insurers decades to cover mammograms following their development in the 1960s, and it wasn’t until legislatures issued mandates that women finally gained consistent access to this basic and highly effective diagnostic tool.

Barriers to Availability

But diagnostic and preventive care isn’t held back solely by insurers’ self-interest. Physicians are often resistant to the broad rollout of diagnostic tools that can be used effectively by nonmedical professionals or even by patients themselves. While there are legitimate reasons to ensure that clinical decisions are made by properly trained and licensed professionals, this friction makes diagnostics more costly and harder to access for many Americans.

In a similar vein, it can take years, if not decades, for regulators to sign off on new technologies, and the wait is getting longer. From 2008 to 2013, it took regulators an average of 83.1 months to approve new medical technologies. Between 2014 and 2018, that figure grew to 89.8 months.

Why do regulators take so long to give medical innovators clearance to bring their products to market? It’s partly because they allow the perfect to become the enemy of the good. A new diagnostic intervention might offer a clear path to spotting illnesses early and saving lives, but it will still have to go through years of data-gathering and additional research before it can secure regulatory approval.

That’s frustrating, because diagnostic and preventive tools typically carry few risks. It makes sense for a surgical implant to receive significant scrutiny, for instance, because if it goes wrong, lives are at risk. A diagnostic measure, on the other hand, is typically safe to administer: the bigger question is whether it’s accurate. That isn’t a trivial concern, but it shouldn’t become a roadblock to adoption, either.
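To see why accuracy is not a trivial concern, it helps to run the numbers. The short Python sketch below applies Bayes’ theorem to a hypothetical screening test; the sensitivity, specificity, and prevalence figures are assumptions chosen purely for illustration, not data on any real product.

    # Illustrative sketch: why "accuracy" is subtler than it sounds.
    # All figures below are assumptions for the example, not real test data.

    def positive_predictive_value(sensitivity, specificity, prevalence):
        """Probability that a positive result is a true positive (Bayes' theorem)."""
        true_positives = sensitivity * prevalence
        false_positives = (1 - specificity) * (1 - prevalence)
        return true_positives / (true_positives + false_positives)

    # A hypothetical test that is 99% sensitive and 95% specific,
    # screening for a disease that affects 1 person in 1,000:
    ppv = positive_predictive_value(sensitivity=0.99, specificity=0.95, prevalence=0.001)
    print(f"Chance a positive result is a true positive: {ppv:.1%}")  # about 1.9%

In other words, even a very good test yields mostly false positives when a condition is rare. That is exactly the kind of question validation studies should answer, and it is a tractable statistical question, not a reason to stall adoption for years.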

What Are the Solutions?

To put things right, we need to find ways to drive adoption of a new generation of diagnostic tests. That will require rethinking the way we consider scientific evidence when determining which medical technologies to approve and deploy in our hospitals, labs and clinics.

At present, regulators use “evidence-based” criteria to figure out what to approve. That essentially means that treatments must be proven safe and effective, based on exhaustive study. That’s an important approach that’s entirely suitable to medical innovations where there’s a significant theoretical risk to the patient. For instance, before approving a systemic drug, we need to be completely sure it doesn’t have serious side effects.

But it doesn’t make sense to hold diagnostic and preventive measures that can’t directly harm patients to the same standard. Instead, we need to move toward an “evidence-informed” approach: still anchored in scientific research, of course, but designed to give physicians and caregivers more leeway to exercise their own judgment about what’s in a patient’s best interest.

Consider how this plays out in practice: a new scan that can detect cancers while they’re still treatable might take years to win approval under an evidence-based regime, because truly quantifying the benefit of early detection would require following test and control populations, possibly for decades. It might seem obvious that catching lung cancer at Stage 1 rather than Stage 4 is beneficial, but proving it requires understanding the effectiveness of early versus late treatment, establishing what conditions people might otherwise have died from, and, if insurance coverage is desired, showing that the test is cost-effective for the health system.

An evidence-informed approach, on the other hand, lets doctors draw on an understanding of the mechanism by which an intervention works, then apply common-sense judgment about the benefits likely to accrue to a patient. Instead of waiting around, they could adopt such tools quickly and start saving lives.

Patients Come First

Patient well-being, to return to where we began, should be the measure of any healthcare system. Right now, the U.S. system is leaving low-hanging fruit untouched: chances to deliver better results for patients and, by catching illnesses earlier, to rein in the cost of care.

Diagnostics and preventive medicine are a core part of the healthcare process. It’s time to work deliberately to eliminate the economic and regulatory barriers holding them back, and to ensure that we’re using new medical technologies effectively to optimize outcomes for our patients.
