Another Fail for Electronic Health Records, Another Lesson About Implementing Guidelines

By Chuck Dinerstein, MD, MBA — Feb 11, 2022
Electronic Health Records, those over-hyped, energy- and time-sucking billing algorithms sold to the American public and healthcare professionals as THE answer, continue to search for a valuable clinical role. Clinical decision support tools are algorithms that scan our data and send timely reminders to our physicians to “Do the Right Thing.” While they may work well in the Ivory Towers, and even that is debatable, when taken out for a real-world test, clinical decision support is nowhere near ready for prime time.
Image courtesy of Peggy_Marco on Pixabay

Before moving on, we have to give real credit to the authors of this study, which, spoiler alert, shows no benefit from a clinical decision support tool. They could find little positive to report, yet they published their significant negative results, which remains a rarity. Now, on to the “science.”

The researchers took a clinical decision support (CDS) tool that had worked well in the academic setting where it was developed and applied it to 70 community health centers using the same electronic health record from Epic [1]. The CDS provided timely reminders to healthcare personnel of cardiovascular guidelines not being met by the patient in front of them. The idea was that the reminders would prompt physicians, nurse practitioners, and physician assistants to “get with the guidelines.” [2]

The CDS, also named the “CV Wizard,” identified patients aged 40 to 75 with at least one uncontrolled cardiovascular risk factor or an estimated “high reversible risk.” When the wizard identified such a patient, it would alert the clinician, who in two clicks could view and print personalized care recommendations to reduce that patient's cardiovascular risk. The hoped-for outcome was a reduction in total and reversible risk a year later. Total risk is the patient's overall estimated risk of an adverse cardiovascular outcome over the next 10 years; reversible risk is the portion of that risk that could be removed by bringing modifiable risk factors under control.
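
To make those two outcome measures concrete, here is a minimal sketch, assuming a toy risk model: the coefficients, goal values, and alert threshold below are invented for illustration, and the actual CV Wizard uses a validated risk engine, not this one. The point is only the relationship between the quantities: total risk is estimated from the patient's current risk factors, and reversible risk is how much that estimate falls when the modifiable factors are set to their goal values.

```python
# Hypothetical illustration only: weights, goals, and the alert threshold are
# invented; the real CV Wizard relies on a validated cardiovascular risk engine.

def ten_year_risk(sbp, ldl, a1c, smoker):
    """Toy 10-year cardiovascular risk estimate (made-up coefficients)."""
    risk = 0.05                           # baseline risk
    risk += 0.001 * max(sbp - 120, 0)     # systolic blood pressure above goal
    risk += 0.0005 * max(ldl - 100, 0)    # LDL cholesterol above goal
    risk += 0.01 * max(a1c - 7.0, 0)      # hemoglobin A1c above goal
    risk += 0.04 if smoker else 0.0       # current smoker
    return min(risk, 1.0)

def reversible_risk(sbp, ldl, a1c, smoker):
    """Portion of risk removable by bringing modifiable factors to goal."""
    current = ten_year_risk(sbp, ldl, a1c, smoker)
    at_goal = ten_year_risk(120, 100, 7.0, False)
    return current - at_goal

def should_alert(age, has_uncontrolled_factor, reversible, threshold=0.10):
    """Toy version of the alert rule: age 40-75 with an uncontrolled risk
    factor or a 'high' reversible risk (threshold value is invented)."""
    return 40 <= age <= 75 and (has_uncontrolled_factor or reversible >= threshold)

# Example patient: uncontrolled blood pressure, LDL, and A1c, and a smoker.
total = ten_year_risk(sbp=150, ldl=160, a1c=8.2, smoker=True)
reversible = reversible_risk(sbp=150, ldl=160, a1c=8.2, smoker=True)
print(f"Total 10-year risk: {total:.1%}")              # 16.2%
print(f"Reversible risk:    {reversible:.1%}")         # 11.2%
print(f"Alert? {should_alert(58, True, reversible)}")  # True
```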

  • Forty-two community health centers across eight organizations with 11,159 patients served as the treatment group.
  • Twenty-eight centers from seven organizations with 7,419 patients served as controls.
  • Mean age was 58.8 years; slightly more women than men; 26% Hispanic, 18% Black, 45% White.
  • The treatment group included more urban centers and fewer White patients.
  • The CV Wizard was used 34.7% of the time on initial evaluation. Overall, across 92,000 clinical encounters, it was used 19.8% of the time. The reminders appeared automatically and only required two clicks to print and share them – not an exceptionally high hurdle.
  • In the overall population, the CV Wizard did not alter risk in either the intervention or control arm. Among those at greater risk, there was evidence of significant improvement in 10-year risk in the control group. The same held for reversible risk; the control group did better.
  • Among patients whose alerts were viewed or printed, those at greatest risk (more than 20%) did see a risk reduction (1.3% compared with 0.3% in the controls). If the CV Wizard was used more than once, the 10-year risk in the treatment group improved by 1.7%, compared with 0.1% in the controls.

“Although this risk reduction was modest (absolute improvement of 4.4% vs. 2.7%), if maintained over time it could represent a population-level reduction in cardiovascular events.”

The most important lessons often come from our failures. Two jump out here, and they are entangled. The CV Wizard was used 80% of the time in the institution where it was developed; in the community health centers, at most about a third of the time. More importantly, the alerts went to the screens of the “rooming staff,” the individuals who bring patients from the waiting area into the examination rooms and take basic vital signs – nurses’ aides, care technicians, rarely a clinician. So the alerts rarely reached their intended audience: the physicians, nurse practitioners, and physician assistants. This would seem to be a "duh" mistake, but it speaks to how different each community health center's workflow was from the ivory tower's.

“Factors that affect point-of-care CDS use include workflow integration, competing clinical demands, number of clicks to access the CDS, and clinician confidence in the validity of the advice provided. …The present study included numerous care organizations in which heterogeneity in rooming protocols impeded training and sustained high CDS use.”

That is how academics acknowledge that every care setting differs in its workflow and has its own capabilities and resource limitations. One size will not fit all. Improving the health of our patients is more like hand-to-hand combat than carpet bombing. The magical belief that a clinical decision support tool can be developed in academia and then quickly applied to the real world is a fantasy. We see this in our response to COVID, just as we see it in this study. The difference, of course, is that we do not ascribe political beliefs to the results presented in this study. We do for COVID, and do so to our detriment.

[1] Epic Systems Corporation, or Epic, through its health record software, holds more than half of the patient records in the US. It is the predominant vendor of electronic health records.

[2] Parenthetically, the same administrators who bought these EHRs to improve their bottom lines like CDS systems because they can be tuned to meet the metrics of insurers and CMS, earning hospital systems additional rewards for care measures such as checking hemoglobin A1c levels.

 

Source: Effect of Clinical Decision Support at Community Health Centers on the Risk of Cardiovascular Disease: A Cluster Randomized Clinical Trial. JAMA Network Open. DOI: 10.1001/jamanetworkopen.2021.46519


Chuck Dinerstein, MD, MBA

Director of Medicine

Dr. Charles Dinerstein, M.D., MBA, FACS, is Director of Medicine at the American Council on Science and Health. He has over 25 years of experience as a vascular surgeon.
