By Diane Evans
Publisher, MyHIPAA Guide
If we “fact check” presidential candidates, why not also fact check Medicare? After all, Medicare’s newly announced physician-reimbursement plan will affect the health care coverage of more than 55 million Americans and will determine the kind of treatments that Medicare beneficiaries receive.
If you are enrolled in Medicare, this new paradigm means your particular treatment may be determined by statistics, presumably showing what has worked best for others like you. Of course, the success of such a data-driven approach depends on the quality of the data.
So, the very first question is: In moving away from fee-for-service payments to doctors in favor of so-called “value-based care,” will Americans benefit as Medicare promises? Let’s start fact checking!
In an open letter accompanying last week’s announcement, acting Medicare chief Andy Slavitt describes the government’s new approach as a “more modern, patient-centered program . . . promoting quality patient care while controlling escalating costs.’’ He further notes that with healthcare providers working cooperatively with Medicare, “we can all make real progress in improving the delivery of care in our country.”
In fact: The new system will award higher pay to doctors who base their medical decisions on “best practices” determined by statistics. However, health IT experts—including some in the federal government—warn that the technology simply isn’t available yet to do the high-quality data analysis necessary to standardize patient treatment plans. By the government’s own estimate in a report titled “Capturing High-Quality Electronic Health Records (EHR) Data to Support Performance Improvement,” the nation won’t have the advanced technology for such comprehensive data analysis until 2024.
In an Executive Summary explaining the new system, Medicare says that doctors can qualify for higher pay based in part on dispensing patient care according to models developed by private insurers or Medicaid programs. This is presented as a means of achieving higher quality care.
In fact: In a 2011 study, McKinsey & Company refers to transparency as a key precondition to improved healthcare delivery. Yet commercial entities (such as big insurers) don’t readily reveal data for the sake of transparent analysis of best health care treatments. Says McKinsey: “Even in the United States, where health care data is abundant, political and commercial considerations have hindered attempts to use public reporting to drive outcome improvements.”
In addition, the government’s own report, “Capturing High Quality” (referenced above), points to the risks of basing care decisions on unsubstantiated data. In one example, the report cites a federally funded project in Rhode Island that set out to improve the health of diabetes patients. However, researchers discovered data quality problems, including missing or inaccessible data and wide, inexplicable variations in outcomes.
The “Capturing High Quality” report concludes that “as the industry moves toward value-based reimbursement — reimbursement based on quality and cost measures — improving the quality of the data used for measurement is imperative.”
In his letter, Andy Slavitt explains plans for an information superhighway in healthcare, saying the focus is on “measures that support hospitals and physicians safely and securely exchanging information.”
In fact: Earlier this month, the federal agency responsible for healthcare technology hosted a webinar to address what the agency called the “important safety topic” of EHR usability. The webinar featured Dr. Andrew Gettinger, Executive Director of the federal Office of Clinical Quality and Safety, and leaders of the Pew Charitable Trusts, which has studied EHR usability. The issues covered in the webinar are summed up on Pew’s website and include this warning:
“Although the United States has invested tens of billions of dollars to encourage providers to adopt electronic health records, many clinicians have found that these systems have poor ‘usability.’ EHRs can put patients at risk of medical error, do little to enhance clinical care, and increase the time clinicians spend documenting patient care. Indeed, one study found that 15 percent of physicians reported that their EHR had caused a potential medication error within the past month.”
Other evidence supports Pew’s findings. Examples include:
- Earlier this year, the Journal of AHIMA reported on survey results indicating widespread problems in accurately matching individuals with their healthcare records. Duplicate records are common, creating a greater likelihood of errors in treating patients.
- Last year, in a joint letter to the U.S. Department of Health and Human Services, 36 professional associations raised questions about the very security of patient information contained in EHRs. The associations cited concerns about poorly functioning EHRs resulting in “medical record errors, inaccurate documentation, lack of interoperability, slow performance, lost patient information, and safety concerns.”
In the case of Medicare, the point of this exercise goes beyond half-truths and pertinent omissions. The issue here is one of medical ethics. If America lacks the technology for standardized patient care based on statistical analysis, then premature demands to move in this direction put the health of Americans at unnecessary risk. As in the Rhode Island project, the statistics used to determine patient treatments may be flawed. And yet doctors stand to make more money by playing along – dispensing care according to statistical outcomes that may or may not be valid. All the while, physicians who buck the system face financial penalties.
In his letter last week, Andy Slavitt extols Medicare for “becoming more open, transparent and responsive (and) committed to paying close attention to the impact of our policies on care delivery.”
Really? Mr. Slavitt, please look Americans in the eye and explain.