July 2014

Quality Metrics — Forward-Looking Organizations Are Developing Their Own Performance Measures
By David Yeager
Radiology Today
Vol. 15 No. 7 P. 12

The move to value-based care is one of the most highly anticipated changes wrought by the Affordable Care Act, but “value” is also one of the most poorly defined terms in health care today. That lack of a definition is one reason why value-based care remains largely on the drawing board. Another is that the technology tools to measure value are still being developed.

To date, radiology has been on the periphery of the value-based care movement. While that’s not likely to change soon, it eventually will, so the question becomes: What do radiology departments and practices need to do to be ready?

Vijay M. Rao, MD, FACR, a professor and the chair of the department of radiology at Thomas Jefferson University Hospital in Philadelphia, says the sheer number of potential performance measures is daunting. For example, The Advisory Board Company, a consulting firm, lists nearly 300 radiology-specific measures.

Perhaps of most interest to radiology departments and practices, though, is the Centers for Medicare & Medicaid Services’ (CMS) Physician Quality Reporting System (PQRS), which started as a voluntary program but will begin assessing penalties in 2015. The PQRS includes numerous measures for radiology. The CMS also collects some radiology data for its Hospital Outpatient Quality Reporting program, which feeds the publicly available Hospital Compare website.
With all of these overlapping programs and measure sets, it’s hard to know which metrics to track.

Identifying Measures
“It’s not an easy topic,” says Rao, who spoke about quality metrics in May at the Jornada Paulista de Radiologia Conference in São Paulo, Brazil. “Quality metrics have to be picked very carefully by a radiology group or a department so they can accurately measure them and then make a difference at the end of the day.”

Teri Yates, founder and principal consultant of Accountable Radiology Advisors, who advises several companies in the health care industry on radiology-related matters, agrees that developing quality metrics will be a crucial task for radiology groups. Although accountable care organizations (ACOs) currently are prioritizing high-value targets such as diabetes and heart disease, and many ACOs still are paying radiology on a fee-for-service basis, she says now is the time to start proactively tracking quality metrics.

To do that, health care organizations need to assess the data and technical resources that are available to them. They also need to be specific about which clinical questions they want answered. In the era of Big Data, it’s easy to focus on the trees and miss the forest.

“One of the big challenges that I see as we try to move down this path of analytics is we have so much data available to us. [But] do we have anybody who works for us who understands how to define quality, find the data in the systems, analyze it, and then knows what to do with it?” Yates says. “Organizations need a very strong vision at the beginning of what questions they are trying to answer and what good performance really looks like.”

Rao says studying the life cycle of a radiology image can help identify a radiology department’s or practice’s most important metrics. She says that at every step of that cycle, there are opportunities to develop specific quality measures. Important areas of interest can be divided into “buckets” to make the process more manageable. She says measures of physician competence, patient satisfaction, operational efficiency, patient safety, and study appropriateness can provide valuable information, and she recommends choosing two or three measures from each bucket.
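
As a rough illustration of that bucket approach, here is a minimal Python sketch. The bucket names are taken from the article, but the individual measures inside each bucket are invented placeholders rather than a recommended set:

```python
# A minimal sketch of the "bucket" approach to choosing quality metrics.
# Bucket names follow the article; the example measures inside each bucket
# are hypothetical placeholders, not an endorsed measure set.
QUALITY_BUCKETS = {
    "physician competence": ["peer review discrepancy rate", "addendum rate"],
    "patient satisfaction": ["survey score", "complaint rate"],
    "operational efficiency": ["report turnaround time", "exam backlog"],
    "patient safety": ["radiation dose outliers", "adverse event rate"],
    "study appropriateness": ["decision-support override rate"],
}

def choose_measures(buckets, per_bucket=3):
    """Pick up to `per_bucket` measures from each bucket, per Rao's advice
    to choose two or three measures per area."""
    return {name: measures[:per_bucket] for name, measures in buckets.items()}

if __name__ == "__main__":
    for bucket, measures in choose_measures(QUALITY_BUCKETS).items():
        print(f"{bucket}: {', '.join(measures)}")
```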

Appropriateness Criteria
Rao believes appropriateness criteria in particular will become highly important as value-based care works its way into medical practice. She says identifying which studies offer the best clinical value will put radiologists in a position to significantly affect care quality, but that’s easier said than done. “Tracking some of these measures is really difficult. We need to develop tools,” she says. “We need to identify which key performance indicators we want to track and then use the tools to collect the data, manage the data, and drive improvement.”

Yates believes four areas represent quality in radiology. The first is exam appropriateness. Yates says clinical decision support can help determine exam appropriateness, but radiology practices and departments that don’t use decision-support tools can also track variations in exam ordering among referring providers that suggest an atypical pattern. Additionally, they can look at more specific parameters, such as the number of patients who have multiple CT scans in a short period of time without any clear clinical indication.
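
As one way to picture that kind of tracking, here is a minimal Python sketch. It assumes a hypothetical exam-record format (the field names patient_id, provider_id, modality, and ordered_at are illustrative, not a real RIS schema):

```python
from collections import defaultdict
from datetime import timedelta

# Exams are assumed to be dicts with hypothetical keys:
# patient_id, provider_id, modality, ordered_at (a datetime).

def flag_repeat_cts(exams, window_days=30, threshold=3):
    """Flag patients with `threshold` or more CT scans inside a short
    window, the kind of pattern worth reviewing for clinical indication."""
    by_patient = defaultdict(list)
    for exam in exams:
        if exam["modality"] == "CT":
            by_patient[exam["patient_id"]].append(exam["ordered_at"])
    flagged = []
    window = timedelta(days=window_days)
    for patient, times in by_patient.items():
        times.sort()
        for i in range(len(times) - threshold + 1):
            if times[i + threshold - 1] - times[i] <= window:
                flagged.append(patient)
                break
    return flagged

def ordering_rates(exams):
    """CT orders per referring provider: a crude starting point for
    spotting atypical ordering patterns."""
    counts = defaultdict(int)
    for exam in exams:
        if exam["modality"] == "CT":
            counts[exam["provider_id"]] += 1
    return dict(counts)
```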

Yates says safety metrics such as radiation dose and adverse events are also important, as are measurements of report and interpretation accuracy. Peer review and medical outcomes audits, such as those used in mammography, are useful for measuring accuracy. Finally, she says communication metrics, such as turnaround time, critical results reporting, and consultations, should be tracked as well.

Yates adds that making the data user friendly is equally important. She says downloading information to a spreadsheet, which is sometimes done, isn’t the best way to communicate it. Commonly used scorecards, or thumbnail sketches of relevant data points, are a better option, though not optimal, because they’re drawn from historical data. Dashboards that provide real-time feedback on items such as turnaround time are the most effective way to monitor quality metrics, and Yates believes they will become increasingly popular tools.
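
To make the dashboard idea concrete, here is a minimal sketch of a rolling turnaround-time feed, assuming timestamps are available for exam completion and report finalization. It illustrates the concept of a live metric, not any vendor’s product:

```python
from collections import deque
from datetime import datetime

class TurnaroundDashboard:
    """Keep the most recent N reports in memory and expose a rolling
    average: the kind of live number a dashboard can show that a
    historical spreadsheet export cannot."""

    def __init__(self, window_size=50):
        self.samples = deque(maxlen=window_size)  # minutes, newest last

    def record(self, exam_completed: datetime, report_finalized: datetime):
        """Add one report's turnaround time to the rolling window."""
        minutes = (report_finalized - exam_completed).total_seconds() / 60
        self.samples.append(minutes)

    def rolling_average_minutes(self) -> float:
        """Current rolling-average turnaround, or 0.0 if no data yet."""
        return sum(self.samples) / len(self.samples) if self.samples else 0.0
```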

Normalizing Data
With data housed in various systems across the medical enterprise, normalizing data and integrating clinical systems is a must for developing and tracking quality measures. Unfortunately, that level of integration often is missing. Gary Wendt, MD, MBA, a professor of radiology and the vice chair of informatics at the University of Wisconsin-Madison, says effective data integration should be on the mind of anyone who’s in the market for a new PACS. “Can your vendor either supply you with the tools to seamlessly integrate all of the data you need or will they actually develop something that’s tailored to you?” he says. “If they at least provide you with the tools, you can develop something on your own. If the tools aren’t even available, you’re sort of out of luck.”
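
As an illustration of what such normalization can look like, here is a hedged sketch that maps records from two hypothetical source systems into one common shape. The source field names are invented; real RIS and PACS exports vary by vendor:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class NormalizedExam:
    """One common record shape, whatever system the exam came from."""
    accession: str
    modality: str
    completed_at: datetime

def from_ris(row: dict) -> NormalizedExam:
    # Hypothetical RIS export fields; adjust to the actual system.
    return NormalizedExam(
        accession=row["AccessionNumber"],
        modality=row["Modality"].upper(),
        completed_at=datetime.fromisoformat(row["CompletedDateTime"]),
    )

def from_pacs(row: dict) -> NormalizedExam:
    # Hypothetical PACS export fields with a different date format.
    return NormalizedExam(
        accession=row["acc_no"],
        modality=row["mod"].upper(),
        completed_at=datetime.strptime(row["done"], "%Y%m%d%H%M%S"),
    )
```

Once both feeds produce the same NormalizedExam records, the metrics described earlier can be computed once rather than separately for each source system.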

Wendt says facilities’ lack of system integration hampers the development of quality metrics in many ways. For example, most peer review systems aren’t linked to a PACS, which means users need another workstation, another login, and another system to collect data. He says radiology departments and groups need a quality assurance and quality improvement system within their PACS to access quality metrics.

The University of Wisconsin uses a university-developed program that allows automatic quality assurance of certain exam types and protocols. The program also allows the radiology department to perform several functions, including the following:

• collecting data as protocols are updated to validate the protocols;

• performing quality assurance reviews on technologists to ensure that they’re correctly performing exams; and

• evaluating turnaround times on resident preliminary reads.

Without a single interface, those types of functions would be too cumbersome to be useful.
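
As one illustration, the last item on that list, turnaround times on resident preliminary reads, could be computed along these lines. This is a minimal sketch assuming a hypothetical record format, not a description of the university’s actual program:

```python
from statistics import median

# Each read is assumed to be a dict with two datetime values:
# exam_completed_at and prelim_signed_at. These names are illustrative.

def resident_turnaround_minutes(reads):
    """Minutes from exam completion to the resident's preliminary report."""
    return [
        (r["prelim_signed_at"] - r["exam_completed_at"]).total_seconds() / 60
        for r in reads
    ]

def summarize(reads):
    """Median turnaround and case count for a set of resident reads."""
    times = resident_turnaround_minutes(reads)
    return {"median_min": median(times), "n": len(times)} if times else {}
```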

“I don’t think you can start collecting that much data by using separate systems,” Wendt says. “If you have to log in to a separate system to do every different function, it’s practically unworkable.”

Tracking Complications
Allen J. Rovner, MD, a radiologist at Aultman Hospital in Canton, Ohio, agrees with Wendt. Aultman Hospital is an 808-bed facility and a level 2 trauma center that uses Montage Search & Analytics to track various quality metrics. He says that prior to implementing Montage, staff already were tracking standard quality measures such as turnaround times for standard reports, critical results reports, certain types of procedures, and technologists arriving when called in. They also were tracking complications of certain procedures, such as headaches following myelograms, pneumothorax following thoracentesis, and bleeding following a biopsy.

Now, Rovner says, they’re beginning to use the data-mining software to take a deeper look at certain clinical details. One project currently under way is tracking hip fractures that aren’t identified on initial X-rays. Rovner says the graying population in the region Aultman Hospital serves is driving up the incidence of hip fractures. Because many people are brought in late in the evening, it can be difficult to determine how to manage a patient with hip pain and a negative X-ray, especially if there isn’t an MRI technologist on site.
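
As a toy illustration of the kind of report search such a project involves, here is a sketch that pairs negative hip X-ray reports with later fracture findings for the same patient. The simple regex matching stands in for Montage’s actual search capabilities, which the article doesn’t describe:

```python
import re

# Reports are assumed to be dicts with hypothetical keys:
# patient_id, date (comparable), and text (the report body).
NEGATIVE_HIP = re.compile(r"no (acute )?fracture", re.IGNORECASE)
HIP_EXAM = re.compile(r"\bhip\b", re.IGNORECASE)

def candidate_missed_fractures(reports):
    """Pair each negative hip X-ray with any later fracture finding for
    the same patient, yielding cases worth a second look."""
    negatives, fractures = {}, {}
    for r in sorted(reports, key=lambda r: r["date"]):
        text = r["text"]
        if HIP_EXAM.search(text) and NEGATIVE_HIP.search(text):
            negatives.setdefault(r["patient_id"], r["date"])
        elif HIP_EXAM.search(text) and "fracture" in text.lower():
            fractures.setdefault(r["patient_id"], r["date"])
    return [
        pid for pid, neg_date in negatives.items()
        if pid in fractures and fractures[pid] > neg_date
    ]
```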

“So what’s the next step? Well, you can keep them overnight in the ER, but the ER’s way too busy, and they don’t have the time or the beds or the manpower to monitor people,” Rovner says. “We’ve decided that we’re going to look at the issue of what the risk is for patients who come in and have a negative X-ray and what our risk is as a health care provider. Sometimes they have an undisplaced fracture and they can walk on it, and they are, unfortunately, triaged home. That can be a big problem.”

Aultman Hospital also is looking at utilization criteria. Rovner says the current trends to lower radiation dose and eliminate unnecessary studies are here to stay, and the Affordable Care Act will accelerate these trends. Aultman Hospital is working with its emergency department physicians to get a better handle on appropriate study utilization. Rovner says data-mining software makes it much easier to follow patients and monitor outcomes.

Additionally, Aultman Hospital has been looking for discrepancies in reports and soon will begin looking for instances of pulmonary emboli in CT angiograms. Aultman physicians have noticed a significant number of pulmonary emboli in younger patients who had CT angiograms and want to keep tabs on the trend. Rovner says this is one example of how analytics can improve the quality of care, but he believes the possibilities for better care are vast.

Tear Down the Silos
Rovner says advanced analytics software allows physicians to do longitudinal studies with a mouse click. On the horizon, he sees the possibility of improving overall population health, but doing that requires cooperative data mining. Sharing data among institutions and organizations would allow access to a large enough patient population to accurately assess outcomes and develop relevant guidelines.

Rovner says cooperative data mining wouldn’t be technically challenging, but current privacy regulations make it difficult. He says physicians currently are operating in data silos to patients’ detriment. To improve care, Rovner says it would be helpful for physicians to know how other physicians are treating patients; the more complex the medical condition, the more important data sharing becomes.

“We’ve reached an age where the technology is there—the software, the computer power, the networking, it’s all available. We should be able to aggregate our clinical experience,” Rovner says. “I really think it’s time for medicine to get into the data world.”

Increased data sharing also may help radiology prove its value in a more measurable way. Because patient care is highly complex, it’s difficult to determine how radiology affects overall outcomes. The most common strategy is to look at the cost of an exam, but Yates points out that noninvasive radiology exams can eliminate the need for more invasive—and typically expensive—procedures. She says to begin quantifying radiology’s value, performance indicators will be important, but people in the health care system also will need to take a look at how radiology affects the care cycle.

Yates says she has advised one of her clients, Medicalis Corporation, to take a closer look at this question, and the company has developed a few reports to track certain measures. One report focuses on radiology’s impact on emergency operations and highlights variations in ordering patterns. Another tracks turnaround times on studies required for patient discharge. Although these sorts of efforts are in the early stages, Yates says more work must be done in this area to put radiology in its proper health care context and improve overall care.

“Clearly, radiology impacts what happens with the rest of the patient experience, but determining how it affects the care cycle is a very advanced frontier,” Yates says. “It’s very complex, but it’s the right thing to do.”

— David Yeager is a freelance writer and editor based in Royersford, Pennsylvania. He primarily writes about imaging IT for Radiology Today.