January 2012
IR Benchmarking — Keeping Up With the Dr. Joneses
By David Yeager
Radiology Today
Vol. 13 No. 1 P. 34
Software helps interventional radiologists share anonymized practice data to assess performance.
Historically, comparing one medical practice to another has been difficult. The number of physicians, patient volume, and services offered provide only hints about how a practice is performing. Geographic and socioeconomic factors sometimes add a bit of clarity, but the overall picture often remains ill defined.
Objective measures can refine a practice’s quality assessment: data on patient outcomes and complication rates are useful on their own, but they become far more valuable when comparable data from other practices are available. To truly gauge quality, physicians need numbers that can be weighed against a broad cross-section of practices; measuring oneself against one’s neighbors is a time-tested evaluation method.
Without a large database of patient encounters, it is difficult to get a sense of what is above or below average. Because there hasn’t been an abundance of this type of information available, individual physicians sometimes choose to track outcomes and complication rates on their own and use the data for self-improvement.
Anthony Eclavea, MD, a vascular and interventional radiologist at Radiology Associates of Appleton in Wisconsin, is one of those physicians. When he first began tracking his data, Eclavea created his own Excel spreadsheets. The search functions were rudimentary, but he could at least track his own progress. However, he was in the dark about what other practices were doing.
“When I had to discuss a complication and justify or not justify whether it was within the realm of care, I had to personally do literature searches to see what data was out there as far as benchmarks of what’s in the literature numbers,” says Eclavea. “But there was no comparison across the board that I could just simply say, ‘Hey, this is what the national average is reported in this national database.’”
Safety in Numbers
Eclavea says data tracking and benchmarking are important for all medical practices but, as with any subspecialty, interventional radiology (IR) practices have quirks that can make data gathering a challenge. Because IR procedures can vary greatly and conditions during a procedure can change dramatically, comparing a single practice or facility to another doesn’t necessarily provide much insight. He says there are pronounced differences between private practices and large academic centers as well as between practices that have small (one to three) and large (six to 10) contingents of interventional radiologists. While larger practices and academic centers are more likely to have streamlined processes that potentially improve their outcomes and lower their complication rates, they’re also more likely to deal with emergency and trauma cases with more negative outcomes and complications. Combining the data from these varied types of practices in a national benchmark database can help to statistically even out fluctuations and produce more accurate numbers.
In addition to aiding quality assurance, benchmark data can help a practice determine if it is ahead of or behind the industry curve. Keeping track of national trends can allow an IR practice to see if it’s doing a lot more or a lot less of a particular procedure. The variance may be an anomaly, may indicate reason for concern, or could simply reflect the practice’s market niche. Whether these scenarios require action depends on the particular needs of the practice.
“But, if you don’t have that data in front of you, then there’s no way you can make an intelligent decision on how to go forward with your practice and how you provide care,” says Eclavea.
Steady advances in medical informatics have taken objective measures of this nature from possibility to practice. Last year, a peer benchmarking application was added to HI-IQ, a software platform designed by volunteers from the Society of Interventional Radiology (SIR) and endorsed by the society. The platform already offered tracking tools for IR practices, and its database is populated by customers who choose to contribute their cases to the pool. HI-IQ retrieves contributed data automatically and anonymizes it by removing all patient- and site-specific information.
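HI-IQ’s internal schema and harvest process aren’t detailed here, so the field names below are hypothetical, but a minimal sketch of that de-identification step, dropping patient- and site-specific fields from a contributed case before it joins the pooled database, might look something like this:

# Minimal de-identification sketch. Field names are hypothetical;
# HI-IQ's actual schema and process are not described in the article.

# Fields that tie a record to a patient or a site and should be dropped
# before the record is contributed to the pooled benchmark database.
IDENTIFYING_FIELDS = {
    "patient_name", "patient_id", "date_of_birth",
    "site_name", "site_id", "physician_name",
}

def anonymize(record: dict) -> dict:
    """Return a copy of a procedure record with identifying fields removed."""
    return {key: value for key, value in record.items() if key not in IDENTIFYING_FIELDS}

if __name__ == "__main__":
    encounter = {
        "patient_name": "Jane Doe",          # removed
        "patient_id": "123456",              # removed
        "site_name": "Example Hospital",     # removed
        "service": "iliac stent placement",  # kept: needed for benchmarking
        "fluoro_minutes": 18.5,              # kept
        "complication": False,               # kept
        "year": 2011,                        # kept
    }
    print(anonymize(encounter))  # only the non-identifying fields remain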
Although the peer benchmarking database was released in 2010, it now covers cases from 2005 to 2011. Customers have access to four types of visual report:
• top 10 services by frequency for the site;
• top 10 services by frequency from the peer benchmarking database;
• fluoroscopy dosage analysis, which compares the site’s minimum, maximum, 25th percentile, and 75th percentile dose ranges to the peer benchmarking database (a simple version of this comparison is sketched after the list); and
• service volume trends, which illustrates trends in service at the site and within the peer benchmarking database.
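The reports themselves are generated within HI-IQ, but the fluoroscopy comparison in the third report comes down to a handful of summary statistics. Purely as a rough sketch with invented dose values, and with no claim to HI-IQ’s actual method, it amounts to computing the same range for the site and for the pooled peer cases:

# Rough sketch of a fluoroscopy dose-range comparison (site vs. pooled peers).
# Dose values are invented and the structure is illustrative only.
from statistics import quantiles

def dose_summary(doses):
    """Minimum, 25th percentile, 75th percentile, and maximum of a list of doses."""
    q1, _, q3 = quantiles(doses, n=4)  # quartile cut points
    return {"min": min(doses), "p25": q1, "p75": q3, "max": max(doses)}

site_doses = [4.2, 7.8, 9.1, 12.4, 15.0, 22.3, 31.7]             # this site's cases
peer_doses = [3.5, 5.9, 8.0, 9.7, 11.2, 14.8, 18.1, 25.4, 40.2]  # pooled benchmark cases

site, peers = dose_summary(site_doses), dose_summary(peer_doses)
for stat in ("min", "p25", "p75", "max"):
    print(f"{stat:>4}: site {site[stat]:6.1f}  vs. peers {peers[stat]:6.1f}")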
So far, HI-IQ has performed two “data harvests” from customers, which netted roughly 100,000 individual patient encounters from about a dozen institutions. Kevin Lauzon, RT(R)(CT), CIIP, software design director for Conexsys, the distributor of HI-IQ, says a third data harvest is planned for early 2012.
Practice Improvement
Reports such as these can help a practice improve its quality and efficiency. If the practice looks at all patients who had more than 30 minutes of fluoroscopy time, it can determine whether any of its radiologists are using higher-than-average doses. If so, the physician (or physicians) can be educated about best practices for reducing patients’ radiation exposure. Alternatively, an individual procedure, such as an iliac stent placement, can be selected and the facility’s radiation dose for it compared with the national average.
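As a rough illustration of that kind of query, with hypothetical field names and invented values rather than anything drawn from HI-IQ itself, a practice could pull its long-fluoroscopy cases and compare each radiologist’s average with the group’s:

# Illustrative query: find cases over 30 minutes of fluoroscopy and compare physicians.
# Field names, values, and the threshold simply mirror the example in the text.
from collections import defaultdict

cases = [
    {"physician": "A", "service": "iliac stent",   "fluoro_minutes": 34.0},
    {"physician": "A", "service": "embolization",  "fluoro_minutes": 12.0},
    {"physician": "B", "service": "iliac stent",   "fluoro_minutes": 45.5},
    {"physician": "B", "service": "venous access", "fluoro_minutes": 8.0},
    {"physician": "C", "service": "iliac stent",   "fluoro_minutes": 29.0},
]

# 1. Cases exceeding 30 minutes of fluoroscopy time.
long_cases = [c for c in cases if c["fluoro_minutes"] > 30]
print(f"{len(long_cases)} case(s) over 30 minutes of fluoroscopy")

# 2. Each physician's average fluoroscopy time vs. the practice average.
per_md = defaultdict(list)
for c in cases:
    per_md[c["physician"]].append(c["fluoro_minutes"])
practice_avg = sum(c["fluoro_minutes"] for c in cases) / len(cases)

for md, minutes in per_md.items():
    avg = sum(minutes) / len(minutes)
    flag = "above practice average" if avg > practice_avg else "at or below average"
    print(f"Dr. {md}: {avg:.1f} min average ({flag})")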
The reports can also be used to explore areas of potential growth. Lauzon says rising or falling trends may signal new opportunities. For example, if a facility experiences a drop in uterine artery fibroid embolization procedures, it may be able to develop a plan to address that situation.
“If they see that trend on the decline, that may be a marketing opportunity for them to target women’s healthcare providers and market themselves as providing that service,” says Lauzon. “By the same token, they can see that, perhaps, they’re doing a lot of volume for venous access. If that’s the case, maybe that’s enough volume to warrant opening up an off-site venous access clinic.”
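The service volume trends report mentioned earlier is what surfaces this kind of shift. As a toy sketch with invented counts, and nothing like a real report’s logic, flagging a rising or falling service can be as simple as comparing year-over-year volumes:

# Toy trend check for service volumes. Counts are invented; a real report would
# draw on the site's and the peer database's actual case volumes.
volumes = {
    "uterine artery fibroid embolization": {2009: 48, 2010: 41, 2011: 33},
    "venous access":                       {2009: 210, 2010: 265, 2011: 340},
}

for service, by_year in volumes.items():
    years = sorted(by_year)
    first, last = by_year[years[0]], by_year[years[-1]]
    change = (last - first) / first * 100
    direction = "rising" if change > 0 else "falling"
    print(f"{service}: {direction} ({change:+.0f}% from {years[0]} to {years[-1]})")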
Eclavea first started using HI-IQ as a fellow at Indiana University School of Medicine. Although the benchmarking capability wasn’t available at that time, the software’s tracking features allowed him to leave Indiana with an electronic and printed record of every procedure he had performed during his fellowship, which in turn helped him get credentialed at other institutions. The record was detailed down to how often he had used a particular catheter in a given artery and how often he had treated each type of lesion or tumor. Later, as chief of radiology at Dwight D. Eisenhower Army Medical Center, he brought in HI-IQ; by documenting the type and volume of procedures performed there, he was able to demonstrate a need for state-of-the-art equipment.
Eclavea has also used HI-IQ in his current practice to secure credentialing for a nurse practitioner with a hospital system that had no previous experience with either IR procedures or nurse practitioners in the subspecialty. In addition, he says the ability to track individual volumes, types of procedures, and individual complication rates is critical to effective quality assurance. Beyond providing an objective point of comparison with other practices, the benchmarking data allow his practice to target its resources and expand services more efficiently.
“You have to always reflect and look at your numbers and always adjust because life is dynamic, just as medical care is dynamic, and if you’re not adjusting, you’re falling behind,” says Eclavea. “But to make an educated decision, you have to be knowledgeable, and that’s based on data evaluation and benchmark comparison.”
Expanding the Project
After the next data harvest, HI-IQ will release an updated set of reports. There are currently no plans to add new report types, but Lauzon says the company will conduct a focus group at SIR 2012 to examine the issue. New reports are not difficult to develop, he says, but the market will decide which data are most useful.
Although HI-IQ provides a framework on which to build, there is still much work to do. While a national IR database would be beneficial in many ways, Eclavea says some hurdles must still be overcome to increase participation. One is the standardization of terms. Although initiatives such as the IR RadLex Project, a joint venture of RSNA and SIR, have aimed at standardizing IR terminology, there is still a significant amount of variation among clinicians.
Another stumbling block is data entry. Many people still use keyboards to enter data and, depending on the level of integration within their systems, the effort involved in providing data to a third party may be high. Lowering that barrier, especially with current technology such as voice recognition, may encourage data sharing.
It’s also important to make complication rates more transparent. Eclavea says many clinicians are reluctant to share complication data, even when the data are anonymized. However, a certain number of complications will accompany any medical procedure, no matter how careful or skilled a clinician may be. The more procedures a doctor performs, especially complex ones, the higher the likelihood of complications. Eclavea says transparency regarding complications is most likely the biggest obstacle to participation. Unfortunately, complication data are also the most useful for improving the quality of care.
“I think it’s important that everybody, collectively, can see, across the United States, ‘Hey, this is what we’re doing in the United States, and this is data that is standardized and collected,’” says Eclavea. “That’s how other societies have been doing it for decades, and I think it’s important that interventional radiology practices get to that point.”
An industrywide database may also help, down the road, with cost reform and reimbursement. Eclavea points out that insurance companies are bundling an increasing number of medical procedures when they reimburse for them, essentially reducing their payment per procedure. Requiring physicians to provide data on quality measures to receive maximum reimbursement could be another way to control payments and monitor quality.
“At some point in time, you never know, Medicaid, Medicare may require that you have quality assurance initiatives that are already in effect and that show that you’re doing a good job,” says Eclavea. “Well, the benchmarks will help you show what the averages are across the nation and, therefore, if your data, relative to the benchmarks, are within the standard range, that’s good enough. But it’s even more powerful to say, ‘My complication rate is lower than the national average.’ How do I know? Because I track it, and I can compare it. And so those are the numbers that are helpful.”
Whether that comes to pass remains to be seen, but it probably wouldn’t be a complete shock to physicians if it did. In the meantime, IR practices can improve their quality of care by tracking their outcomes and complication rates. Sharing that data with their peers would benefit the entire subspecialty.
— David Yeager is a freelance writer and editor based in Royersford, Pennsylvania.