August 2014
Business Analytics — The Tools for Measuring Performance Are Evolving
By Timothy W. Boden, CMPE
Radiology Today
Vol. 15 No. 8 P. 14
Benchmarking a business’ processes and performance metrics to industry best practices rose to prominence in numerous industries in the late 1980s. It’s no coincidence that benchmarking took off shortly after computers got personal and landed on Americans’ desktops. Accessible computing put larger amounts of data in the hands of business owners and managers.
Medical practices grew larger and more complex around the same time. Physicians and their managers pursued benchmarking in an effort to measure performance more objectively. Rapidly, the industry’s appetite for data mushroomed. Professional associations had been collecting data for some time. The American Medical Association (AMA) and the Medical Group Management Association (MGMA) began tracking membership information decades earlier.
By the late 1990s, physicians, practice managers, and hospital administrators awaited the arrival of their favorite annual survey reports with almost Christmas Eve–like anticipation. Several credible reports provided a variety of benchmarking data that helped take some of the guesswork out of answering the question “How are we doing?” Among these reports were the following:
• The MGMA collected voluntarily submitted data (usually from members) regarding operating costs, revenue, provider productivity, and compensation.
• The National Society of Certified Healthcare Business Consultants (formerly the National Association of Healthcare Consultants and the Society of Medical Dental Management Consultants) collected similar statistical information primarily from its professional-consultant community.
• The AMA collected data and published reports that focused especially on physician demographics, specialty distribution, and practice settings.
• Several private companies conducted surveys and published the results in various specialized reports as well. For example, The Health Care Group, a consulting group based in Plymouth Meeting, Pennsylvania, offered reports covering support staff salaries, physician starting salaries, and medical practice goodwill valuations. Similarly, Merritt Hawkins and Jackson & Coker, two large recruiting firms, each offered survey reports with data on physician productivity, compensation, and recruitment packages.
The data in most of these reports came by way of formal surveys with numbers voluntarily submitted by participants who were usually rewarded with free copies of the final summary reports. The time-consuming processes resulted in printed reports based on data that were months old. The MGMA’s annual routine found its survey operations department mailing out survey forms near the beginning of the year, with due dates near the end of the first quarter. The printed reports usually appeared just in time for the association’s annual October meeting, with titles such as Physician Compensation and Production Survey: 2012 Report Based on 2011 Data.
While the major surveys reported useful data for most medical specialties and practice settings, physicians and administrators increasingly wanted more specialty-specific reports, especially for specialties whose organizational structures, business operations, and practice settings differ greatly from those of the typical doctor’s office. Staffing, revenue, and overhead benchmarks for primary care physicians or surgeons have almost no relevance for a radiology group. It seemed as though only the few radiology-specific pages in those voluminous reports had anything worth studying.
Specialty-Specific Data
Recognizing the growing need for specialty-specific data, both AHRA and the Radiology Business Management Association (RBMA) recently entered the benchmarking-statistics market. AHRA’s online product, AHRAdatalynx, presented data from 2006, 2009, and 2011 for users to compare their performance in key areas such as productivity and utilization, compensation and benefits, equipment and usage, financials, turnaround time, and referring physicians. The society is working on a new data program that it hopes to make available in the near future.
RBMA has offered benchmarking survey data reports to its members for 10 years, according to Executive Director Michael Mabry. But the association moved the process online when it introduced DataMAXX in 2012. DataMAXX brought improved metrics with instant access for radiology practice managers, who previously had to wait until the end of each year for printed survey reports.
Live, online data allow DataMAXX to produce customizable reports that help users drill down to benchmark against their peers in similarly sized groups, in comparable practice settings, and in the same geographical region. The interactive process also allows RBMA to test data entered by users and flag questionable statistics.
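RBMA has not published its validation rules, but the basic idea of flagging a questionable submission can be sketched in a few lines of Python; the threshold and figures below are purely illustrative, not the association’s actual checks.

from statistics import median

def flag_questionable(value, peer_values, ratio=3.0):
    """Flag values more than `ratio` times above or below the peer median."""
    m = median(peer_values)
    return value > m * ratio or value < m / ratio

# Example: a practice reports 210 days of charges in A/R while peers cluster near 45 days.
print(flag_questionable(210, [34, 41, 45, 52, 58, 63, 69]))  # True -> flag for review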
DataMAXX works on a subscription basis: Members who contribute their data have free basic access to the portal and its reporting capabilities covering eight key indicators of accounts receivable (A/R) management performance and eight more indicators of imaging center performance. The A/R management indicators include the following; a brief sketch of the arithmetic follows the list:
• adjusted collection percentage;
• days charges in accounts receivable;
• total write-offs as a percentage of gross charges;
• total write-offs as a percentage of adjusted charges;
• bad debt recovery as a percentage of collection agency write-offs;
• billing/collection expense percentage;
• billing/collection cost per procedure; and
• accounts receivable aging percentage over 120 days.
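The report does not spell out the formulas behind these indicators, but most of them follow standard billing arithmetic. Here is a rough Python sketch of two of them, using made-up figures; DataMAXX’s exact definitions may differ.

def adjusted_collection_pct(payments, gross_charges, contractual_adjustments):
    """Payments collected as a share of what was actually collectible."""
    return payments / (gross_charges - contractual_adjustments) * 100

def days_charges_in_ar(ar_balance, gross_charges, days_in_period=365):
    """How many days' worth of average daily charges sit in receivables."""
    return ar_balance / (gross_charges / days_in_period)

# Hypothetical yearly totals for a single practice.
print(round(adjusted_collection_pct(2_400_000, 6_000_000, 3_500_000), 1))  # 96.0
print(round(days_charges_in_ar(800_000, 6_000_000), 1))                    # 48.7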
Imaging center measures include procedure productivity indicators and expense management performance numbers, as follows (a similar sketch appears after this list):
• number of CPT codes billed per imaging machine;
• total relative value units (RVUs) per imaging machine;
• number of CPT codes billed per full-time equivalent technologist;
• total RVUs per full-time equivalent technologist;
• imaging center operating expense as a percentage of total imaging center expense;
• imaging center operating expense per CPT code billed;
• imaging center operating expense per total RVU; and
• imaging center marketing expense as a percentage of imaging center revenue.
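The imaging center measures are similar ratios. A rough sketch, again computed from hypothetical yearly totals rather than DataMAXX data:

# Illustrative totals for one imaging center; all values are invented.
totals = {
    "cpt_codes_billed": 48_000,
    "total_rvus": 112_000,
    "fte_technologists": 12.5,
    "operating_expense": 2_950_000,
    "total_expense": 3_400_000,
    "marketing_expense": 60_000,
    "revenue": 5_200_000,
}

ratios = {
    "CPT codes per FTE tech": totals["cpt_codes_billed"] / totals["fte_technologists"],
    "RVUs per FTE tech": totals["total_rvus"] / totals["fte_technologists"],
    "Operating expense per CPT code": totals["operating_expense"] / totals["cpt_codes_billed"],
    "Operating expense per RVU": totals["operating_expense"] / totals["total_rvus"],
    "Operating expense % of total expense": 100 * totals["operating_expense"] / totals["total_expense"],
    "Marketing expense % of revenue": 100 * totals["marketing_expense"] / totals["revenue"],
}
for name, value in ratios.items():
    print(f"{name}: {value:,.2f}")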
Basic reporting allows users to compare their practices with the 25th, 50th, and 75th percentiles of the database. Deeper analysis and reporting, including historical trend analysis, requires a premium subscription. Practices that have contributed their data can get premium access for $500 per year. (Noncontributing practices can acquire basic access for $500 per year and premium access for $1,000 per year.) To accommodate its members who operate management services for more than one practice, RBMA has developed a vendor portal that allows larger data uploads for multiple practices and reporting from a management services perspective.
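Behind that comparison is straightforward percentile arithmetic. A minimal sketch, using made-up peer values for a single indicator:

def percentile_rank(value, peers):
    """Share of peer observations at or below the practice's value."""
    return 100 * sum(p <= value for p in peers) / len(peers)

# Hypothetical peer values for adjusted collection percentage.
peer_adjusted_collection_pct = [92.1, 93.4, 94.0, 95.2, 95.8, 96.5, 97.3, 98.1]
print(percentile_rank(96.0, peer_adjusted_collection_pct))  # 62.5 -> between median and 75th percentile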
Mabry emphasizes that DataMAXX 2.0 offers a more user-friendly interface than its first offering, and the association continues to refine the product to improve it even more. “This is the era of Big Data,” Mabry says. “With DataMAXX, practice managers have immediate access to live data sets with relevant information to help them improve efficiency. Our goal is to continue to provide financial and operational benchmarks that will propel the profession forward.”
He points to the recently expanded compensation module with 66 different personnel types. It provides salary and benefit information for all major employee positions in the typical radiology practice. Scheduled to roll out this fall, a DataMAXX productivity module will allow radiologists and technologists to compare their work RVU output to that of their peers.
Last year, RBMA’s user base included 109 subscribers, about 40 of which had contributed their practice data. At midyear, this year’s numbers were tracking toward a similar size. The small sample sizes clearly impose some limitations on the statistics; to ensure a statistically useful result, DataMAXX won’t report on any sample of fewer than six.
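The suppression rule itself is simple to express. A sketch of the idea, assuming six is the cutoff as stated and using made-up values:

from statistics import median

MIN_SAMPLE = 6  # the cutoff described above

def benchmark_median(values):
    """Return the sample median, or None when too few contributors exist."""
    if len(values) < MIN_SAMPLE:
        return None  # suppress the benchmark rather than report a shaky number
    return median(values)

print(benchmark_median([41, 45, 52, 58, 63]))          # None (only five contributors)
print(benchmark_median([34, 41, 45, 52, 58, 63, 69]))  # 52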
You can find larger numbers in MGMA’s survey products. The latest report has data representing 857 diagnostic radiology providers and 207 interventionalists. But because radiology is just one segment of a survey that encompasses 60,000 providers working in more than 3,000 groups of all specialties, the reports don’t include many radiology-specific indicators. You won’t find data on turnaround times, for example.
But for benchmarking general overhead costs, provider productivity, and compensation, the larger samples can provide reliable numbers. Practices participating in the surveys have complete access to the survey data for free, including unrestricted use of MGMA’s DataDive software.
DataDive represents the association’s latest iteration of its almost 90-year-old data collection and reporting services. MGMA’s annual surveys on practice costs, provider productivity, and compensation have become the industry standard across all medical specialties. The paperbound reports commanded at least six or seven inches of bookshelf each year and required at least six months to compile and print.
Now, with directly uploaded data and online, customizable reporting capabilities, users have much more timely access to large amounts of benchmarking data. MGMA has shortened its time to market for survey data by five months. The latest DataDive product line includes mobile-friendly apps that allow physicians and managers to access survey data at any time from any place with Web access. For those times when connectivity is limited, DataDive Desktop allows users to download information to their laptops or desktop computers to analyze and report on while offline.
Users can upload data with Excel spreadsheet forms, or they can enter data directly using a user-friendly interface that Todd Everson, MGMA vice president of data solutions and consulting, refers to as the “TurboTax approach.” The tabbed interface guides the user through the questionnaire while checking for errors along the way.
Plans are under way for creating a user’s online survey home where he or she can store demographic and other information and eliminate the need to reenter the data. MGMA also is in conversations with several system vendors to create interfaces to allow automatic data uploads directly from practice management and accounting software.
Fully Leveraged Data
Collecting, processing, and analyzing massive amounts of data can feel daunting, to say the least, but such efforts can yield a rich information lode for those willing to drill down for it. Ohio-based Lucid Radiology Solutions’ CIO Ron Hosenfeld enthusiastically describes his data-mining operation. Eight years ago, Hosenfeld cofounded Lucid Radiology Solutions as a spin-off from Riverside Radiology Associates, a 90-physician private radiology group serving 28 hospitals, 30 imaging centers, and several orthopedic groups across Ohio.
Hosenfeld came to Riverside with a background in mechanical engineering and a career in the Air Force working with satellite imaging and encrypted data. A friend invited him to the world of radiology where “the images are a thousand times less complex, and the clientele is a thousand times more demanding.”
The radiology group reads more than 1.6 million studies per year in settings that cover nearly all of the practice spectrum. The sheer size of the operation generates mountains of operational, financial, and clinical data. Recognizing the need to understand and respond to all of that data, Hosenfeld took the lead in writing software he refers to as “collectors.” The programs scour data from all the organization’s RIS user applications and store them in relational databases.
By looking at the PACS, billing, financial, and reporting systems, the IT department can track radiology data from all of the facilities Riverside operates or serves. The system stores key indicators, such as when a radiologist opens and closes each exam and when he or she starts dictating. The level of detail allows highly accurate insights into each client site. The practice can do reliable predictive modeling for patient volumes and physician demand and staff appropriately. Retrospectively, it can analyze staffing and measure performance.
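The collectors themselves are proprietary, but the general pattern, writing key exam timestamps into a relational database and querying them for turnaround, can be sketched as follows. The table, column names, and values are hypothetical, not Lucid’s or Riverside’s actual schema.

import sqlite3

conn = sqlite3.connect("radiology_events.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS exam_events (
        exam_id      TEXT,
        facility     TEXT,
        radiologist  TEXT,
        opened_at    TEXT,   -- when the radiologist opened the exam
        dictated_at  TEXT,   -- when dictation started
        closed_at    TEXT    -- when the exam was closed/signed
    )
""")
conn.execute(
    "INSERT INTO exam_events VALUES (?, ?, ?, ?, ?, ?)",
    ("EX123", "Community Hospital", "Dr. A",
     "2014-06-01T08:02:11", "2014-06-01T08:05:40", "2014-06-01T08:09:03"),
)
conn.commit()

# Average open-to-close turnaround in minutes, per facility.
for row in conn.execute("""
    SELECT facility,
           AVG((julianday(closed_at) - julianday(opened_at)) * 24 * 60)
    FROM exam_events GROUP BY facility
"""):
    print(row)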
Loss of Averages
“No one really cares much about average anymore,” Hosenfeld says. “We’ve done a good job lowering average turnaround time for radiology interpretation. Everyone today would rather identify and address the outliers.”
Therefore, Lucid’s RadAssist software provides a dashboard interface that presents reports and graphs showing distributions rather than averages. The user can click an outlier data point in the on-screen graph and see the pertinent data behind its position on the chart. Hosenfeld’s team has focused on ease of use and immediate results, so users can drill into the data, investigate, and quickly identify underlying issues that may contribute to substandard productivity. This lets Riverside Radiology provide clients with objective data for evaluating turnaround times, from routine to critical results reporting, and it takes much of the emotional finger-pointing out of the equation.
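RadAssist’s methodology is not published, but the general approach of summarizing a distribution and surfacing its outliers can be sketched with a conventional interquartile-range fence; the turnaround times below are made up.

from statistics import quantiles

turnaround_minutes = [12, 14, 15, 16, 18, 19, 21, 22, 24, 25, 27, 95]
q1, q2, q3 = quantiles(turnaround_minutes, n=4)   # 25th, 50th, 75th percentiles
cutoff = q3 + 1.5 * (q3 - q1)                     # conventional upper-fence rule

outliers = [t for t in turnaround_minutes if t > cutoff]  # flags the 95-minute exam
print(f"median {q2:.0f} min, upper fence {cutoff:.0f} min, outliers: {outliers}")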
Objective, scientific statistics become actionable data, information that can be used to adjust and improve any operational process. When a client hospital complained about imaging turnaround for its emergency department, the radiologists were puzzled. They hadn’t perceived any delay in their exams and reporting, and their worklists were clear. Further investigation showed that at the late-evening shift change, the hospital’s CT staff had been batching their work, introducing an artificial 15- to 20-minute delay in the flow. The bottleneck turned out to be a behavioral issue in the department, a problem easily addressed by the hospital administration, and thorough data analysis cleared the radiologists of any fault.
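The article does not detail the analysis itself, but one plausible way such a pattern surfaces is by grouping delays by hour of day and looking for a spike around the shift change; the records below are invented for illustration.

from collections import defaultdict
from statistics import mean

# (hour an exam finished on the scanner, minutes until it reached the worklist)
records = [(9, 4), (10, 5), (11, 3), (14, 6), (22, 21), (22, 18), (23, 19), (23, 17)]

by_hour = defaultdict(list)
for hour, delay in records:
    by_hour[hour].append(delay)

# Delays spike in the 22:00-23:00 window, pointing at the shift change.
for hour in sorted(by_hour):
    print(f"{hour:02d}:00  mean delay {mean(by_hour[hour]):.1f} min")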
Riverside brings a whole new level of accountability to the clients it serves. “It means you have to deliver on what you promised,” Hosenfeld notes. But it also creates a closer partnership between the radiologists and the health systems where they work. Clear, thorough, objective data changes the whole conversation between radiologists and hospitals when it comes to contracting and managing the relationships. It helps both sides avoid trying to fix the blame on someone and instead focus on fixing the problem.
Because Riverside serves so many facilities across Ohio, it has learned to leverage data even further. Lucid has developed internal benchmarking data that give physicians and administrators a good idea of what to expect in almost any situation. They can parse performance and efficiency data for comparable settings, from a rural 100-bed hospital to an urban medical center with more than 1,000 beds. These data help Riverside deliver on its philosophy of providing the same level of service based on clinical needs, regardless of the setting. So a high-priority stroke patient gets the same fast turnaround time whether he presents at a major medical center or at a 10-bed remote facility.
The large database allows Lucid to analyze exam types and develop its own relative valuation of various procedures. It turns out that when you consider clinical staff involvement beyond the basic exam time, such as communicating findings with referrers, patients, and families, the Centers for Medicare & Medicaid Services’ relative value system doesn’t always reflect reality. By Lucid’s internal estimation, chest X-rays are somewhat undervalued in the resource-based relative value scale, and CT brain scans may be overvalued. Data such as these help physicians and managers understand their business more thoroughly.
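Lucid’s valuation method is not published, but the general idea, rescaling each exam type by total staff minutes against a reference exam, can be sketched with illustrative numbers.

# All figures are invented for illustration, not Lucid's measurements or CMS values.
staff_minutes = {         # exam time plus communication with referrers, patients, families
    "chest_xray": 9.0,
    "ct_brain": 14.0,
    "mri_lumbar": 28.0,
}
reference = "chest_xray"  # arbitrary reference exam for the internal scale

internal_rvu = {exam: round(minutes / staff_minutes[reference], 2)
                for exam, minutes in staff_minutes.items()}
print(internal_rvu)  # {'chest_xray': 1.0, 'ct_brain': 1.56, 'mri_lumbar': 3.11}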
Hosenfeld and company continue to scour their IT systems for better ways to produce reports with actionable data. Such objective data, according to Hosenfeld, help foster real collaboration between radiologists and the hospitals, imaging centers, and practices they serve. Transparency and accountability at this level drive their pursuit of quality care and excellent service.
— Timothy W. Boden, CMPE, has spent 27 years in medical practice management as an administrator, consultant, journalist, and speaker. He resides in Starkville, Mississippi.