Rocky Road?
By Dave Yeager
Radiology Today
Vol. 22 No. 4 P. 22
Don’t let data migration and consolidation lead you down the wrong path.
Nobody would argue that migrating and consolidating data archives isn’t a chore. But it’s a chore that isn’t going away. If anything, mergers and acquisitions, not to mention aging systems, are increasing the likelihood that a data migration or consolidation is looming in your future. It’s a challenge that many may be unprepared for.
“For the most part, health providers have been a bit naïve in terms of what’s involved in consolidation,” says Don Dennison, a health care IT consultant who specializes in enterprise imaging and radiology IT. “They think of it as more of a data migration problem: ‘I’m going to take this petabyte and that petabyte and put them into a bucket of two petabytes with some software on top.’ But that’s not the hard part. What requires more decision making and effort is consolidating all of the information indices and then standardizing all of that information. Because if the data aren’t consistent and able to be identified, queried, and accessed—accurately and reliably—then it loses a lot of its functional value.”
Maintaining functional value is a key aspect of any consolidation. To do it effectively, organizations need a plan. This holds especially true when data beyond traditional medical images are considered. With multiple departments putting smartphone pictures and JPEGs in vendor-neutral archives (VNAs) and EHRs, data migration becomes much more complicated.
“The data are already being created all over the place,” says Mike Gray, principal of Gray Consulting. “The minute you stop thinking about an archive problem as being just DICOM images and realize that it’s literally everything, you realize that this is a really big problem.”
Fortunately, there are strategies that can make the process, if not painless, at least less unpleasant.
Focus on Hygiene
Data hygiene, as it’s commonly called, isn’t about microbes, but most people consider it dirty work. Charles Barkey, assistant vice president for IT at West Virginia University (WVU) Medicine in Morgantown, West Virginia, has worked on 14 data migrations since 2018. One strategy that he highly recommends is cleaning data before they are put into a PACS. To avoid having files dumped into WVU’s PACS, the organization built a separate repository for outside studies that don’t have accession numbers.
“If there is anything that I would advise organizations to do—and this isn’t so much for data migration as much as general PACS process—keep your data as clean as possible and don’t just let data get dumped into your PACS,” Barkey says. “We currently have a 16-hospital PACS system that is 100% clean.”
Additionally, Barkey says vendor validation metrics may not match the facility’s, making it crucial for the facility to ensure that data are being migrated correctly. For example, in a study with 200 images, 199 may be migrated, and the vendor will mark that as a successful migration.
“To me, that’s not a fully successful migration,” Barkey says. “Then we need to go back and validate why that one study was corrupt and whether it can be migrated.” He adds that vendors may consider a 3% failure rate to be acceptable, but when millions of studies are being migrated, the numbers add up. “That’s a pretty big number,” Barkey says. “I would think that would scare anyone in the clinical field.”
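As a concrete illustration of the kind of reconciliation Barkey describes, a short Python script might compare per-study image counts exported from the source and destination systems. The file names and column layout below are hypothetical; the point is that a study with 199 of 200 images gets flagged as partial rather than counted as a success.

```python
import csv

def load_counts(path):
    """Map accession number -> image count from a CSV export.
    Expected columns (hypothetical): accession, image_count."""
    counts = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            counts[row["accession"]] = int(row["image_count"])
    return counts

source = load_counts("source_manifest.csv")
dest = load_counts("destination_manifest.csv")

# A study only counts as migrated if every image arrived.
missing = [acc for acc in source if acc not in dest]
partial = [acc for acc in source
           if acc in dest and dest[acc] < source[acc]]

total = len(source)
failed = len(missing) + len(partial)
print(f"{total} studies: {len(missing)} missing, {len(partial)} partial")
print(f"failure rate: {failed / total:.2%}")
```

Run against millions of studies, even a small percentage here translates into tens of thousands of exams that need investigation, which is Barkey’s point about a 3% failure rate.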
Spot checks are another useful strategy. Barkey says identifying migration issues as early as possible saves time in the long run.
“What we’ve learned more than anything is to really review the extract sample data and then continue to review their migration with spot checks, instead of waiting for the final report and looking at analysis of positives that are completed,” he says.
As more archives are consolidated, the problems grow. Organizations often attempt to consolidate data by putting complex rules in place to find studies that have been migrated from multiple repositories. Data quality varies between health systems, however, and this approach may not yield the best results.
“It’s like 10 people dumping a glass of water into a pitcher, and now everyone’s going to drink from it,” Dennison says. “And they’re all trusting that the water in the town’s well is clean.”
Dennison says normalizing data before they go into an archive can save time and aggravation. He notes that this has become easier in recent years because many health systems are adopting Epic EHRs and, often, using Epic’s standard list of orderable procedures. Overwriting a study’s description and other record-level metadata before they go into the archive makes it much easier to create longitudinal patient records and identify similar exam types. Although this requires more work on the front end, it makes the data much more user friendly. In addition, checking data for consistency on a regular basis helps ensure that they remain that way.
“Often, people set up the HL7 interface at the beginning, but they don’t run yearly consistency checks,” Dennison says. “They often don’t have anything that’s looking at the exam header for common problems like missing values, invalid characters, truncated fields, or things like that. Understand the full scope of what’s there. Don’t miss the opportunity to take a look and to try to define and improve quality for those records.”
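As a rough sketch of the yearly consistency check Dennison recommends, the following Python uses the pydicom library to scan exam headers for the common problems he lists: missing values, invalid characters, and truncated fields. The required-field list and the truncation heuristic are illustrative assumptions, not a standard.

```python
from pathlib import Path
import re
import pydicom

REQUIRED = ["PatientID", "AccessionNumber", "StudyDescription", "StudyDate"]
# Control characters or backslashes in a text field usually signal a bad mapping.
INVALID = re.compile(r"[\x00-\x1f\\]")

def audit(dicom_dir):
    problems = []
    for path in Path(dicom_dir).rglob("*.dcm"):
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        for tag in REQUIRED:
            value = str(getattr(ds, tag, "") or "")
            if not value:
                problems.append((path.name, tag, "missing value"))
            elif INVALID.search(value):
                problems.append((path.name, tag, "invalid characters"))
            elif len(value) >= 64:  # LO fields cap at 64 chars; max length often means truncation
                problems.append((path.name, tag, "possibly truncated"))
    return problems

for name, tag, issue in audit("archive_sample"):
    print(f"{name}: {tag}: {issue}")
```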
Talk to the Manager
Another factor that can affect the success of a consolidation is the vendor. Gray points out that some vendors are more helpful than others, and their help is often required.
“There are potential migration problems, due to the nature of some PACS systems storing the original image data and the work products that are created during the interpretation process,” Gray says. “If those are stored in a proprietary format, you could have trouble migrating them. And, if you can’t migrate them, you’re stuck. You’ve moved the original DICOM data, but you need help from the vendor to migrate the presentation state (the window/level setting showing the relevant pathology and relevant mark-ups/overlays) and the key image notes (key images selected by the radiologist).”
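Before a migration, it can help to inventory which interpretation work products exist in standard DICOM form at all. A minimal pydicom sketch along these lines might count presentation states and key object selection documents by their SOP Class UIDs; work products that a legacy PACS keeps only in a proprietary database would never show up in such a scan, which is exactly the gap Gray warns about.

```python
from pathlib import Path
from collections import Counter
import pydicom

# Standard SOP Class UIDs for interpretation work products.
WORK_PRODUCTS = {
    "1.2.840.10008.5.1.4.1.1.11.1": "Grayscale Softcopy Presentation State",
    "1.2.840.10008.5.1.4.1.1.88.59": "Key Object Selection Document",
}

def inventory(dicom_dir):
    found = Counter()
    for path in Path(dicom_dir).rglob("*.dcm"):
        ds = pydicom.dcmread(path, stop_before_pixels=True)
        label = WORK_PRODUCTS.get(str(getattr(ds, "SOPClassUID", "")))
        if label:
            found[label] += 1
    return found

for label, count in inventory("legacy_export").items():
    print(f"{count:6d}  {label}")
```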
Elyse Thomas, IT lead for radiology and cardiology at St. Luke’s University Health System in Pennsylvania, has been involved with multiple consolidations over the past decade. She says vendor engagement is one of the most important aspects of the process. One trait she recommends highly is the ability to make changes as circumstances dictate.
“Be aware that no conversion is perfect. It’s really unheard of in our area that we work in,” Thomas says. “No vendor is 100% perfect, either, but you want to align with a vendor who is willing to help you clean up, if there’s a mess left behind. Choose somebody who’s willing to be a partner with you.”
Consolidating data from aging technologies can also present many challenges. Gray says if organizations are incorporating more “ologies” into their archives, and many are, they need to consider how that will affect users. Do they need to buy another specialty PACS for a specific department? If so, they’ll want to integrate it with their VNA rather than using the archive that comes with the PACS. They also need to consider how they’ll handle non-DICOM files.
Gray says many organizations have been putting non-DICOM images, such as those from dermatology, endoscopy, and smartphones, into their EHRs. This is problematic because many of those images are in proprietary formats. Gray sees this as building the data migration problem of the future, one that is only growing.
“An IT organization should really be thinking about what to do with their VNA after they’ve taken care of their DICOM files,” Gray says. “After DICOM, now what? That is really sticky right now, and that decision really needs to be thought through. When you bought your VNA four years ago, did you think to ask whether it was going to be capable of dealing with non-DICOM files and, if so, how?”
Gray recommends choosing a VNA that uses Integrating the Healthcare Enterprise’s Cross-Enterprise Document Sharing (XDS) profile.
Make Work Flow
Streamlined workflow is crucial for radiology, but workflow issues may not be readily apparent until an organization looks under the hood and figures out what roles the legacy technology played. Some organizations rely on departmental PACS sitting over a shared archive as the workflow intelligence that routes studies; others put universal viewers on top of a VNA that drives workflow. When doing a consolidation, Dennison says, it’s crucial to understand each site’s setup beforehand.
“When you do a consolidation, you have to understand what each of the buckets that you’re getting rid of were doing,” Dennison says. “When people start looking at it closely, they begin to realize how much responsibility they’ve put on specific pieces to do things or enable things or route certain data, such as routing specific data to a night read radiology group. All of that has to be reproduced in the new system.”
The more separated and modular the systems, the more important it is to determine system responsibilities, Dennison says. Not all PACS have the same capabilities, and some don’t do as well at prefetching and routing relevant studies. When routing data among systems creates multiple copies, there needs to be a mechanism to sync all of the systems with caches and databases that are talking to the archive.
“You really have to figure out what is responsible for what and what your triggering events are,” Dennison says. “Is it the receipt of an order? Do you do anything when you receive an HL7 [Admit, Discharge, Transfer] update? Are you going to try to use any information related to upcoming clinical appointments to do anything with the data? You have to go back, and the things you’ve got working in one case, you need to make sure they’re working in all cases.”
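One way to make those responsibilities explicit is to centralize them in a single rules layer. The sketch below is a deliberately simplified illustration of that idea, a table of trigger/condition/action rules of the sort Dennison describes; the event names, night-read cutoff, and actions are all hypothetical.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    trigger: str                       # e.g., "ORM" (order) or "ADT" (admit/discharge/transfer)
    condition: Callable[[dict], bool]  # does this event qualify?
    action: Callable[[dict], None]     # what the system should do

# Hypothetical actions; in practice these would call the archive's APIs.
def prefetch_priors(msg):
    print(f"prefetching priors for patient {msg['patient_id']}")

def route_to_night_reads(msg):
    print(f"routing study {msg['accession']} to night-read group")

RULES = [
    Rule("ORM", lambda m: True, prefetch_priors),
    Rule("ORM", lambda m: m.get("hour", 12) >= 22, route_to_night_reads),
]

def dispatch(msg):
    """Run every rule whose trigger and condition match the incoming event."""
    for rule in RULES:
        if rule.trigger == msg["type"] and rule.condition(msg):
            rule.action(msg)

dispatch({"type": "ORM", "patient_id": "12345", "accession": "A-1", "hour": 23})
```

Keeping all triggers in one place makes it far easier to confirm, site by site, that behavior working in one case works in all cases.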
The goal is data completeness and consistency. Dennison says there should be a single system that applies rules to determine whether data are valid and complete. There also needs to be consistency in identifiers and naming conventions for studies so the system can find relevant information.
Images are often compressed to save space. This can be done in DICOM standard formats, such as JPEG 2000 and lossless JPEG, or some systems may compress imaging data into proprietary formats. When consolidating multiple legacy systems into a new system, the new system may need to use a single compression format. Too often, Dennison says, people don’t account for the time and technical risks involved with decompression and recompression.
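A first step toward managing that risk is simply knowing which compression formats are in the legacy data. A short pydicom sketch can tally transfer syntaxes across an export and flag anything outside the standard set as a candidate for vendor-assisted decompression; the list of standard syntaxes below is abbreviated for illustration.

```python
from pathlib import Path
from collections import Counter
import pydicom

# A few common standard DICOM transfer syntaxes (abbreviated list).
STANDARD = {
    "1.2.840.10008.1.2": "Implicit VR Little Endian",
    "1.2.840.10008.1.2.1": "Explicit VR Little Endian",
    "1.2.840.10008.1.2.4.70": "JPEG Lossless",
    "1.2.840.10008.1.2.4.90": "JPEG 2000 Lossless",
}

syntaxes = Counter()
for path in Path("legacy_export").rglob("*.dcm"):
    ds = pydicom.dcmread(path, stop_before_pixels=True)
    uid = str(ds.file_meta.TransferSyntaxUID)
    syntaxes[STANDARD.get(uid, f"non-standard/private: {uid}")] += 1

for name, count in syntaxes.most_common():
    print(f"{count:8d}  {name}")
```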
Ultimately, streamlined workflow helps the people who influence clinical care. Thomas says making sure radiologists aren’t affected by a conversion is a top priority.
“Another challenge is getting the reports that are associated with the images,” she says. “Is there an HL7 feed? Are the reports available? Can you attach the reports to those images to make it easier for the radiologists to have everything in one place?”
There is also some debate about how long medical images need to be saved in an archive. Aside from the cost of storage, which seems to be much less of a concern than it once was, there is a question of liability. States have different patient record retention rules, and there is no clear consensus about whether it’s preferable to strictly follow the letter of the law regarding image retention or whether it’s best to save everything, regardless of requirements.
“Still, in our radiology business today, there are not a lot of people who are deleting images from the digital archive,” Thomas says. “Back in the day when you had thousands of films, people wanted to get rid of them because they took up a lot of space. Now, in the digital world, it’s a little bit different.”
Future-Proof Your Archive
One of the core goals of a successful business is to be ready for the next thing so the organization can adjust accordingly. For example, if your organization is considering a new VNA, built-in migration software can save significant time and money, Gray says.
“If you’re trying to migrate from one VNA to another VNA and you don’t have the software built in that allows you to do it yourself, you’re now paying a second time to migrate all of the data you paid to move three years ago into the new VNA,” he notes.
Another looming challenge is AI. As more algorithms come into use—and especially as they qualify for reimbursement from the Centers for Medicare & Medicaid Services—there will be a growing need for data to feed those algorithms. Gray says the data will need to be consolidated in one place, making the VNA the logical place to store them, and the data transfers will also need to be automated.
“As far as sending data manually to an algorithm, we know from 30 years of experience that the more you make the radiologist do, the less he or she is going to do it,” Gray says. Yet automating the transfers will require significant effort.
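One plausible shape for that automation is a DICOMweb push: when a qualifying study lands in the archive, forward its instances to the algorithm’s STOW-RS endpoint. The endpoint URL below is hypothetical, and a real deployment would also need authentication, retries, and result handling; this is only a sketch of the transfer itself.

```python
import requests

# Hypothetical algorithm endpoint; real systems need auth and retry logic.
STOW_URL = "https://ai-vendor.example.com/dicomweb/studies"

def forward_study(instance_files):
    """Send DICOM instances to an algorithm's STOW-RS endpoint
    as a multipart/related POST, per the DICOMweb standard."""
    boundary = "DICOM_BOUNDARY"
    parts = []
    for path in instance_files:
        with open(path, "rb") as f:
            body = f.read()
        parts.append(
            f"--{boundary}\r\nContent-Type: application/dicom\r\n\r\n".encode()
            + body + b"\r\n"
        )
    payload = b"".join(parts) + f"--{boundary}--".encode()
    headers = {"Content-Type":
               f'multipart/related; type="application/dicom"; boundary={boundary}'}
    resp = requests.post(STOW_URL, data=payload, headers=headers, timeout=60)
    resp.raise_for_status()

forward_study(["ct_image_001.dcm"])
```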
In addition, health care system mergers are increasing, and organizations will need to be prepared. There is also a significant issue with aging technology, and many organizations will need to modernize.
“I don’t see any of the market forces that would say consolidation will slow down,” Dennison says. “And, quite frankly, there are a lot of health systems that I talk to with very old PACS. They have PACS from vendors who have been bought up twice, and they’re running a version that’s five years old. There’s a lot of equipment that’s not going to survive another five years that you’ve got to do something with, and many are site-level archives.”
For all of these reasons, the key to preventing migration and consolidation headaches is to prepare in advance.
“If you don’t have a plan,” Gray says, “any road will get you there.”
— Dave Yeager is the editor of Radiology Today.