Editor’s Note: This article by Caitlin Wilson was recently published in Radiology Business Journal and is reprinted here with permission.
Peer review is a method physicians and health researchers use to hold their fields' work accountable, including in diagnostic radiology. But most interventional radiology practices lack a similarly standardized process for verifying work among radiologists.
In that absence, the interventional radiology department at UMass Memorial Medical Center was looking for a way to replicate its existing monthly morbidity and mortality meetings on a more frequent and formal basis.
“During those discussions, the section chief of interventional radiology [Stephen Wicky Van Doyer, MD] said ‘Maybe we could benefit from more formal type review, a more systematic review of our procedures. Maybe getting more of them in and also having them randomly chosen.’ That was the initial impetus behind it,” said UMass Interventional Radiology Vice Chair of Quality Steven Baccei, MD.
Baccei and others from the department started creating a program to allow for intradepartmental reviews of each other’s cases. The program, known as Conserus, was modeled on the RADPEER system from the American College of Radiology (ACR) and then written to work within the existing radiological review system at UMass.
Every week, vascular interventional and neurointerventional radiologists are assigned three colleagues’ cases to review. They answer a three-step questionnaire, rating their agreement with the findings, techniques, and reporting of each case as “yes, completely,” “yes, partially,” “no,” “unsure, not enough information,” or “not applicable.”
Any case that doesn’t receive a “yes, completely” agreement across the board is sent to an Interventional Radiology Quality Committee module, which ranks its agreement or disagreement with the peer review.
Baccei and his colleagues published a paper in the Journal of the American College of Radiology in July, outlining their review process’s goals and its preliminary results.
“A successful peer review system must mitigate bias, reassure the reviewer and the reviewed physician that the process is not punitive, and encourage a collegial atmosphere that mutually benefits all parties involved,” the authors wrote.
According to the paper’s overview of the program’s pilot period, 126 cases were reviewed between September 2015 and January 2016. Of those, 34 received reviews that included “yes, partially” or “no” answers, particularly with regard to documenting the procedures.
Baccei said the questions were revised and more people became involved during the test period as the team worked to fit the interventional radiology questionnaire into the existing radiology review software framework.
“We did phase it in over time. … We do intend on moving this into the entire department for all procedures [eventually],” he said.
They also had to adjust how the system assigned cases for review: some unusual cases weren’t being assigned (and therefore weren’t reviewed) because the computer’s randomizer favored high-volume procedures such as PICC line placements.
And more updates could be coming. The ACR recommends that the diagnostic RADPEER scoring process include two sub-answers for every review response other than “yes, completely”: “likely to be clinically significant” and “unlikely to be clinically significant.” Baccei said they might add these to their own review process.
Baccei said the process has been “positive” and has revealed previously unseen trends within the department, opening the door for possible improvements such as less variability among case reports and discussions about imaging quality.
“It has really fostered a review of existing policies, and people as a result have been more aware of them formally. And we’ve reviewed some of the policies related to interventional radiology as a result of this,” Baccei said.
The ultimate beneficiaries of this new practice are patients at UMass, according to Baccei. Having better-informed and better-prepared radiologists means their procedures will be more successful, he said.
“Having proceduralists talking about these things, reviewing cases systematically… ultimately what we’re hoping for is that patients will benefit and their procedures will go [more smoothly],” Baccei said.
If you are looking for a solution, we hope you check out Conserus Workflow Intelligence™, McKesson’s workflow rules engine platform that was designed with diagnostic imaging in mind.