Coronary CT angiography (CCTA) made a splash in the mid-2000s with cardiologists and radiologists when it was shown to effectively detect coronary stenosis. Now some advocates are questioning whether the training guidelines are sufficiently rigorous to ensure proficiency. They fear that possible variability in interpretation of exams may diminish CCTA’s potential as a powerful clinical tool.
The multicenter ACCURACY clinical trial helped ease concerns that the accuracy of 64-multidetector row CCTA could be achieved only in academic settings by enrolling 83 percent of its patients from nonacademic sites (J Am Coll Cardiol 2008;52:1724-1732). The study showed that CCTA was a highly accurate diagnostic tool for detecting stenosis at the 50 percent and 70 percent thresholds in patients with chest pain and no known coronary artery disease, and that it effectively ruled out obstructive coronary artery stenosis. The authors noted that “with adequate training, any imaging center can perform CCTA procedures with high quality” and that accuracy was achieved with a variety of readers.
But centers and operators involved in trials tend to be a special breed, notes John R. Lesser, MD, director of cardiovascular CT and MRI at the Minneapolis Heart Institute Foundation and president of the Society of Cardiovascular Computed Tomography (SCCT). Physicians often are recruited because they are meticulous; centers often handle high volumes; and the trials themselves are designed to control against the vagaries of real-world settings.
“Physicians need to be trained well to read [a CT exam] properly,” Lesser says. Good training means not just teaching radiologists and cardiologists how to interpret a scan but also teaching them to understand its clinical context and then effectively communicate that information to the referring physician or emergency room personnel. “Oftentimes it is not completely straightforward.”
And then there is the scenario of a good technology in less competent hands. If trained radiologists and cardiologists use CCTA infrequently, then they and their patients may not reap its full benefits. The problem may be exacerbated in cases where physicians met only minimal standards during their training.
In late 2012, Lesser floated a question to his peers in a letter (J Cardiovasc Comput Tomogr 2012;6:434-435). Was it time to revisit the CCTA training standards developed in the mid-2000s, at a time when the technology was still in its infancy and the medical community and public both were infatuated with its possibilities? Anecdotally, radiologists and cardiologists at the vanguard of CCTA were now detecting variability in the quality of both the scans and their interpretation in clinical practice.
Lesser threw down the gauntlet: Perhaps current training standards were not rigorous enough, resulting in variable competencies. “The technique has great potential but if you don’t do it properly, it will not live up to anywhere near that potential,” he says. “It will look like it is not any good.”
U. Joseph Schoepf, MD, director of cardiovascular imaging at the Medical University of South Carolina in Charleston, and a co-author of the American College of Radiology’s (ACR) practice guidelines, shares this concern. If too many practitioners apply CCTA poorly, it will diminish the gains that would manifest as improved outcomes. “If there is no improvement in patient outcomes with the addition of cardiac CT to our diagnostic and therapeutic algorithms, people will wonder if it is worthwhile to approve any kind of reimbursement for this particular test,” he says.
[Table: training methods reported by survey respondents. Responses add up to more than 451 because survey respondents may have received more than one method of training. ACC: American College of Cardiology; ACR: American College of Radiology; CT: computed tomography. Source: JACC Cardiovasc Imaging 2010;3(9):976-980]
Schoepf describes CCTA as a poster child for progress in medical imaging. As a noninvasive alternative to conventional angiography, it appealed to providers and patients alike for assessing coronary arteries and evaluating cardiac function. The National Heart, Lung, and Blood Institute, for instance, describes it to patients as “a painless test that uses an x-ray machine to take clear, detailed pictures of the heart.” The advent of the 64-slice cardiac CT, with its improved performance and reduced effective dose, attracted even more attention and proponents.
But the credentialing criteria for interpreting CCTA set the bar too low, according to Schoepf. “It is too simple to become ‘certified’ to interpret cardiac CT,” he argues. “Yet there are people who feel entitled to interpret such studies.”
The American College of Cardiology Foundation (ACCF), the American Heart Association (AHA) and the American College of Physicians (ACP) recommend three levels of training for a physician to achieve competence in CCTA (J Am Coll Cardiol 2005;46:383-402). Level one, the minimum, calls for four weeks of training with mentored readings of at least 50 angiographic cases. Level two ups the ante to eight weeks of training, with at least 50 mentored examinations performed and a minimum of 150 angiographic cases. Level three requires six months of training, with 100 mentored exams performed and a minimum of 300 angiographic cases.
The recommendation’s task force stated that only levels two and three suffice for independent performance and interpretation of scans. The authors added that training did not guarantee competence or expertise, and emphasized that physicians were responsible for augmenting their skills as needed to build and maintain their expertise.
The ACR requires physicians who already are qualified in general and thoracic CT to undergo training that includes the equivalent of 30 continuing medical education (CME) hours in cardiac anatomy, physiology, pathology and CCTA interpretation, as well as review of 50 CCTA exams in the past three years (J Am Coll Radiol 2006;3:677-685). Another option is documented supervised work in a center that performs CCTA, plus interpretation, reporting or supervised review of 50 or more CCTA studies.
“When the technique first started, it was not known what was required,” Lesser says. The societies’ guidelines were a good-faith effort to ensure minimum standards, but with time some physicians have seen the need to review and fine-tune the standards. “It is time to close any loopholes and make sure we are doing right for people.”
Duration & experience
Lesser and Schoepf acknowledge that evidence supporting the theory of variability in CCTA scan quality and interpretation, while expanding, is still more of a puddle than a pool. What research exists on training effectiveness typically involves small numbers of participants and may have limited generalizability.
To test the adequacy of training criteria in the ACCF/AHA/ACP guidelines, one study tracked three radiologists and a cardiologist with no experience in CCTA through a one-year fellowship (Radiology 2009;251:359-368). Each fellow independently read 50 CCTA test cases at time periods that matched training durations in the guidelines: baseline, four weeks (level one), eight weeks (level two), six months (level three) and one year. An experienced radiologist and cardiologist also scored the same cases for a consensus reading using the same workstation.
All four readers showed improved sensitivity, specificity and diagnostic odds ratios between baseline and one year, with ranges at baseline from 33 to 72 percent, 70 to 94 percent, and 3.8 to 8.1 respectively, and at one year from 66 to 75 percent, 87 to 92 percent and 14.7 to 25.8. Evaluation times decreased from a mean of 20 minutes at baseline to a mean of 13 minutes at one year.
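For readers unfamiliar with the diagnostic odds ratio, it combines a test reader's sensitivity and specificity into a single figure: the odds of a positive reading in diseased patients divided by the odds of a positive reading in disease-free patients. A minimal sketch of the calculation (the input values here are illustrative round numbers, not the per-reader data from the study):

```python
def diagnostic_odds_ratio(sensitivity: float, specificity: float) -> float:
    """Odds of a positive result among the diseased (TP/FN) divided by
    odds of a positive result among the disease-free (FP/TN)."""
    positive_odds_diseased = sensitivity / (1 - sensitivity)   # TP/FN
    positive_odds_healthy = (1 - specificity) / specificity    # FP/TN
    return positive_odds_diseased / positive_odds_healthy

# Illustrative only: a reader at 75 percent sensitivity and
# 92 percent specificity
print(round(diagnostic_odds_ratio(0.75, 0.92), 1))  # 34.5
```

A value of 1 means the test discriminates no better than chance, so the rise in the fellows' ratios from single digits at baseline to the mid-teens and twenties at one year reflects substantially sharper discrimination.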
But the fellows’ improved performance was not a steady upward trajectory. Each had at least one drop in performance in the midterm of their training, with two slipping between four weeks and eight weeks and two between eight weeks and six months. The authors suggested the drops at the intermediate stage could be due to the readers’ limited experience or to overconfidence. They also noted that, while increasing experience improved performance, it may take more than one year to acquire sufficient expertise.
Reflecting on those results, Matthew J. Budoff, MD, lead author of the ACCF/AHA/ACP guidelines, noted that the fellows study showed similar training does not necessarily lead to identical performance, especially after initial training, and even experts continue to improve with experience (J Cardiovasc Comput Tomogr 2010;4:186-194).
While crafting the guidelines, Budoff and the other experts concluded that radiologists had a better grasp on the CT scanner and acquisition parameters while cardiologists better understood coronary anatomy and physiology. “They each have an equal amount to learn by training for cardiac CT,” says Budoff, director of cardiac CT at the Los Angeles Biomedical Research Institute at Harbor-University of California Los Angeles Medical Center in Torrance. “They all needed equal exposure but for different reasons.”
Schoepf, who is chair of the ACR Committee on Certification of Advanced Proficiency in Cardiac CT, and colleagues designed a study to evaluate the roles of experience and background on proficiency and improvement. Nine readers whose experience with CCTA ranged from novice to expert were recruited to first read 50 cases to assess their proficiency. The cases were reviewed and reinterpreted by an expert reader with unblinded catheter results, and the nine physicians then interpreted 50 new cases, which were graded to assess the degree of improvement.
All readers improved, but only the most experienced readers performed at the highest level. The readers with a disease-specific reading background but little or no experience with CCTA performed at the lowest levels. The researchers concluded that experience is the strongest determinant of CCTA proficiency and that reading 50 cases improves proficiency, but not to a clinically satisfactory level.
Is it feasible to raise the bar and require a longer time commitment for CCTA training? Training facilities may not have the capacity, cautions Budoff. The standards “are not optimal but the reason is that cardiac CT is not as readily available in every training program yet, either in radiology or cardiology,” he says. “We have to have an intermediate step until cardiac CT becomes more widely available in the teaching environment.”
Bodies such as Residency Review Committees that oversee accreditation and ensure compliance with standards in fellowship programs need to enforce the guideline standards. “If they forced cardiac fellowships to have cardiac CT, all cardiac fellowships would [include] cardiac CT today,” Budoff reasons.
To kick-start dialogue on the issue, Lesser assembled an ad hoc group of radiologists and cardiologists representing academics, private practices and fellows in late 2012 to review existing recommendations and discuss whether the criteria need to be revised. The group expects to present its recommendations in July at the SCCT’s 2013 scientific meeting in Montreal.
Lesser, Schoepf and Budoff agree that CCTA will prove to be an effective clinical tool and a powerful helpmate in the future. The current scrutiny and reassessment of CCTA are part of a natural evolution in medicine.
“You don’t want every new flash in the pan turned into an application that everybody does and offers to their patients,” Schoepf says. “You want a reasonable amount of testing of what works and to what extent it works. Cardiac CT is undergoing that vetting process right now.”