Workstation Vendors Respond to Lower Dose CT Scanning
“If the reconstruction software is capable of processing low-dose scans, then high-dose scans are not required, so the software itself has a direct impact on reduction of patient dose,” says Robert Taylor, PhD, president and CEO of workstation developer TeraRecon.
Most workstation vendors are now seeking to address the need to reconstruct images from lower dose scans, which are inherently noisier. “Even with more difficult data sets, CT workstations give you several tools to help your view,” says Tony DeFrance, MD, medical director of CVCTA Education Center in San Francisco. “With noisier images, we can use a thicker slice [reconstruction], or we can use the multiplanar imaging feature so we get multiple angles if the study has more noise from a lower radiation dose. There’s quite a bit of flexibility to balance those low-dose studies.”
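The noise-reduction tradeoff DeFrance describes can be illustrated with a minimal sketch (a hypothetical example, not any vendor's actual reconstruction pipeline): averaging n adjacent thin slices into one thicker slice reduces uncorrelated image noise by roughly the square root of n, at the cost of through-plane resolution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stack of 8 thin slices: a uniform "tissue" value of 100 HU
# plus uncorrelated Gaussian noise (sigma = 20 HU), standing in for a
# noisy low-dose acquisition.
thin_slices = 100 + rng.normal(0, 20, size=(8, 64, 64))

# Thicker-slice reconstruction: average groups of 4 adjacent thin slices
# into 2 thick slices.
thick_slices = thin_slices.reshape(2, 4, 64, 64).mean(axis=1)

# Uncorrelated noise averages down by about sqrt(4) = 2.
print(round(float(thin_slices.std()), 1))   # noise of thin slices, ~20 HU
print(round(float(thick_slices.std()), 1))  # noise of thick slices, ~10 HU
```

The same averaging principle is why a reader can trade spatial detail for a smoother image when a study was acquired at a lower dose.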
CT technology and postprocessing workstations have advanced rapidly over the first part of the 21st century, and the results have been impressive, according to Rebecca Schwartz, MD, interim associate chief of radiology at St. Elizabeth’s Medical Center in Boston. “These advanced software programs have revolutionized our ability to safely take care of our patients,” says Schwartz. “They have allowed us to inject minimal amounts of contrast, use the least possible dose of radiation to be effective and produce imaging studies that can often replace multiple other tests, including invasive tests that used to require day-long or overnight hospital stays just to make a diagnosis that can now be made much more safely.”
Simple yet sophisticated
“The difficult, laborious postprocessing of the past has been simplified to a remarkable degree,” says William Guy Weigold, MD, director of the cardiac CT program at Washington Medical Center, Washington, D.C. “In the past, I would have to pull up a data set and I’d see a 3D picture of someone’s entire chest,” says Weigold. He would then rotate that image around in space and electronically carve away the patient’s chest, ribs, lungs and blood vessels until he actually got down to the heart and could manually track out all of the coronary vessels. Today, the workstation software performs all that preliminary postprocessing in the background as the system boots up.
Four or five years ago, the steps needed to make reconstructions, which were “significantly inferior” to what Schwartz finds acceptable today, were so numerous that they discouraged widespread adoption of the technology. “So, typically, there might have been one technologist who knew how to use the workstation, and the postprocessing would only get done when that tech could find the time to do it,” she says.
Schwartz also points out that advances in workstation technology have an impact on budgets. A few years ago, reconstructions could take longer than an hour and were performed by radiologists or cardiologists. “Not only was it an hour of time—it was an hour of very expensive time,” she says. “Now it takes 15 to 30 minutes of relatively inexpensive technologist time to make very high quality reconstructions.”
While workstations uniformly provide clinicians with an opportunity to improve imaging, workflow and patient care, there’s still room for improvement, says Geoffrey Rubin, MD, founder of the annual “Stanford Workstation Face-off” and medical director of the 3D laboratory at Stanford University in California. He says that as vendors have attempted to meet the technology demands of their customers, they’ve modified existing software applications, rather than start from scratch. “I can tell pretty quickly if I’m using an application that dates back to the 1990s,” says Rubin, which usually means that the workstation’s user interface is less than optimal.
Overall, however, advances in CT postprocessing hardware and software have made cardiac CT imaging much more user-friendly than in the past. No doubt, future refinements will improve postprocessing even further.