A new meta-analysis has confirmed the effectiveness of technology-enhanced medical simulation in clinical training—and the study’s lead author, David A. Cook, MD, a medical-education specialist at Mayo Clinic in Rochester, Minn., suggested in an interview that the time has come to stop proving the obvious. The analysis was published Sept. 7 in the Journal of the American Medical Association.
The authors searched the literature and found that clinicians and medical students trained with simulation devices showed markedly better knowledge, skills and behaviors at the bedside than peers who received no such training.
Cook led the review and meta-analysis to quantitatively summarize the outcomes of studies of technology-enhanced simulation training for “health professions learners in comparison with no intervention,” as the JAMA article put it. Cook and colleagues identified 609 eligible studies enrolling more than 35,000 physicians, medical students, nurses, dentists and other healthcare professionals. Of these, 137 were randomized, 67 were non-randomized studies with two or more groups and 405 used a single-group, pretest-posttest design.
“You would think it’s self-evident that simulation would improve skills and behaviors,” Cook told Healthcare Technology Management. “Practice makes perfect; why would you even need to study the question? So the [proliferation] of studies looking at this question sort of makes you wonder whether people are either asking self-evident questions or just looking to see what other people have done. I can’t answer that question; it would just be speculation. But it does seem a bit surprising to me that it would take 609 research studies to answer this question.
“It’s time for the field to move on to other questions. Rather than asking whether or not this type of training works, we need to start looking at what makes it work and how can we make it work better.”
The authors defined technology broadly, examining studies not only of computerized mannequins and other high-tech practice gadgets but also of plastic models, animal parts and human cadavers.
In fact, more than a few resourceful instructors used low-tech approaches to get the best bang for their buck, Cook noted. “We were sometimes astounded by the creativity,” he said. “Some investigators came up with very inexpensive simulators that emulated live patients with quite a bit of sophistication.” He cited as examples papaya fruits used to emulate a woman’s uterus for routine gynecologic care and an olive embedded in a chicken breast to train ultrasound-guided cyst aspiration. “A papaya is going to be a lot cheaper than some of the virtual-reality tools that are available. One thing that leapt out at us is that technology is just a way to extend what we could normally do. It doesn’t necessarily need to be expensive.”
According to Advanced Initiatives in Medical Simulation (AIMS), a trade group, the U.S. market for medical-simulation devices is small—approximately $105 million a year. A fact sheet downloadable from the group’s website said about two-thirds of this is for mannequins; the remaining third is for “everything else.”
The fact sheet also mentioned a 2005 survey that found 86 percent of simulation-center directors dissatisfied with the equipment and technology they were using.
The AIMS website gives an example of medical simulation reducing errors in hospitals: “Rhode Island Hospital and its Hasbro Children’s Hospital have been funded participants in a U.S. Department of Defense project, MedTeams, to transfer lessons learned in army aviation to medical teams in the Andrew F. Anderson Emergency Center. This multicenter military and civilian project demonstrated the benefits of implementing a teamwork training curriculum in emergency medicine: Clinical error rate decreased from 30.9 percent to 4.4 percent in the experimental group; ED staff attitudes toward teamwork improved; staff assessments of institutional support showed a significant increase.”
Cook’s JAMA study found a weaker correlation between technology-enhanced simulation training and improved patient outcomes. This may be due in part to the difficulty of accurately tracing such connections, Cook allowed.
“There were some weaknesses in that [patient outcomes] part of the study design, because there could have been other things that happened in the environment that led