
August 20, 2019


In a rite of passage repeated with every new medical school class, students begin to learn the language of medicine by making their first cuts into cadavers, exposing the intricacies of the human body. 

But as essential as cadaver dissection is for learning anatomy, relying solely on that process has its limitations.

“When students do their cadaver dissections,” says Michael Schwartz, associate professor of neuroscience and associate dean for curriculum at Yale School of Medicine, “they are learning on a model that has changed significantly from its living state.” Working with a cadaver, he says, is essential to sensory-motor learning — the feel of organs and body structures. But the spatial relationships and organization within the body change dramatically after death. The fixatives used to preserve the body cause differential organ shrinkage, organ locations shift, and blood vessels look very different in stasis than they do when blood is flowing through them.  

[Image: A man wearing a VR headset in front of a model of the human skeleton.]

To address these discrepancies, and to accommodate the widest possible range of student learning styles, a team from the medical school, with the guidance of mentors at HP, has been exploring the use of virtual reality (VR) technologies to augment traditional methods of teaching human anatomy. Working with a combination of purchased three-dimensional model sets of brain and vasculature structures and imaging data generated during diagnostic procedures such as CT and MRI scans of living patients, Schwartz and his team are creating VR models and experiences. The idea is to give students additional tools for understanding the three-dimensional organization of body and organ systems as they appear in living humans.

An important part of this effort is the medical school imaging group, which provides the DICOM (Digital Imaging and Communications in Medicine) data sets that form the basis of the VR models. The available data sets include not only image stacks for every anatomical system, but also stacks for many of the pathologies that affect these systems.
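To give a sense of the kind of preprocessing that sits between raw DICOM image stacks and a VR-ready volume, here is a minimal sketch. The function and parameter names are hypothetical, but the underlying conventions are standard DICOM: raw CT pixel values are converted to Hounsfield units (HU) via the RescaleSlope and RescaleIntercept tags, and a window (center and width) is then applied to map the tissue range of interest to displayable intensities.

```python
import numpy as np

def slices_to_volume(pixel_arrays, slope=1.0, intercept=-1024.0):
    """Stack 2D CT slices into a 3D volume of Hounsfield units (HU).

    DICOM stores raw detector values; HU = raw * RescaleSlope + RescaleIntercept.
    (0 HU corresponds to water, -1024 HU to air.)
    """
    volume = np.stack(pixel_arrays, axis=0).astype(np.float32)
    return volume * slope + intercept

def apply_window(volume_hu, center=40.0, width=400.0):
    """Map an HU window (e.g. soft tissue: center 40, width 400) to [0, 1]
    for rendering; values outside the window are clipped."""
    lo, hi = center - width / 2.0, center + width / 2.0
    return np.clip((volume_hu - lo) / (hi - lo), 0.0, 1.0)

# Example with two synthetic 2x2 raw slices (illustrative values only):
raw = [np.array([[1024, 1064], [1224, 1424]]),
       np.array([[1024, 1024], [1024, 1024]])]
hu = slices_to_volume(raw)   # convert raw values to Hounsfield units
vis = apply_window(hu)       # normalized intensities for a renderer
```

In a real pipeline a library such as pydicom would read the slices and their rescale tags from disk, and the normalized volume would then feed a mesh-extraction or volume-rendering step for the VR headset; this sketch only shows the intensity math.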

Since members of the initial team work primarily in neurology and the neurosciences, they are currently focused on applying these technologies to teaching about the nervous system. But the idea, says Gary Leydon, associate director for technology services, Yale Medical School Teaching and Learning Center, is that whatever is learned would be applicable to any other body system. “We are just beginning to explore the possibilities of VR space for teaching anatomy,” Leydon says. “Maybe we could create a VR model that would allow students to tunnel through the chambers of the heart.”  

Both formative and summative assessment capabilities are being built into the VR models for use in the classroom and for self-study. “Will using VR to manipulate and interact with anatomical structures generated from MRIs, CT scans and other imaging modalities enhance the ability of students to acquire a spatial understanding of these structures in health and disease?” Schwartz asks. “We think these technologies provide an enormous opportunity, but we’ll know more as this work continues over the next year or two.”