August 20, 2019

Plants build themselves out of air, water, light, and the nutrients in the soil. Embedded in the opaque tissues of their roots, leaves, and stems are intricate vascular systems that move water from the soil to the leaves through microscopic vessels narrower than a human hair. Hundreds or even thousands of these water conduits can fit inside a stem the thickness of a pencil.

Since the advent of plant microscopy in the late 17th century, investigating how plants function has primarily meant examining specimens on microscope slides, and traditional teaching methods have presented plants’ complex vascular anatomy as flat, two-dimensional images. A challenge in teaching plant physiology, says Craig Brodersen, assistant professor of plant physiological ecology at the Yale School of Forestry and Environmental Studies (FES), is “teaching students the relationships between form and function in structures they cannot see with the naked eye.”

For several years, Brodersen’s lab at FES has been using high-resolution X-ray micro-computed tomography (microCT), a technique based on the same principles as medical CT scanning but applied at a much smaller scale, to collect data on complex plant anatomy. Jay Wason, a postdoctoral scientist working with Brodersen, has been using 3D visualization and computer modeling to study how four dominant tree species in the northeastern U.S. might be modifying their interconnected water conduits (collectively known as xylem) in response to climate change.
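To make that workflow concrete: the lab’s exact software pipeline is not described in this article, but a minimal Python sketch of the general microCT-to-model process might look like the following, loading a stack of scan slices, extracting a tissue surface with the marching cubes algorithm, and writing a mesh file that 3D and VR viewers can open. The file names and intensity threshold here are hypothetical placeholders.

    # Illustrative sketch only; file names and threshold are hypothetical
    # placeholders, not the Brodersen lab's actual pipeline.
    import tifffile
    from skimage import measure

    # Load the microCT scan as a 3D volume: one grayscale image per slice.
    volume = tifffile.imread("xylem_scan.tif")  # shape: (slices, height, width)

    # Marching cubes traces the surface where tissue density crosses a
    # chosen threshold, yielding vertices and triangular faces.
    verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)

    # Write a Wavefront OBJ file that most 3D and VR viewers can open.
    # OBJ indexes vertices from 1, hence the +1 on each face index.
    with open("xylem_mesh.obj", "w") as f:
        for v in verts:
            f.write(f"v {v[0]} {v[1]} {v[2]}\n")
        for face in faces:
            f.write(f"f {face[0] + 1} {face[1] + 1} {face[2] + 1}\n")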

Photo: One student smiles as she passes a VR headset to another.

During the fall 2017 semester, Brodersen and Wason tested virtual reality (VR) as a teaching tool in Brodersen’s graduate course in plant ecophysiology (the study of interactions between plants and their environments). Using VR headsets, students interacted with the xylem models, picking up structures and moving them around in full three-dimensional views. Microscopic tubes and cells became room-sized spaces that students could walk through while the instructors narrated what they were seeing.

The pilot raised two questions: (1) whether the approach could be scaled up from this relatively small class, in which 15 students used the equipment one at a time; and (2) to what extent immersive 3D teaching techniques enhance student engagement and learning.

Currently, several of Brodersen and Wason’s interactive xylem models are available on a website that is easily accessed on any device (phones, tablets, computers). While interacting with those models in VR at a large scale remains a challenge, Brodersen says they are planning to use Google Cardboard headsets to bring virtual reality and 3D movie technologies to larger groups.
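The article does not detail how the models are packaged for browsers, but one common approach, assumed here purely for illustration, is to convert meshes to glTF, a compact format that web and mobile VR viewers (including Cardboard-style headsets) load efficiently. A one-step Python sketch using the trimesh library, with hypothetical file names:

    # Hypothetical example: convert a mesh to binary glTF (.glb) for
    # delivery over the web; not the lab's documented pipeline.
    import trimesh

    mesh = trimesh.load("xylem_mesh.obj")
    mesh.export("xylem_mesh.glb")  # single-file binary glTF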

As to the second question, Wason says, “The engagement was definitely there! Students loved moving around inside the plants, and we have to make sure that we guide them appropriately.” In partnership with Yale’s Center for Teaching and Learning, the pair will assess how pedagogically effective their 3D teaching modules are compared with traditional 2D instruction. Brodersen and Wason are also creating lesson plans that will allow other instructors to generate or acquire 3D models, movies, and virtual reality demonstrations for use in their own classrooms.

References and related information: Brodersen Lab, 3D Data Portal, http://campuspress.yale.edu/brodersenlab/3d-data-portal/