Students at the College of Arts and Sciences are seeing and experiencing remarkable places—without ever leaving the classroom.
Wearing mixed-reality headsets, they can walk amid 3D holographic images of a 2,000-year-old Greek religious site; explore the three-dimensional geologic contours of a Hawaiian volcano; and examine the inner workings of a centuries-old computing device.
A Case Western Reserve University innovation center pioneered the development of these immersive journeys. In 2014, it created the first education application for Microsoft HoloLens headsets: an app to teach anatomy to medical students.
In more recent years, this center—the Interactive Commons—has worked with faculty at the college and across campus to create about 20 customized apps for many disciplines. The technology allows students to interact simultaneously with holographic images, classmates and their professors—an experience known as mixed reality.
“The level of insight that mixed reality brings to a broad range of disciplines is just unattainable through traditional teaching methods—and it’s increasing student knowledge,” said Mark Griswold, PhD, a CWRU professor and faculty director of the Interactive Commons.
Pilot studies have found, for example, that CWRU medical students learn anatomy twice as fast with the HoloAnatomy app as they do dissecting cadavers. At the college, early HoloLens pioneers include art historian Elizabeth Bolman, PhD, the Elsie B. Smith Professor in the Liberal Arts, whose students use the technology to “visit” Egypt’s fifth-century Red Monastery; and dance professor Gary Galbraith (CIT ’86; GRS ’88, dance) who has choreographed dances in which holographic images change location based on performers’ movements.
Academic leaders hope to see the technology used campuswide by 2030—and future students arriving with their own headsets in backpacks, said Griswold, also the Pavey Family Designated Professor of Innovative Imaging-Revolutionizing the Worlds of Education and Medicine.
“It’s going to be the way of the future of learning,” he said. Read on to explore a few examples of how the technology is used across the college.
Neptune is the planet farthest from the sun, and the dwarf planet Pluto orbits farther still—3.7 billion miles on average, compared to just 93 million for Earth. But the sheer scale of those distances became clear in intriguing new ways as students using a customized HoloLens app searched for the holographic orbs.
Neptune wasn’t even in the classroom, and the more distant Pluto appeared as a faint dot.
“The biggest part for me was [seeing] the proximity from each planet to another,” said Katelyn Lamm, now a junior.
That’s precisely the takeaway Professor Steven Hauck, PhD, wanted. To make it happen, Hauck said, he worked first with staff at the Interactive Commons, relying on their “incredibly deep expertise in developing mixed-reality apps,” and then with University Technology, which “integrate[s] these apps into courses and provides support during class time.”
Hauck, chair of the Department of Earth, Environmental, and Planetary Sciences, also uses a HoloLens app and headsets in geology classes. He’s found that students show more curiosity and gain a better understanding from 3D holographic images.
That’s true for senior Sebastian Cangahuala. When he looks at a 2D topographical map, it takes time to review elevation numbers and mentally construct a landscape’s geographic contours and layers above and below Earth’s surface. “But with the HoloLens, the 3D model is right there for me to see,” he said.
Pilgrims entering a sacred sanctuary on the Greek island of Samothrace more than 2,000 years ago would have had an “affective, emotional experience,” said Maggie Popkin, PhD, the Robson Junior Professor and an associate professor of art history. It’s an encounter that is difficult to replicate even when visiting the Sanctuary of the Great Gods at Samothrace because the site is in ruins.
Yet, thanks to a HoloLens app, students can “ascend” holographic rows of white and purple stone in the sanctuary’s outdoor theater. And they can relate to the awe and sense of revelation visitors felt as they reached the top and saw the landmarks below as well as the Nike of Samothrace, an iconic Greek statue that once soared above the theater and faced the Mediterranean Sea with wings sculpted in motion.
“There are all these visual lines, and conversations between all these different architectural features, that you just couldn’t imagine if you weren’t in the space,” said Madeline Newquist, a doctoral student in art history.
Popkin worked with the Interactive Commons to develop the app, using data collected for decades and 3D models created by American Excavations Samothrace, an international research consortium. She said the immersive app democratizes the experience for students and researchers who can’t afford to travel to Greece and “gives a much more visceral, memorable and meaningful sense of what it meant to move through this space in antiquity as somebody coming to have this kind of transformative religious experience.”
Immersive Technologies and Sacred Spaces
Thanks to a recent college Expanding Horizons Initiative grant, art historian Maggie Popkin, PhD, and Justine Howe, PhD, an associate professor and chair of religious studies, are developing an interdisciplinary course—“Embodied Religion and Mixed Reality”—to explore how immersive technologies, such as HoloLens, shape people’s experiences of sacred spaces. It will be offered starting this spring. They also plan a book on the subject.
During a spring biology class, students stood in clusters around holographic images of a human body. At times, they circled the images before them, bowing their heads in unison to examine the inferior vena cava; the great vessels of the thorax and abdomen; and the intercostal muscles that help form the chest wall.
The experience allowed them to take “a closer look at each different part of the body” and then pull back to examine the entire system, said Anoushka Gidh (CWR ’23), a senior at the time.
Welcome to “Human Anatomy and Physiology Laboratory,” the first undergraduate class on campus fully designed and developed around the HoloAnatomy Software Suite that campus collaborators initially created for medical students.
“It’s a world-class medical education tool brought to the undergraduate level,” said Cheryl Knight, a University Technology senior learning designer who worked throughout the semester with the biology faculty members who integrated HoloAnatomy into their course.
The immersive experience gave the undergraduates a better way to understand the relationships between organs and other body structures. Traditional plastic models and specimens served as supplementary learning tools.
Experiencing the body in two ways allowed students to solidify their knowledge, said Richard Drushel, PhD (WRC ‘84; GRS ‘93, biology), who co-created the course with fellow senior instructor in biology Susan Burden-Gulley, PhD (GRS ’95, neurosciences).
With HoloLens, “they can step inside the body, look around and better appreciate system functions,” said Burden-Gulley, who received a teaching innovation grant from the college’s Expanding Horizons Initiative to co-develop the course.
In 2010, Associate Professor Paul Iversen, PhD, began studying the Antikythera mechanism, likely constructed during the first century B.C. and considered among the world’s first analog computers. The size of a shoebox and constructed mostly of bronze, it could predict the timing of eclipses and track and visualize the astronomical positions of the sun, moon and five planets visible in antiquity. And by indicating the correct year of six panhellenic athletic competitions, including the Olympic Games, the device helps scholars unravel mysteries about ancient events.
Today the mechanism sits behind plexiglass in Athens’ National Archaeological Museum, a disconnected series of fragments carefully pried apart after its 1901 undersea discovery as a fused-together lump.
Iversen initially used high-resolution 3D images to decipher the mechanism’s inscriptions and analyze its calendar. Then in 2021, Griswold from the Interactive Commons approached him about creating a HoloLens app to better study the device. To do that, Iversen first needed to hone his understanding of the mechanism’s inner workings by moving painstakingly through X-ray images, sometimes one-tenth of a millimeter at a time.
“As an epigrapher [transcribing ancient texts] and historian, I had never really paid close attention to the gears,” said Iversen, who chairs the Department of Classics. “But this detailed work helped me to understand how they worked.”
And that in turn changed how he interprets the difficult-to-read inscriptions. He now knows, for example, how the gear sizes provide clues about the length of planetary celestial periods referenced in the mechanism.
This ancient computer is “about human discovery,” said Iversen, who has used the HoloLens app to expose students and many off-campus organizations to the device. And now, through the power of 21st-century technology, he’s better able to further his research and share what he calls “a scientific wonder of the ancient world.”
For Assistant Professor Rachel Mulheren, PhD, the HoloAnatomy app is indispensable in the undergraduate class she teaches on the anatomy and physiology of speech and hearing.
It enables students to see connections that will be crucial in their careers, like understanding which parts of the nervous system could be affected by a stroke—and what that means for speech and language functions.
Anthony Zogaib, a senior, said the holographic images provide an up-close and detailed look at the auditory system, including the ear canal and middle and inner ear.
“Normally those are really hard to see because they’re completely surrounded by your skull,” Zogaib said. But using a kind of mid-air holographic control to subtract other body parts for a better view of the system, he added, “we were able to see them to scale.”