August 20, 2019

Mention virtual reality (VR) to the average person and the conversation inevitably starts with first-person shooters. They imagine themselves trespassing across a scarred battlefield, a dungeon, or a post-apocalyptic landscape, holding some sort of gun in front of them as they try to kill as many zombies/demons/aliens as they can.

And, sadly, that’s kind of accurate. Despite the potential for many innovative, society-changing applications for VR, the most popular experience in the medium is shooting someone or something.

A group of Yale students, Lance Chantiles-Wertz (’20, mechanical engineering), Isaac Shelanski (’20, physics), and Sara Abbaspour (MFA), decided that the medium could do better, and they set about making it easier to create more versatile, perhaps more creative and inclusive, experiences in VR.

Their starting place? Reimagine the VR controller.

“Existing controllers have one main action. You point and you click—just like a gun. As an artist, I don’t really have that type of interaction,” Abbaspour said. “We want to be able to build virtually any controller that can support any experience that anyone can imagine.”

There was one catch: no single controller, no matter how innovative and different, could enable every experience anyone might imagine. If you think about it, Shelanski explained, using your hands to pet a mythical creature and conducting microscopic laser surgery should require different tools.

Building a Platform Instead of a Controller

As the team investigated different designs and experiences, one thing became clear: the controller would have to be modular, built as an open-source platform where different sensors could be plugged in and swapped out at will without extensive coding. That way, anyone could easily customize the controller with different sensors depending on the experience they wanted to create.
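As a rough illustration of what that plug-and-play pattern could look like, here is a minimal Arduino-style C++ sketch. The interface, the pressure-sensor class, and the pin assignment are all hypothetical; the team's actual firmware is not described in detail, so this is only a sketch of the general idea, where every module implements one small interface and the core loop never contains sensor-specific code.

```cpp
// Hypothetical firmware sketch for a modular controller.
// Names, classes, and pin choices are illustrative only.

// Every pluggable module implements the same tiny interface,
// so the core firmware never needs sensor-specific code.
struct SensorModule {
  virtual void begin() = 0;        // one-time hardware setup
  virtual float read() = 0;        // normalized reading, 0.0 to 1.0
  virtual const char* name() = 0;  // label reported to the host
  virtual ~SensorModule() {}
};

// Example module: a pressure pad wired to an analog pin.
class PressureSensor : public SensorModule {
  int pin;
public:
  PressureSensor(int analogPin) : pin(analogPin) {}
  void begin() override {}
  float read() override { return analogRead(pin) / 1023.0f; }
  const char* name() override { return "pressure"; }
};

// Swapping sensors means editing this list, not rewriting firmware.
SensorModule* modules[] = { new PressureSensor(A0) };
const int moduleCount = sizeof(modules) / sizeof(modules[0]);

void setup() {
  Serial.begin(9600);
  for (int i = 0; i < moduleCount; i++) modules[i]->begin();
}

void loop() {
  // Stream "name:value" pairs so the host can treat all modules alike.
  for (int i = 0; i < moduleCount; i++) {
    Serial.print(modules[i]->name());
    Serial.print(':');
    Serial.println(modules[i]->read());
  }
  delay(50);  // roughly 20 updates per second
}
```

Under this kind of scheme, adding an eye tracker or a magnetic haptic module would mean writing one more class against the same interface, which is exactly the low-friction extensibility the team is after.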

For example, a developer building an experience that allows immobilized users to drive a racecar could add a motion sensor that tracks eyeball movement. Pressure sensors on a glove could mimic the sensation of molding clay or holding something in your hand. Or magnets could be used to simulate friction or textures. The possibilities are endless.

“I’ve always thought that VR could be a powerful empathy engine that allowed people to interact with other people in new, interesting ways. This team is proving that,” said Justin Berry, principal investigator for the Blended Reality program.

The Genesis of an Idea

The idea of a new controller was born at a conference that Berry and Chantiles-Wertz attended in Ireland. By the end of the trip, the two had sketched out a variety of new controller designs ranging from a rose to a clamshell that fit over the hand like a glove. Back on campus, Chantiles-Wertz teamed up with Shelanski to build the electronics from an Arduino controller and a Spark circuit board, while Abbaspour put the finishing touches on several designs, including what the trio dubbed the clamshell.

The next step is to figure out how to connect various sensors to the controller in a way that isn't intimidating for non-developers, and then get that data flowing seamlessly into the Unity game engine. According to Shelanski, a series of USB ports or Bluetooth could be potential options. Machine learning might even enable the controller to configure itself when a new sensor is plugged in, whether or not it has seen that type of sensor before.
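On the host side, one simple (and again hypothetical) way that data path might work is to parse the tagged frames before handing the values to Unity. The plain C++ sketch below reads the "name:value" lines the firmware sketch above emits; a real bridge would read from a serial port or a Bluetooth socket rather than standard input, which is used here only to keep the example self-contained.

```cpp
// Hypothetical host-side reader for the controller's tagged frames.
// A real bridge would read from a serial port or Bluetooth socket
// and forward values into Unity; stdin keeps this sketch runnable.
#include <iostream>
#include <stdexcept>
#include <string>
#include <unordered_map>

int main() {
  std::unordered_map<std::string, float> latest;  // newest value per sensor
  std::string line;
  while (std::getline(std::cin, line)) {
    auto sep = line.find(':');
    if (sep == std::string::npos) continue;       // skip malformed frames
    std::string name = line.substr(0, sep);
    try {
      latest[name] = std::stof(line.substr(sep + 1));
    } catch (const std::exception&) {
      continue;                                   // skip unparsable values
    }
    // Because frames are self-labeled, an unfamiliar sensor simply
    // appears as a new key here: no per-sensor code on the host, either.
    std::cout << name << " -> " << latest[name] << '\n';
  }
  return 0;
}
```

A self-describing stream like this is also what would make the auto-configuration idea plausible: the host learns what a module is from what it announces, rather than from hard-coded drivers.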

The semester is winding down, and the team is still in the prototyping stage. They have yet to manufacture a working controller based on Abbaspour's clamshell design, but it's easy to see how it could serve as a conduit for the various sensors that enable different experiences.

The hope is that a more inclusive controller can spark new, innovative ideas for VR applications and draw non-gamers into interacting with those environments.

With their help, that reality is just over the horizon.