2020

September 1, 2020

Despite the challenges posed by the COVID-19 pandemic, work within the Blended Reality project continues. Principal investigator Justin Berry is leading a wide range of activity, with teams exploring everything from a wheelchair-based virtual reality (VR) controller to new ways of teaching art and digital design courses. Highlights of project work are outlined below.

During the Fall of 2020 the Blended Reality community will gather every two weeks for a series of conversations with industry leaders and project teams. These meetings give Yale students and faculty opportunities to share their work, learn from each other, and get involved in extended reality (XR) project work. All of the projects currently underway have benefited from the conversation and cross-disciplinary collaboration at the heart of the Blended Reality community. Attendance is open to everyone at Yale. Email randall.rode@yale.edu to get on the invitation list.

Project highlights:

The Clamshell project: Justin Berry, along with former students Lance Chantiles-Wertz ’19 and Isaac Shelanski ’20, is building a device that lets creatives assemble new types of interactive experiences and interfaces from a wide range of sensors in hours rather than weeks. By simplifying the workflow of systems like Arduino, it makes interfaces that are more compact and robust, suited to real projects, and easy to update or adapt on the fly. A final version of the device is currently in manufacturing and will be beta tested by projects within the Blended Reality program this fall.

Lance Chantiles-Wertz (YC ’19) and collaborators
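The post does not detail the Clamshell's software side; as a rough illustration of the kind of sensor-to-project workflow it aims to simplify, the Python sketch below reads values from a serial-connected sensor board using the pyserial library. The port name, baud rate, and comma-separated message format are assumptions for illustration, not details of the actual device.

```python
# Hypothetical sketch: reading sensor values from a serial-connected board.
# The port, baud rate, and "name:value,name:value" message format are
# illustrative assumptions, not Clamshell specifics.
import serial  # pyserial

PORT = "/dev/ttyUSB0"  # assumed port; varies by machine and OS
BAUD = 115200          # assumed baud rate


def read_sensor_values(port=PORT, baud=BAUD):
    """Yield dictionaries of readings from lines like 'light:312,tilt:87'."""
    with serial.Serial(port, baud, timeout=1) as conn:
        while True:
            line = conn.readline().decode("utf-8", errors="ignore").strip()
            if not line:
                continue
            readings = {}
            for field in line.split(","):
                name, _, value = field.partition(":")
                if value:
                    readings[name] = float(value)
            yield readings


if __name__ == "__main__":
    for readings in read_sensor_values():
        # Map readings onto an interaction parameter, e.g. a light's brightness.
        print(readings)
```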

Turing Project: Matthew Suttor, Dakota Stipp, Farid Abdul, Liam Bellman-Sharpe, Hugh Farrell
In collaboration with a team of theatre artists, programmers, and designers, School of Drama faculty member Matthew Suttor is composing an opera about Alan Turing. I AM ALAN TURING is not a biographical drama, but a theatricalized Turing Test in which any of the performers may or may not be Alan Turing. The libretto, drawn from material generated by OpenAI’s GPT-2 natural language model trained on Turing’s writings, sets artificial intelligence technology against the question of what it means to be human. The team, based in New Haven, New York, Vancouver, and Dublin, will present work in progress as part of CCAM’s Wednesday Wisdom series on November 11 at 7:00 PM.
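The post doesn't describe the team's exact toolchain, but as a rough sketch of how GPT-2 text generation of this kind works, the Python snippet below uses the Hugging Face transformers library to sample candidate lines from a prompt. The stock gpt2 checkpoint stands in for a model fine-tuned on Turing's writings, and the prompt and sampling settings are illustrative assumptions.

```python
# Minimal sketch of GPT-2 text generation with Hugging Face transformers.
# The stock "gpt2" checkpoint stands in for a model fine-tuned on Turing's
# writings; the prompt and sampling settings are illustrative assumptions.
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

prompt = "Can machines think?"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_length=60,           # keep generated fragments short
    do_sample=True,          # sample instead of always taking the top token
    top_p=0.9,               # nucleus sampling
    temperature=0.8,
    num_return_sequences=3,  # several candidate lines to choose among
    pad_token_id=tokenizer.eos_token_id,
)

for i, sample in enumerate(outputs, 1):
    print(f"--- candidate {i} ---")
    print(tokenizer.decode(sample, skip_special_tokens=True))
```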

Wheelchair Driven Interface for VR: Mary Ben Apatoff, Justin Berry, Yetsa Tuakli-Wosornu
This summer, we worked on the initial stages of a wheelchair-driven interface for virtual reality. Due to the ongoing COVID-19 pandemic, much of the actual testing and building stalled; however, we made the most of our remote situation. We laid out questions, narrowed down functional and design goals, designed and ideated on different aspects of the interface, connected with collaborators, and even began to test our ideas on a real wheelchair and interface. This project aims to reimagine what the interface for virtual reality could look like and to explore what happens when you flip the script and look to differently abled users as a source of opportunities rather than limitations.

Detail from the wheelchair controller prototype

Teaching Art in VR: Justin Berry, Anahita Vossoughi
School of Art professors Justin Berry and Anahita Vossoughi are seeing how far remote learning can be pushed, providing virtual reality systems to all of their students and holding class sessions in VR. They are testing what a virtual classroom can look like and what it means to work with students to create their own unique learning environment.

Peabody Diorama Capture Project: Michael Anderson, Collin Moret, Kailen Rogers
As part of the multi-year renovation and expansion of the Peabody Museum of Natural History, the nature dioramas will be completely restored. During summer 2019 the Blended Reality project helped test digital capture techniques on one of the dioramas. Building on that work, students and faculty working with Blended Reality will be part of the team taking the glass off these historic dioramas and using photogrammetry to transform them into virtual models that can be experienced in virtual and augmented reality.
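Photogrammetry reconstructs 3D geometry from many overlapping photographs, and an early step is finding matching feature points between image pairs. As a minimal sketch of that step (not the museum team's actual pipeline), the Python snippet below uses OpenCV's ORB detector and brute-force matcher on two overlapping photos; the file names are placeholders.

```python
# Minimal sketch of one early photogrammetry step: detecting and matching
# feature points between two overlapping photos with OpenCV. File names are
# placeholders; a full pipeline would repeat this across hundreds of images
# and then solve for camera poses and dense geometry.
import cv2

img1 = cv2.imread("diorama_view_01.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("diorama_view_02.jpg", cv2.IMREAD_GRAYSCALE)
if img1 is None or img2 is None:
    raise FileNotFoundError("Place two overlapping photos next to this script.")

orb = cv2.ORB_create(nfeatures=2000)          # fast, license-free feature detector
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

print(f"{len(matches)} candidate correspondences between the two photos")
```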

Mobile Point Cloud Capture: Randall Rode, Farid Abdul, Christina Strohmann
This team is reviewing hardware and software technologies for creating 3D video and still captures for use in virtual or augmented reality experiences. Large-scale, expensive, and complex 3D capture systems have been used in professional production for years. The goal of this project is to test off-the-shelf systems that use simple cameras, low-cost software, and standard computers, and to make it possible for faculty and students to readily record or live stream 3D objects and actors for use in their VR/AR projects.

Testing the Kinect depth camera SDK
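To give a sense of the data involved, here is a rough Python sketch of the core math for turning one depth frame into a 3D point cloud using pinhole camera intrinsics. The intrinsic values and the synthetic depth frame are placeholders; a real pipeline would pull both the frames and the calibration from the depth camera's SDK.

```python
# Rough sketch: converting a depth image to a 3D point cloud with pinhole
# camera intrinsics. The intrinsics and random depth frame are placeholders;
# a real pipeline would read both from the depth camera's SDK.
import numpy as np


def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Back-project an (H, W) depth image in meters into an (N, 3) point cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    z = depth_m
    x = (u - cx) * z / fx                           # pinhole back-projection
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]                 # drop pixels with no depth


if __name__ == "__main__":
    # Placeholder intrinsics and a synthetic 480x640 depth frame.
    fake_depth = np.random.uniform(0.5, 4.0, size=(480, 640))
    cloud = depth_to_point_cloud(fake_depth, fx=525.0, fy=525.0, cx=319.5, cy=239.5)
    print(cloud.shape)  # (N, 3) XYZ points in meters
```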

The Verb Collective: Produced by Justin Berry and recent graduate Bobby Berry, this toolkit for creating interactions in virtual reality and for desktop applications will soon be updated with new features. Available on the Unity Asset Store, the kit breaks complicated code structures down into tiny, well-documented fragments, making it easier to get started and easier to learn the most commonly referenced functions.