August 20, 2019

2017 – 2018

Musical instruments have come a long way since the Upper Paleolithic period — about 40,000 years ago — when the first modern humans blew into flutes crafted from bird bones. Today’s music-makers have a vast array of woodwinds, brass, percussion, and strings that can be bowed, blown, struck, plucked, or shaken to create anything from jazz riffs to symphonies.

And they can also make interesting musical sounds by moving little wooden blocks across a visual field. 

Two students working with Max/MSP.

This past year, Yale College students worked with Konrad Kaczmarek, assistant professor of music, in Yale’s Music Technology Lab to experiment with ways to use technology for music making. “The question we were exploring,” says Kaczmarek, “was how do we integrate tools associated with immersive technology — technology that blurs the line between the physical world and the digital or simulated world — into music production, sound synthesis, and collaborative performance.”

Using the HP Sprout immersive computing system, one group of students developed a way to make wooden blocks of various shapes trigger musical sounds, acting as a type of virtual music box. With the blocks placed on the Sprout’s projection surface, the Sprout’s downward-facing camera read the blocks’ positions, orientations, and shapes. The project was led by students Soledad Tejada (Yale College ’20) and Tomaso Mukai (Yale College ’19) with assistance from Nikola Kamcev (Yale College ’19).
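
For readers curious about the mechanics, the sketch below approximates the block-reading step in Python with OpenCV. It is a hypothetical illustration, not the students’ actual implementation: the thresholding step, the contour analysis, and the `detect_blocks` helper are all assumptions about how a downward-facing camera might recover each block’s position, orientation, and rough shape.

```python
import cv2

def detect_blocks(frame):
    """Return (center, angle_in_degrees, vertex_count) for each block-like contour."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Otsu thresholding separates the wooden blocks from the projection surface
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # OpenCV 4 returns (contours, hierarchy)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)

    blocks = []
    for c in contours:
        if cv2.contourArea(c) < 500:                    # skip small noise
            continue
        (cx, cy), _, angle = cv2.minAreaRect(c)         # position and orientation
        approx = cv2.approxPolyDP(c, 0.02 * cv2.arcLength(c, True), True)
        blocks.append(((cx, cy), angle, len(approx)))   # vertex count as a crude shape label
    return blocks
```

In a pipeline like this, the vertex count would distinguish, say, a triangular block from a square or a rounded one, while the rotation angle of the bounding rectangle stands in for orientation.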

Wooden blocks with designs projected on them on top of a green grid.

To enhance the visual appeal of the experience, the students used projection mapping — having the Sprout’s projector overlay moving images on the blocks (for instance, making the semi-spherical ones resemble turning soccer balls). The controller, programmed in Max (a visual programming language for music and multimedia), contains a moving electronic bar that acts as a timeline and sweeps across the objects, triggering musical sounds as it reads the location, shape, and orientation of each block. The visuals projected onto the blocks also change with the timing of the music. Because the arrangement of the blocks determines the sound, the musicians compose by moving the blocks around the Sprout’s projection surface.
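
To make the timeline idea concrete, here is a minimal, hypothetical Python sketch of a sweeping bar that fires a note whenever it passes a block, with shape and orientation mapped to pitch and loudness. The students built their controller in Max; the `play_note` callable, the pitch and velocity mappings, and the constants here are illustrative assumptions, not their design.

```python
import time

SURFACE_WIDTH = 1.0      # normalized width of the projection surface
SWEEP_SECONDS = 4.0      # duration of one pass of the timeline bar

def sweep(blocks, play_note, steps=200):
    """blocks: list of (x, y, angle, shape_id) tuples with x, y in [0, 1];
    play_note: any callable taking (pitch, velocity), e.g. a MIDI sender."""
    triggered = set()
    for i in range(steps):
        bar_x = (i / steps) * SURFACE_WIDTH               # bar position this tick
        for j, (x, y, angle, shape_id) in enumerate(blocks):
            if j not in triggered and x <= bar_x:         # bar has crossed block j
                pitch = 48 + shape_id * 4 + int(y * 12)   # shape and height pick the pitch
                velocity = 40 + int(angle) % 80           # orientation sets the loudness
                play_note(pitch, velocity)
                triggered.add(j)
        time.sleep(SWEEP_SECONDS / steps)

# Example: print the notes instead of playing them
sweep([(0.2, 0.5, 30.0, 2), (0.7, 0.1, 75.0, 4)],
      lambda pitch, velocity: print(f"note {pitch} vel {velocity}"))
```

Rearranging the blocks changes when each note fires within the sweep, which is what lets the musicians compose spatially rather than by writing notation.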

Kaczmarek is particularly interested in the interface between the aural and the visual. Outside of his work under the Blended Reality research program, he recently guided a senior composition student, Jack Lawrence, in using Max to create an immersive visual landscape composition featuring guitar, singers, piano, trumpet, and cello.

Kaczmarek is also interested in how immersive technologies can be used to augment or contribute to collaborative music-making — for instance, a group working together remotely on the same musical project. Another focus is gesture-based music creation — using one’s full physical presence as a way of controlling various aspects of sound. “These technologies are becoming part of our everyday lives,” Kaczmarek says. “I’m glad to have enthusiastic students who want to test their possibilities for music creation.”