
August 20, 2019


The MindDesign tool will operate jointly as a digital CAD plug-in and a physically interactive creation interface, allowing users to fabricate design ideas directly from their imagination. It will read a user’s neural activity in response to emotive stimuli, using a commercially available EEG sensor headset, to lower the technological skill barriers of object design and fabrication. Beyond this basic functionality, the channel of communication between neurological activity and object design allows the user to develop a personal syntax of neurological stimuli for shaping forms.


  • Summer Sutton, Yale School of Architecture ’21 (PhD)

Featured technologies:

  • Sprout Pro by HP computer workstation
  • Dremel IdeaBuilder 3D printer
  • EMOTIV EPOC+ – Multi-Channel EEG System

How does what we imagine become what we create? Anyone who has ever seen a tree in the world can see a tree in the mind’s eye. But not everyone can take that imagined tree and draw, paint or sculpt that image into a form that resembles what the mind envisions.

For her Blended Reality project, Summer Sutton explored how thoughts and emotions might be converted into 3-D representations. She created the MindDesign tool, which she describes as “a design tool for generating physical objects from cognitive activity.”

Using an EEG (electroencephalogram) brain scanner and an immersive scanning workstation, the MindDesign tool operates both as a computer-aided design (CAD) software plug-in and a physically interactive creation interface that allows users to fabricate design ideas directly from their imaginations. The tool links a user’s neural activity in response to emotive stimuli, read through the EEG sensor headset, with an algorithmic digital manipulation of a 3-D scanned object.

A brain scanner device reading electric signals from the brain.

As the person wearing the headset sees emotive stimuli on a computer screen, each of the fourteen sensors in the headset reports neural activity in a different region of the brain. Points and lines that make up base forms are manipulated on X, Y, and Z axes (three dimensions) based on the data coming from the sensors and visible on the computer screen. The user can also stimulate neurological activity that will in turn be picked up by the sensors. All of the data manipulate points on a 3-D form that is being modeled via CAD in real time. The user can then print a physical 3-D model of the emotions, memories or thoughts that were generated during the process.
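The article does not publish MindDesign's actual algorithm, but the mapping it describes can be sketched in a few lines. The example below is a hypothetical illustration: it assumes fourteen channel readings (one per headset sensor) and displaces each point of a base form along the X, Y, and Z axes, with consecutive channels driving the three axes in turn. The function name, the round-robin channel assignment, and the `gain` parameter are all assumptions, not details from the project.

```python
NUM_CHANNELS = 14  # the EPOC+ headset described has fourteen sensors


def displace_vertices(vertices, channel_readings, gain=0.1):
    """Offset each (x, y, z) vertex by amounts derived from EEG channels.

    Hypothetical sketch: three consecutive channel readings drive the
    X, Y, and Z offsets for each vertex, cycling round-robin through
    the fourteen channels as the vertex index grows.
    """
    if len(channel_readings) != NUM_CHANNELS:
        raise ValueError("expected one reading per EEG sensor")
    displaced = []
    for i, (x, y, z) in enumerate(vertices):
        dx = gain * channel_readings[(3 * i) % NUM_CHANNELS]
        dy = gain * channel_readings[(3 * i + 1) % NUM_CHANNELS]
        dz = gain * channel_readings[(3 * i + 2) % NUM_CHANNELS]
        displaced.append((x + dx, y + dy, z + dz))
    return displaced


# Example: a single point at the origin, with every sensor reporting 1.0,
# moves by `gain` along each axis.
moved = displace_vertices([(0.0, 0.0, 0.0)], [1.0] * NUM_CHANNELS)
```

In a live session this function would run on every sensor update, so the modeled form deforms on screen in real time as the readings change.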

A photo of the interface.

“The MindDesign tool,” Sutton says, “widens design possibilities while also bridging the translational gap between conceptual ideas and physical reality.” As an example, she describes a boyfriend who wants to create a unique piece of jewelry for his girlfriend: a brain scan taken while he thinks loving thoughts about her could be translated into a gift that represents their relationship.

While Sutton is still working on fully automating the process, she notes that the forms created with the MindDesign tool will truly be the product of blended physical and digital realities: the physical world (original scanned object and final fabricated object); augmented reality (visualizing the effects of live neurological activity); 3-D design (CAD software plug‐in); and computer graphic and fabrication tools. In addition, she says, “I see this tool as a way to create meaningful and unique objects that are not automatically generated from culturally or socially learned aesthetics, and to open up the practice of design to those for whom reliance on fine motor skills inhibits design potential.”