The parts, gathered in the basement of Runnals by computer science major JP Perales ’21: one Feather Huzzah ESP8266 for sending wireless data, one BNO055 for detecting movement and orientation, an on/off switch for managing battery life, a 10K resistor, and a lot of cables—yellow for data, red for power, black for ground. Forty minutes later, and voila! The sensor was ready for rehearsal.
Welcome to Colby theater, 2019.
On this stage, an arm swing was enough to change the course of the show. Through the sensor wrapped around their wrist, each performer could trigger sounds or manipulate projections in real time. This was a new kind of artistic freedom, one that explored the interaction between humans and technology. Blending theater, music, and technology, Strings set a new precedent for interdisciplinary production.
“This kind of experimental theater production is not something that you typically see in a small college in Maine,” said Associate Professor of Composition and Theory Jon Hallstrom. “The fact that the students were willing to sort of hurl themselves off the beaten path was pretty exciting, and very unusual, too.”
Strings came to life as the capstone project of Colleen Wright ’19, a music interdisciplinary computation major, and Jay Huskins ’19, a theater interdisciplinary computation major. “What I like about the interdisciplinary major,” Wright said, “is that it lets me have a lot of freedom to explore different ways that music and computer science can coexist.”
In Strings, they did just that.
But how to do it? The two pondered over the summer and decided to go beyond creating a piece that “looked cool and did cool things,” Huskins said. “We knew we wanted to use technology as a representation of the real world,” Wright said. “We were hoping to examine and ask questions about how people interact with each other and with their environment.”
They would set out to design a work in which performers could impact their environment through technology and consequently be affected by their actions.
Rather than a written script, there would be character archetypes: Spotlight (Sam Barry ’20); Shadow (Cole Walsh ’19); Seeker (Brianna LaValle ’22); Activist (John Baker ’19); Accepting (Will Sideri ’20); and Hustler (Sakina Mustafa ’22). All had a super-objective and a power level (either high or low) determining how they could engage with their environment.
“I assigned them based on the feeling that I wanted for each character,” said Wright. She wanted Walsh’s character, the Shadow, to be overpowering and therefore gave him low-pitched, long-lasting sounds. Baker, as the Activist, got very short, punctuated sounds to fit his sharp actions. Spotlight could send static noise to the projections, while others could alter the contrast, blur images out, or change the hue.
To trigger these personalized effects, there needed to be a device. That’s when Perales’s sensors came into play. “I never expected to work on something like this,” Perales said. “So far, for me, computer science has been numbers and things that come out on screen, whereas here it taught me that it can be more than just that.”
Besides the sensors, there was another piece of key technology: the laser. “JP’s motion sensor is an individual interaction system about how one’s motion can make a difference,” said computer science and art double major Heidi He ’19, adding, “And the laser is corresponding to how the world responds to us.”
To that end, He programmed the laser to scan the empty stage before each performance. The system then compared these baseline readings against the new data coming in during the performance. When the performers entered the stage, the system detected their presence as they broke the laser’s infrared beams, and the computer triggered a sound at that moment. Proud of her work, He plans to share this code on GitHub, a code-sharing platform. She hopes that not just her code but all of the technology in Strings can be used by other students in the future.
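He’s code isn’t public yet, but the baseline-comparison idea she describes can be sketched in a few lines of Python. The function names, numbers, and threshold below are illustrative assumptions, not her actual code: the scanner records per-angle distances for the empty stage, and any reading that later comes back much shorter means something (or someone) is now blocking the beam.

```python
# Illustrative sketch of the baseline-comparison idea behind the laser system.
# Each list holds per-angle distance readings (in meters) from a spinning scanner.

THRESHOLD = 0.3  # a reading this much shorter than baseline means "someone is there"

def scan_baseline(readings):
    """Record the distances for the empty stage before the show."""
    return list(readings)

def detect_presence(baseline, current, threshold=THRESHOLD):
    """Return the angle indices where something now blocks the beam."""
    return [i for i, (b, c) in enumerate(zip(baseline, current))
            if b - c > threshold]

# Example: a performer steps in at angle index 2, shortening that reading.
baseline = scan_baseline([5.0, 5.0, 5.0, 5.0])
current = [5.0, 5.0, 3.8, 5.0]
hits = detect_presence(baseline, current)  # hits == [2]
if hits:
    pass  # here the real system would cue a sound for that moment
```

The appeal of this approach is that the stage itself needs no instrumentation: anything that differs from the empty-stage scan is, by definition, a performer.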
“They ran into all the technical challenges that you run into when you try and make something that’s practical, that works in real time and is robust,” said Professor of Computer Science Bruce Maxwell. Sensors’ batteries died during rehearsals. Programs needed to be debugged. Computers slowed down just before the show. “This was I think the biggest, most extensive thing that’s happened so far with this major. And it was definitely the most ambitious in combining projection, sensors, sound, and choreography,” he said. “I think it turned out really quite well.”
To develop the show’s structure, Magnus Pind Bjerre, a part-time faculty member at Parsons School of Design at The New School, joined the team. With years of experience in professional multimedia performance arts, he said most productions plug into existing systems in theater houses. “Here it’s quite different because the whole way of making the work and showing the work has been developed around custom-built devices and custom-built technology,” he said. “We have used existing software and existing software protocols, but in a custom-made way.”
The level of connectivity between all the systems running together in the piece was unusual too, he noted.
“When you take our use of interactivity and then you take the fact that we have students engineering and creating devices to trigger the software—that’s where I think it gets pretty unique, especially at a liberal arts level,” said Associate Professor of Theater and Dance Jim Thurston. “The fact that we just imagined as a whole collective to do this work and achieved it makes us competitive even in that professional context.”
Looking ahead to Colby’s plans for a new arts and innovation center—and its emphasis on interdisciplinary work—Thurston said Strings was an intentional test to see what that could be like. “With a work like this, an original work, you keep moving big building blocks into place, but you’re not one-hundred percent sure where you’re going to end up,” he said. “I think this surpassed all of our dreams.”
Sensor in the making
The sensor technology evolved over time to suit the show. Along the way, Assistant Professor of Computer Science Caitrin E. Eaton reviewed the sensor’s internal wiring and helped Perales attach all the components efficiently. For instance, they decided to add a power switch, along with a button for performers to activate the sensor during the performance. To connect those buttons, Perales first used solid wires but found them too stiff for the purpose. Eaton suggested he use stranded wires instead. “Now it’s very flexible, especially when wanting to attach it to a glove,” he said.
Why mount the sensor onto a glove? Because they found the wrist to be the best place for the sensor to pick up movement. Early in the process, they tried placing it higher up on the arm, which didn’t work as well.
To attach the sensor to the glove, Perales designed a box using a program called Autodesk Fusion 360. Made of PLA plastic, the box had special slits for a wristband to fit through. “It [the slit] also plays a nice little feature in holding the battery in place,” he said. The boxes were turned out by the Prusa MK3 printer in Colby’s Mule Works Innovation Lab. Once the box was printed, the sensor’s circuit board proved too big to fit inside, so Perales used a Dremel rotary tool, common in woodworking, to resize the board.
How did he learn to use the tool? “I looked up YouTube videos,” he said.
Once the device was assembled, the next step was programming it. Using a program called Max/MSP, a visual music programming language, Wright integrated different sounds into the performance. “We had the data coming in from the sensors and then that would go through a series of statements, trying to figure out which direction they [performers] were moving [their hand],” she explained. Depending on the direction of the hand gesture, the system would play the assigned sound through the speakers. When it didn’t work as expected, Perales acted as the sensor emergency responder. Always backstage, he was ready to debug or trigger the backup system at all times.
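Wright built her patch in Max/MSP, which is visual rather than text-based, but the chain of comparisons she describes can be approximated in Python. Everything below is an illustrative assumption, not the show’s actual logic or sound files: orientation deltas from the sensor are classified into a direction, and each direction looks up a character’s assigned sound.

```python
# Illustrative sketch (not Wright's actual Max/MSP patch) of the
# gesture-to-sound decision chain: sensor deltas -> direction -> sound cue.

SOUNDS = {  # invented per-direction sound assignments for this example
    "up": "shadow_drone.wav",
    "down": "activist_hit.wav",
    "left": "static_noise.wav",
    "right": "hue_shift.wav",
}

def classify_direction(dx, dy):
    """Decide which way the hand moved from orientation deltas."""
    if abs(dx) > abs(dy):
        return "right" if dx > 0 else "left"
    return "up" if dy > 0 else "down"

def sound_for_gesture(dx, dy):
    """Map a hand movement to the sound that should go to the speakers."""
    return SOUNDS[classify_direction(dx, dy)]

# Example: a strong rightward swing selects that direction's cue.
cue = sound_for_gesture(0.8, 0.1)  # cue == "hue_shift.wav"
```

In the real system, this decision ran continuously against the wireless stream from each performer’s wrist sensor, which is why dead batteries or a lagging computer could silence a character mid-scene.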
Reading the stage
Heidi He ’19 inherited the laser from Jerry Diaz ’19, who programmed it to talk to a computer. “There was a lot of digging that we had to do online,” he said. After time on internet forums trying to figure out the code, Diaz and Wright built a program that bridged the laser’s native C++ interface with the Python programming language. When He started working on the laser, she ran into her first problem: her Mac couldn’t see the laser. With more research and Diaz’s help, she cleared that hurdle and began programming the laser for Strider Theater. Initially, the laser sat on the floor, but it gave unclear data because it couldn’t distinguish between an object, one foot, or two feet. The solution: put the laser on a tripod in front of the stage. Its spinning infrared beam scanned the performers at hip height, while everything else on stage fell below its sweep.