
Role

Technical Design Lead

Project length

Jan-May 2024

Software

TouchDesigner

Overview

Sync is an interactive, immersive piano visualizer. It takes MIDI and audio input from a piano keyboard and synthesizes visuals that dynamically flow and dance alongside the music being played. It is a spatial art piece designed to pair with musical performances, creating a new and captivating experience for both the performer and the audience.

Demo Video

Design Process

This project began with experimentation. Designing a visual response to music was an extensive process that went through countless iterations of design and discussion within the team. And that was only the start: building a working product that renders these visuals from keyboard input and audio output required learning TouchDesigner, a program that was completely new to me and the team.

Learning TouchDesigner 

Alongside ideating on the visual design of our piece, the first month of this project was spent getting hands-on with the TouchDesigner software. After learning the fundamentals through a quick crash course on YouTube, I began experimenting with its interactive capabilities and with external devices.

A simple node network I made during my learning process that displays an animated circle whose position is mapped to the external MIDI input.
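In TouchDesigner this mapping is wired up with MIDI In CHOPs, but the underlying logic is simple. A minimal standalone sketch of it, assuming the standard MIDI 0–127 ranges for note number and velocity:

```python
# Hypothetical sketch: map a MIDI note number (0-127) and its velocity
# (0-127) to normalized (x, y) coordinates for an animated circle.
# TouchDesigner does this with CHOP channels; this mirrors the math.

def midi_to_position(note: int, velocity: int) -> tuple[float, float]:
    """Map note number to x (0..1) and velocity to y (0..1)."""
    x = note / 127.0
    y = velocity / 127.0
    return x, y
```

Which axis each value drives is an illustrative choice here, not necessarily the one used in the actual network.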

I experimented with several interactions; this one displays "strings" that simulate being plucked when their respective keys are pressed.
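A plucked string can be approximated as a damped oscillation: a sine wave whose amplitude decays exponentially after the key press. A minimal sketch of that displacement curve, with illustrative frequency and damping values rather than the ones used in the piece:

```python
import math

def pluck_displacement(t: float, amplitude: float = 1.0,
                       frequency: float = 3.0, damping: float = 2.0) -> float:
    """Displacement of the string t seconds after the pluck:
    a sine wave whose amplitude decays exponentially over time."""
    return amplitude * math.exp(-damping * t) * math.sin(2 * math.pi * frequency * t)
```

Sampling this function each frame gives the wobble that settles back to a straight line, which is the visual effect described above.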

The cylindrical geometry simulated how the visuals would look inside the Igloo 360 space in our lab.

After weeks of iterating and testing different visuals, we established four distinct environments and consolidated them into a centralized TouchDesigner file.

User Interface

Each of the visual environments was designed with parameters the user can adjust to tailor it to their performing style. With the vision of a performer tweaking the visuals in real time, we created a user interface that lets them switch between the different visual environments and adjust parameters as they'd like.

Architecture

With our visuals complete, it was time to project them into the Igloo space in our lab. This stage raised a few technical considerations. First, the keyboard we initially rented had hardware issues, so we swapped it out. Second, the PC connected to the Igloo projection system didn't have enough processing power to run our TouchDesigner files. To work around this, we ran the file on a separate, more powerful computer, connected the two machines via Ethernet, and sent the TouchDesigner output between them using NDI In/Out operators.
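Conceptually, the NDI link ships each rendered frame across the Ethernet cable from the render PC to the Igloo PC. Real NDI also handles discovery, timing, and compression, but the core pack-and-unpack step can be sketched like this (the frame format here is a made-up illustration, not NDI's actual wire format):

```python
import struct

# Hypothetical sketch of the idea behind the frame link: the render PC
# prefixes raw pixel data with its dimensions, and the receiving PC
# uses that header to reconstruct the frame.

def pack_frame(width: int, height: int, pixels: bytes) -> bytes:
    """Serialize a frame: two big-endian uint32s for size, then pixels."""
    return struct.pack("!II", width, height) + pixels

def unpack_frame(payload: bytes) -> tuple[int, int, bytes]:
    """Recover (width, height, pixels) from a packed frame."""
    width, height = struct.unpack("!II", payload[:8])
    return width, height, payload[8:]
```

In the actual setup the NDI Out operator on the render machine and the NDI In operator on the Igloo machine take care of all of this transparently.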

 

We then had to link the keyboard inside the Igloo to the computer outside, which we did using a MIDI-to-USB adapter and an extra-long USB extension cable. We also rented a speaker and a microphone, so that the microphone could pick up the sound from the keyboard and drive the background visuals.
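One common way to turn a microphone signal into a visual control value, and a plausible reading of how the background visuals react here, is to take the RMS level of each audio buffer and map it to a 0–1 intensity. TouchDesigner exposes this through audio CHOPs; a standalone sketch of the math, with `gain` as an illustrative tuning parameter:

```python
import math

def audio_intensity(samples: list[float], gain: float = 1.0) -> float:
    """RMS level of an audio buffer, scaled by gain and clamped to 0..1."""
    if not samples:
        return 0.0
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return min(1.0, rms * gain)
```

The resulting value can then drive any environment parameter, such as brightness or particle speed, so the background responds to how loudly the keyboard is being played.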

Diagram showing the architecture of our project setup.

Image of our team setting up our project in the Igloo space.

Final Display
