MOTIONMATRIX

Category
AI
HCI
Location
New York, NY
Timeline
2025
Collaborators
Zijie Zhou, Bo Li
Info
MotionMatrix is a gesture-controlled audio-visual prototype developed for the Design for Physical Interaction course. The system connects a wearable glove interface, an LED matrix display, real-time audio manipulation, and generative visuals into a unified performance environment.
Using motion sensors embedded in the glove and an MQTT-based communication protocol, performers control both sound and light through hand gestures. Each LED pixel acts as a dynamic visual unit, forming a responsive “living pixel” surface that reacts to movement and sound in real time.
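As an illustrative sketch of how glove readings might travel over MQTT, the snippet below serializes sensor values into a JSON payload. The topic name and payload fields here are assumptions for illustration, not the project's actual protocol.

```python
import json

# Hypothetical topic for glove-to-display messages; the real
# MotionMatrix topic naming may differ.
GESTURE_TOPIC = "motionmatrix/glove/gesture"

def encode_gesture(accel, flex, timestamp):
    """Pack raw glove sensor readings into a JSON payload for an MQTT publish."""
    return json.dumps({
        "t": timestamp,        # reading time in milliseconds (assumed unit)
        "accel": list(accel),  # 3-axis accelerometer values
        "flex": list(flex),    # per-finger flex readings, normalized 0.0-1.0
    })

def decode_gesture(payload):
    """Parse a payload back into a dict on the TouchDesigner / LED side."""
    return json.loads(payload)

# Round-trip example with made-up sensor values:
msg = encode_gesture(accel=(0.1, -0.4, 9.8),
                     flex=(0.2, 0.9, 0.9, 0.1, 0.0),
                     timestamp=1024)
data = decode_gesture(msg)
```

Keeping the payload as plain JSON makes the same message readable by every component on the broker, whether it drives audio, the LED matrix, or the generative visuals.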
I was primarily responsible for the TouchDesigner visual system, audio control and manipulation, the LED screen's visual logic, parts of the glove interaction design, and the communication protocol connecting all components.
This implementation serves as a prototype for a larger gesture-controlled performance system, exploring how wearable interaction devices can orchestrate immersive audio-visual environments across larger spatial installations.
Final Outcome



Project Statement

This project was inspired by Imogen Heap’s Mi.Mu gloves, a pioneering gesture-based music performance interface that allows performers to manipulate sound through body movement. Building on this idea, our project explores how gesture control can extend beyond audio to influence both sound and visual media simultaneously.
Rather than focusing solely on musical control, we investigated how embodied gestures could orchestrate a multi-modal performance system where movement directly shapes light, sound, and spatial visual feedback. By linking gestures to generative visuals and LED-based displays, the system transforms the performer’s body into a live interface for an immersive audio-visual environment.
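One way such a gesture-to-media mapping can work is to scale a hand movement's motion intensity into both an audio gain and an LED brightness value. This is a minimal sketch with assumed sensor units and ranges, not the project's actual mapping code:

```python
import math

def motion_magnitude(accel, gravity=9.8):
    """Hand acceleration magnitude with gravity removed (assumed units: m/s^2)."""
    return abs(math.sqrt(sum(a * a for a in accel)) - gravity)

def map_gesture(accel, max_motion=20.0):
    """Map motion intensity to an audio gain (0.0-1.0) and LED brightness (0-255)."""
    intensity = min(motion_magnitude(accel) / max_motion, 1.0)
    gain = intensity                   # faster movement -> louder audio
    brightness = int(intensity * 255)  # faster movement -> brighter pixels
    return gain, brightness

# A hand at rest reads roughly (0, 0, g), so intensity stays near zero:
gain, brightness = map_gesture((0.0, 0.0, 9.8))  # → (0.0, 0)
```

Driving sound and light from the same intensity value is one simple way to keep the audio and visual responses perceptually linked to a single gesture.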
Technical Architecture


Progress




Reflections
This project taught me how to integrate digital and physical prototyping within a single interactive system. Developing it required coordinating multiple components, including the wearable glove interface, the LED matrix hardware, generative visuals, and audio manipulation, into a coherent real-time pipeline.
Through this process, I gained experience managing complex interaction systems that span hardware, software, and network communication, and learned how gesture-based interfaces can be translated into responsive audio-visual feedback for interactive performance environments.



