
Semi-Conductor

Interactive orchestral conductor in-browser

Information about Semi-Conductor

What it is

Semi-Conductor is a browser-based interactive experiment that converts a user's arm motions into controls for an orchestral performance. It runs entirely in the browser, using the PoseNet machine learning model to detect body poses through a webcam and interpreting the gesture data to alter musical parameters such as tempo, volume, and instrumentation. An internal algorithm reads a score and plays along with the user's conducting, assembling the audio from a large collection of very short samples taken from live instrument recordings. The project was created by a Google Creative Lab team in Sydney (Rupert Parry, Melissa Lu, Haylie Craig, Samantha Cordingley) and published in December 2018. The site notes that the experiment is no longer active, though the page retains documentation and access to the source code.
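As a rough illustration of the pose-to-music idea (not the project's actual code), PoseNet returns a pose as a list of named keypoints with image coordinates and confidence scores, and a mapping function can turn a wrist position into a control value. The function names, threshold, and mapping below are hypothetical:

```javascript
// Hypothetical sketch of mapping a PoseNet-style pose to a musical control.
// PoseNet poses have the shape:
//   { keypoints: [{ part, position: { x, y }, score }, ...] }

function findPart(pose, part) {
  return pose.keypoints.find((k) => k.part === part);
}

// Map raw gesture data to a conducting control (volume here).
// Assumes the video frame height is known; y grows downward in image space,
// so a raised hand gives a small y and a larger volume.
function gestureToControls(pose, frameHeight = 480) {
  const wrist = findPart(pose, "rightWrist");
  if (!wrist || wrist.score < 0.5) return null; // low-confidence frame: ignore
  const volume = 1 - wrist.position.y / frameHeight;
  return { volume: Math.min(1, Math.max(0, volume)) };
}
```

In the real experiment, frames arrive continuously from the webcam, so a mapping like this would run on every detected pose to produce the immediate playback changes described below.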

Key features

The experiment provides in-browser, real-time motion tracking via PoseNet to translate detected poses into musical controls. Users manipulate tempo, adjust volume, and switch instrumentation by moving their arms in front of a webcam; those gesture inputs are interpreted continuously so motion produces immediate changes in playback. Sound output is sample-based, constructed from hundreds of very short audio files recorded from live instruments, and synchronized with an algorithm that follows a written score. The implementation relies on TensorFlow.js to run client-side in the browser. The project page includes options to launch the experiment (when available) and to retrieve source code, and it displays build credits and links to privacy and terms information.
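The score-following, sample-based playback described above can be sketched as a scheduler that converts score positions (in beats) into playback times at the tempo set by the conductor. The score format and sample names below are hypothetical, purely for illustration:

```javascript
// Hypothetical scheduler sketch; not the experiment's actual implementation.
// A tiny "score": each entry names a short instrument sample and the beat it
// starts on. Real scores and sample names would differ.
const score = [
  { sample: "violin_C4_short", beat: 0 },
  { sample: "cello_G2_short", beat: 1 },
  { sample: "flute_E5_short", beat: 2.5 },
];

// At a given tempo (beats per minute), one beat lasts 60 / bpm seconds.
// Returns absolute start times (in seconds) for each sample.
function scheduleScore(notes, bpm, startTime = 0) {
  const secondsPerBeat = 60 / bpm;
  return notes.map((n) => ({
    sample: n.sample,
    time: startTime + n.beat * secondsPerBeat,
  }));
}
```

In a Web Audio implementation, each scheduled entry could be played by calling `start(time)` on an `AudioBufferSourceNode` holding that sample; conducting faster raises the bpm and compresses the schedule, which is how gesture speed would translate into tempo.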

Use cases

Semi-Conductor is presented as an interactive demonstration of machine learning applied to music and gesture control within a web environment. It can be used by people who want to experiment with conducting an ensemble from a browser, by educators or students exploring pose-detection and real-time audio synthesis concepts, and by developers seeking an example of TensorFlow.js and PoseNet deployed client-side. As part of the AI Experiments collection, the project serves as a creative technology showcase and a reference implementation for data-driven, webcam-based interaction tied to musical output.
