
Interactive Audio with Unity, Ableton Live, and Max/MSP


A few years ago, as a music composition student, I was interested in using Unity to create interactive visuals for musical performances. I had already spent some time working with the visual scripting tool Max and thought it might be cool to learn how to combine the two. My research led me to UnityOSC, a project which allows you to send and receive OpenSoundControl (OSC) messages from Unity. OSC is a protocol for passing control messages between different programs over a network.

Armed with UnityOSC, I set out to create a basic system for imitating spatial audio. Since Max is designed specifically for processing audio, my plan was to do all audio processing in Max and send control data from Unity in real time. Working from the assumption that every sound has a source, I designated certain objects in the scene as sound sources and the camera position as the position of the listener’s ears. Each frame, the listener’s position and orientation are sent via OSC to Max, where each sound source (which I named VirtualSounds in Max) pans its audio according to the angle between the listener’s forward vector and the line from the listener to the source. The amplitude of each sound source is also scaled by its distance from the listener, and a low-pass filter is applied as distance increases to mimic real-world high-frequency attenuation.
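On the Unity side, the per-frame work is just reading the camera transform and each source’s position and pushing them out over OSC; all of the audible processing happens in Max. The sketch below is a simplification rather than the actual project code: the OscSender wrapper and the message addresses are assumptions, and a real setup would route them through whatever UnityOSC handler you have configured.

```csharp
using UnityEngine;

// Placeholder for whatever OSC library is in use (UnityOSC in the original
// project); Send here just stands in for that library's send call.
public class OscSender : MonoBehaviour
{
    public void Send(string address, params float[] values)
    {
        // e.g. forward to UnityOSC / your OSC client of choice.
    }
}

// Minimal sketch of the Unity-side sender. Every frame the listener (camera)
// transform and each sound source position are sent to Max over OSC, and Max
// handles the panning, attenuation, and low-pass filtering.
public class SpatialAudioSender : MonoBehaviour
{
    public Transform listener;          // the camera, i.e. the listener's ears
    public Transform[] virtualSounds;   // objects designated as sound sources
    public OscSender osc;               // hypothetical OSC wrapper

    void Update()
    {
        Vector3 p = listener.position;
        Vector3 f = listener.forward;

        // Listener position and facing direction, so Max can work out the
        // pan angle and distance for every VirtualSound.
        osc.Send("/listener/position", p.x, p.y, p.z);
        osc.Send("/listener/forward", f.x, f.y, f.z);

        for (int i = 0; i < virtualSounds.Length; i++)
        {
            Vector3 s = virtualSounds[i].position;
            osc.Send("/virtualSound/" + i + "/position", s.x, s.y, s.z);
        }
    }
}
```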

Here is a demo of the system in action (unfortunately the volume is quite low):

You may also have heard the audio occlusion effect around the 1:00 mark. This is done by casting a ray from the sound source towards the camera; if the ray is obstructed, a message is sent to Max to apply a low-pass filter according to how occluded the sound source is. This doesn’t account for bounced/indirect sound, but it still sounds pretty convincing. The core of the occlusion check looks something like the sketch below.
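This is a simplified reconstruction rather than the exact script: the OscSender helper (from the earlier sketch), the layer mask, the smoothing, and the OSC address are all assumptions.

```csharp
using UnityEngine;

// Simplified sketch of the occlusion check. Each frame a ray is cast from the
// sound source to the listener; anything it hits counts as an obstruction, and
// a smoothed 0-1 occlusion value is sent to Max, which maps it to the cutoff
// of a low-pass filter.
public class SoundOcclusion : MonoBehaviour
{
    public Transform listener;          // the camera / listener position
    public OscSender osc;               // hypothetical OSC wrapper (see above)
    public LayerMask occluders;         // geometry allowed to block sound
    public int soundId;                 // which VirtualSound in Max this maps to
    public float smoothing = 8f;        // how quickly occlusion fades in and out

    float occlusion;                    // 0 = clear line of sight, 1 = fully blocked

    void Update()
    {
        // Linecast returns true if anything on the occluder layers sits between
        // the source and the listener.
        bool blocked = Physics.Linecast(transform.position, listener.position, occluders);

        // Ease towards fully occluded or fully clear instead of snapping,
        // so the filter sweep sounds natural.
        float target = blocked ? 1f : 0f;
        occlusion = Mathf.MoveTowards(occlusion, target, smoothing * Time.deltaTime);

        osc.Send("/virtualSound/" + soundId + "/occlusion", occlusion);
    }
}
```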


Shortly after creating that spatial audio demo, I decided to switch my focus to creating an interactive demo for my graduation recital. By sending OSC messages to Unity from my audio software (Ableton Live in this case), I can trigger different behaviours.

Music software uses MIDI data to express musical information like pitch and velocity (loudness). A “note on” message consists of one integer representing the pitch (where 60 = middle C, 59 = B, 61 = C#, and so on) and another integer representing the velocity; a “note on” with a velocity of 0 is interpreted as a “note off”. So, in the demo video linked below, each note in the marimba and bass synthesizer parts is sent to Unity as a note on message and, when it ends, as a note off message.

To turn those messages into motion, I wrote the classes AdsrEnvelope and AdsrBehaviour, which I designed to function like ADSR envelopes. A received “note on” calls Trigger() on an AdsrBehaviour, which then moves a value through the attack, decay, and release times (sustain is an amplitude level, not a time interval). If the behaviour should be cut short when a “note off” is received, Toggle(false) can be called on the AdsrBehaviour to jump immediately to the release phase of the envelope. AdsrBehaviours are used on the glowing rocks and the beams of light in the distance.
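The real AdsrEnvelope and AdsrBehaviour classes aren’t reproduced here, but a stripped-down version of the idea — a value that ramps through attack and decay, holds at the sustain level, and falls away over the release time when toggled off — could look something like this (the field names and timing details are simplifications, not the actual classes):

```csharp
using UnityEngine;

// Stripped-down sketch of an ADSR-style behaviour: Trigger() starts the
// envelope, Toggle(false) jumps to the release phase, and Value can drive
// anything in the scene (light intensity, emission, scale, ...).
public class SimpleAdsrBehaviour : MonoBehaviour
{
    public float attack = 0.05f;   // seconds to rise from 0 to 1
    public float decay = 0.2f;     // seconds to fall from 1 to the sustain level
    [Range(0f, 1f)]
    public float sustain = 0.6f;   // level held while the note is on (not a time)
    public float release = 0.5f;   // seconds to fall from the current level to 0

    public float Value { get; private set; }

    enum Stage { Idle, Attack, Decay, Sustain, Release }
    Stage stage = Stage.Idle;
    float releaseFrom;             // level the release phase started from

    public void Trigger()          // call on "note on"
    {
        stage = Stage.Attack;
    }

    public void Toggle(bool on)    // Toggle(false) on "note off"
    {
        if (on) { Trigger(); return; }
        releaseFrom = Value;
        stage = Stage.Release;
    }

    void Update()
    {
        float dt = Time.deltaTime;
        switch (stage)
        {
            case Stage.Attack:     // rise to full level over the attack time
                Value += dt / Mathf.Max(attack, 0.0001f);
                if (Value >= 1f) { Value = 1f; stage = Stage.Decay; }
                break;
            case Stage.Decay:      // fall to the sustain level over the decay time
                Value -= dt * (1f - sustain) / Mathf.Max(decay, 0.0001f);
                if (Value <= sustain) { Value = sustain; stage = Stage.Sustain; }
                break;
            case Stage.Release:    // fall back to zero over the release time
                Value -= dt * releaseFrom / Mathf.Max(release, 0.0001f);
                if (Value <= 0f) { Value = 0f; stage = Stage.Idle; }
                break;
        }
    }
}
```

An OSC/MIDI receiver script then simply calls Trigger() on the matching behaviour when a note on arrives and Toggle(false) when the corresponding note off arrives.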


The camera uses a Cinemachine dolly track, and its position along that track is dictated by an automated parameter in my music software. This means the camera position can be driven by any value in the music software, such as the amplitude of a digital signal, a hard-coded automation curve, or, in my case, a hardware fader.
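A sketch of the receiving end, assuming the fader value arrives as a 0–1 float over OSC and the dolly path’s position units are set to Normalized (how the value actually reaches SetNormalizedPosition depends on your OSC receiver):

```csharp
using UnityEngine;
using Cinemachine;

// Sketch: maps an incoming 0-1 value (a fader or automation curve sent from
// Ableton over OSC) onto the camera's position along a Cinemachine dolly path.
public class DollyFromOsc : MonoBehaviour
{
    public CinemachineVirtualCamera vcam;
    CinemachineTrackedDolly dolly;

    void Awake()
    {
        dolly = vcam.GetCinemachineComponent<CinemachineTrackedDolly>();
    }

    // Call this from wherever OSC messages are parsed, e.g. "/camera/position 0.42".
    public void SetNormalizedPosition(float t)
    {
        // With the path's position units set to Normalized, 0 is the start of
        // the track and 1 is the end.
        dolly.m_PathPosition = Mathf.Clamp01(t);
    }
}
```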


And finally, here is an example of using OSC and a game controller to control a synthesizer. The right joystick controls the pitch, while the left trigger controls amplitude. The d-pad up and down buttons give a range of three octaves, and pressing the left stick switches between major and minor scales. It is difficult to play this setup precisely, but it could be simplified for use in a game. (The synthesizer tone is a little unpleasant, so I suggest lowering your volume.)
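The mapping itself boils down to quantizing the stick to a scale degree and sending the resulting note plus an amplitude to the synth over OSC. A rough sketch follows; the axis and button names ("RightStickY", "LeftTrigger", and so on), the OSC address, and the OscSender helper from the first sketch are assumptions that depend entirely on your input bindings.

```csharp
using UnityEngine;

// Rough sketch of the controller-to-synth mapping. The right stick picks a
// degree of the current scale, the d-pad shifts the octave, clicking the left
// stick flips between major and minor, and the left trigger sets amplitude.
public class ControllerSynth : MonoBehaviour
{
    public OscSender osc;               // hypothetical OSC wrapper (see above)
    public int baseNote = 48;           // MIDI note the lowest octave starts on

    static readonly int[] major = { 0, 2, 4, 5, 7, 9, 11 };
    static readonly int[] minor = { 0, 2, 3, 5, 7, 8, 10 };

    int octave;                         // 0..2, giving the three-octave range
    bool useMinor;

    void Update()
    {
        // D-pad up/down moves the whole range by an octave, clamped to three octaves.
        if (Input.GetButtonDown("DpadUp"))   octave = Mathf.Min(octave + 1, 2);
        if (Input.GetButtonDown("DpadDown")) octave = Mathf.Max(octave - 1, 0);

        // Clicking the left stick toggles between major and minor.
        if (Input.GetButtonDown("LeftStickPress")) useMinor = !useMinor;

        // Map the stick's -1..1 range onto the seven scale degrees.
        float stick = Input.GetAxis("RightStickY");
        int degree = Mathf.Clamp(Mathf.RoundToInt(Mathf.InverseLerp(-1f, 1f, stick) * 6f), 0, 6);

        int[] scale = useMinor ? minor : major;
        int midiNote = baseNote + 12 * octave + scale[degree];

        // Left trigger sets amplitude; Max turns the note/amplitude pair into
        // an oscillator frequency and gain.
        float amplitude = Input.GetAxis("LeftTrigger");
        osc.Send("/synth/note", midiNote, amplitude);
    }
}
```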