Creating Visual Mappings of Sporth Control Signals
The Documentation Problem
Whenever I get the opportunity, I try to map my sound-based work into the visual domain. In the past, I've built games and interactive sequencers. These realtime interactive works have always proven very satisfying, but they are very hard to document for a portfolio. Screen capturing has never been a satisfactory solution, and is often not a possibility for very CPU-intensive applications. You can film screens, but this diminishes quality. Remember too that you need to sync audio to that as well...
Before learning how to use OpenGL, I was very interested in generating video files. I would generate the sounds I wanted in Csound, and then write programs to generate a bunch of images responding to the sound, which could then be sewn together into a video. It was a slow, slow process, and the tools I built for the job were very brittle. It was a project I quickly abandoned for other, more fruitful pursuits. Until now.
I'm giving audio-visual video creation another shot with Pixku. Pixku allows you to generate audio-visual compositions using Runt and a handful of runt-libraries. Inside of Runt, one can define a patch written in Sporth via runt-plumber. Using runt-plumber, one is able to call runt-functions from Sporth and pass signals from Sporth back to Runt. The beauty of this is that Runt glues the audio and visual generation together in one process (and in one file as well!). I hope this slick integration will lead to fruitful work similar in vein and scope to Sporthlings.
Visual Mapping Tests
What is a good mapping? This is a perpetual question asked by new-music interface designers and audio-visual artists. As a starting point, I decided to use Pixku to generate some test videos that explore some elementary sonic-visual mappings. While not exactly musical, these simple examples provided some useful insight for future works.
Below are three initial mapping tests I made. Each example involves manipulating a circle in a 2D space using Sporth signals mapped to visual parameters and audio parameters. With each example I added one or two new dimensions of control. My main goal was to make it clear, both visually and sonically, what the mappings were doing. As I started adding more dimensions, more effort was needed to decorrelate the signals so that they could stand out.
Each example below has a frame from the generated video. Clicking on the frame will download the video. Each video is 10 seconds long (approximately 200kb, encoded with libx264 via FFmpeg).
Initial Circle Test: mapping to radius
The first mapping test was mainly done to test out the software. It has a single dimension of control. In the audio space, the control maps to frequency. In the visual space, the control maps to the scale of the circle.
The control signal itself is a low-frequency sinusoidal oscillator (LFO), whose frequency is being modulated by another low-frequency oscillator.
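That control structure can be sketched numerically with a phase accumulator. The frame rate and all the frequencies below are illustrative guesses, not the values from the actual Sporth patch:

```python
import math

def fm_lfo(n_frames, fps=60, base_hz=1.0, depth_hz=0.5, mod_hz=0.1):
    """Sample an LFO whose frequency is swept by a second, slower LFO.
    All rates here are illustrative, not the patch's actual values."""
    out, phase = [], 0.0
    for n in range(n_frames):
        t = n / fps
        # instantaneous frequency wobbles around base_hz
        freq = base_hz + depth_hz * math.sin(2 * math.pi * mod_hz * t)
        phase += 2 * math.pi * freq / fps  # accumulate phase each frame
        out.append(math.sin(phase))
    return out

control = fm_lfo(600)  # 10 seconds of control data at 60 frames/sec
```

Accumulating phase per frame (rather than evaluating the sine at the raw time value) keeps the sweep smooth as the frequency changes.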
Building on the previous example, this one adds control of position along with scaling, using the following mappings:
- X axis: frequency
- Y axis: reverb level, filter cutoff
- Scale: amplitude
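One way to realize mappings like these is to rescale a single bipolar control signal into each parameter's range (in the spirit of Sporth's bipolar-scaling ugens). The parameter ranges below are hypothetical, chosen only to illustrate the idea:

```python
def biscale(x, lo, hi):
    """Map a bipolar control value x in [-1, 1] to the range [lo, hi]."""
    return lo + (hi - lo) * (x + 1.0) / 2.0

# one control value driving both an audio and a visual parameter
# (ranges are illustrative, not the patch's actual settings)
x = 0.0
freq  = biscale(x, 200.0, 800.0)  # audio: oscillator frequency in Hz
scale = biscale(x, 0.1, 0.5)      # visual: circle radius, normalized
```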
All the signals used are sinusoidal LFOs generated by Sporth. The XY signals are in-phase and quadrature sinusoids, which is a fancy way of saying they are 90 degrees out of phase with one another at the same frequency. This is what causes the circular motion.
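The circular motion falls directly out of the math: a cosine and a sine at the same frequency parameterize a circle. A small sketch, where the center, radius, and rate are made-up values:

```python
import math

def circle_point(t, freq=0.5, cx=0.5, cy=0.5, r=0.25):
    """In-phase/quadrature pair: same frequency, 90 degrees apart.
    Center, radius, and rate are illustrative values."""
    phase = 2 * math.pi * freq * t
    x = cx + r * math.cos(phase)  # in-phase component
    y = cy + r * math.sin(phase)  # quadrature component
    return x, y

x, y = circle_point(0.25)  # position a quarter-second into the motion
```

Every sampled point sits exactly `r` away from the center, which is why the 90-degree offset traces a circle rather than a line.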
This final example adds a mapping to color. In this case, it is a single value that scales the RGB value equally, which causes the "brightness" to change. In the audio domain, this maps to timbral color.
It took a bit of trial and error to get clear mappings. For one thing, I began noticing how similar signals caused mappings to melt into one another. For instance, timbre and volume both used periodic signals that moved at the same frequency, which caused those mappings to be grouped together both visually and sonically. Changing the frequency of one of the signals to something more distinct decorrelated it and gave it its own space.
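That effect can be quantified: two sinusoids at the same frequency are perfectly correlated, while moving one to a different rate drives their correlation toward zero. A small illustration with made-up rates:

```python
import math

def correlation(a, b):
    """Zero-lag normalized correlation of two equal-length signals."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def lfo(hz, n=1000, sr=100):
    """Sample a sine LFO: 10 seconds at 100 samples per second."""
    return [math.sin(2 * math.pi * hz * i / sr) for i in range(n)]

same     = correlation(lfo(0.5), lfo(0.5))  # identical rates: fully correlated
distinct = correlation(lfo(0.5), lfo(0.8))  # different rate: near zero
```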
To make things a bit more interesting, the frequencies of the XY signal were changed to produce a 3:4 lissajous curve instead of a circle.
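A sketch of that 3:4 XY signal pair, assuming an arbitrary base rate and keeping the 90-degree offset on one axis:

```python
import math

def lissajous(t, rate=0.1):
    """XY position on a 3:4 Lissajous curve; the base rate is illustrative."""
    x = math.sin(2 * math.pi * 3 * rate * t)                # 3 cycles on X
    y = math.sin(2 * math.pi * 4 * rate * t + math.pi / 2)  # 4 cycles on Y
    return x, y
```

With a 3:4 ratio the figure closes on itself once both axes complete a whole number of cycles (every 10 seconds at this rate).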
In Sporth, the unit generator in charge of producing the main sound source has been swapped from a subtractive sawtooth patch to an FM pair. FM oscillators have a much more dynamic timbral structure than a subtractive patch, which I hoped would make the sonic mapping to timbre clearer.
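The essence of a two-operator FM pair is a modulator sinusoid bent into the carrier's phase, with the modulation index acting as a timbral "brightness" control. A minimal sketch with guessed carrier/ratio/index values, not the actual patch settings:

```python
import math

def fm_pair(t, carrier=440.0, ratio=2.0, index=3.0):
    """Two-operator FM: a modulator sinusoid varies the carrier's phase.
    carrier, ratio, and index are illustrative, not the patch's values."""
    mod = math.sin(2 * math.pi * carrier * ratio * t)
    return math.sin(2 * math.pi * carrier * t + index * mod)
```

Driving `index` from a control signal is a simple way to tie visual brightness and audible brightness to the same source.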