Gest

keywords: gest, gesture, gesticulate, howyousay, line, phrasing, prosody.

Overview

It all started with a half-baked idea:

2020-07-29 17:19:40: a music sequencer, but for producing continuous gestures instead of discrete steps. have it be clocked with an external source so it can play well with others. perhaps build a notation system around it. #halfbakedideas

Gest (pronounced "jest") is a sequencer for gestures. Conceptually, it lies somewhere between a classic step sequencer and an automation curve editor.

Timing in Gest is controlled via an external clock signal, referred to as the conductor signal. The conductor signal is expected to be a phasor: a periodic rising ramp signal in the range 0 to 1. Each period is treated as one musical beat in time.
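To make the conductor idea concrete, here is a minimal sketch of a phasor generator (not Gest's actual implementation; the function name and parameters are my own, for illustration only):

```python
def phasor(freq_hz, sample_rate, n_samples):
    """Generate a phasor: a periodic ramp rising from 0 toward 1.

    Each completed period is treated as one musical beat, so a
    conductor running at 120 BPM would use freq_hz = 2.0.
    """
    phase = 0.0
    inc = freq_hz / sample_rate
    out = []
    for _ in range(n_samples):
        out.append(phase)
        phase += inc
        if phase >= 1.0:  # wrap back around at the end of a beat
            phase -= 1.0
    return out
```

For example, `phasor(2.0, 8, 8)` produces two beats' worth of ramps at a (toy) sample rate of 8 Hz, wrapping back to 0 after the fourth sample.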

Once instantiated with a conductor signal, Gest can then be configured to produce gestures: audio-rate continuous control signals intended to modulate synthesis parameters (such as the frequency of an oscillator, or the cutoff of a filter). Gestures are chunked into groups known as phrases, which are blocked out to be a fixed duration in units of beats.

A phrase, once created, analyzes the conductor signal and resynthesizes a new signal slowed down by a factor of N, where N is the duration of the phrase in beats. This produces a single ramp from 0 to 1 that spans the full duration of the phrase. This particular operation is known as a monoramp. The monoramp acts as a blank canvas for the phrase, and can be divided up equally into an arbitrary number of smaller ramps. This operation produces a polyramp. Contiguous ramp segments in a polyramp can be merged together into monoramps, and then divided into new polyramps. This process produces a tree of ramps, known as a ramptree.
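The stretch-then-subdivide idea can be sketched in a few lines. This is a simplified illustration, not Gest's code; the names `monoramp` and `polyramp` follow the text, but the signatures are my own assumption:

```python
def monoramp(conductor_phase, beat_index, n_beats):
    """Stretch the conductor into one ramp spanning n_beats beats.

    conductor_phase: current phasor value in [0, 1)
    beat_index: how many whole beats of the phrase have elapsed
    Returns a single 0-to-1 ramp covering the whole phrase.
    """
    return (beat_index + conductor_phase) / n_beats

def polyramp(mono_phase, n_divisions):
    """Divide a monoramp equally into n_divisions smaller ramps.

    Returns (segment_index, local_phase): which subdivision the
    phrase is currently in, and a 0-to-1 ramp local to it.
    """
    scaled = mono_phase * n_divisions
    idx = min(int(scaled), n_divisions - 1)
    return idx, scaled - idx
```

Applying `polyramp` again to one of the local ramps is what grows the structure into a ramptree.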

Every leaf node of a ramptree gets appended with a discrete scalar value known as a target. Using the normalized continuous values from the ramptree, targets can be interpolated together to produce line segments. The way in which one target moves to the next is known as a behavior.

A behavior can be any function that takes in an alpha value and produces an output involving two values, A and B. Behaviors are typically expected to be continuous at the endpoints: when alpha is 0, the output should be A; when alpha is 1, the output should be B.

Some examples of behaviors can include: linear, exponential, step, and smoothstep.
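The behaviors listed above can be sketched as simple interpolation functions. These are illustrative versions, not Gest's implementations (in particular, the `shape` parameter of the exponential curve is my own assumption); each one satisfies the endpoint condition of returning A at alpha 0 and B at alpha 1:

```python
import math

def linear(a, b, alpha):
    return a + (b - a) * alpha

def step(a, b, alpha):
    # hold at A, jumping to B only at the very end of the segment
    return b if alpha >= 1.0 else a

def smoothstep(a, b, alpha):
    # cubic Hermite curve: eases in and out of the segment
    t = alpha * alpha * (3.0 - 2.0 * alpha)
    return a + (b - a) * t

def exponential(a, b, alpha, shape=3.0):
    # exponential curve, normalized so t goes exactly 0 -> 1
    t = (math.exp(shape * alpha) - 1.0) / (math.exp(shape) - 1.0)
    return a + (b - a) * t
```

Any function with this shape could be dropped in as a new behavior, which is part of what makes the concept fertile ground for exploration.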

Code and Documentation

Code for Gest can be found on sourcehut: https://git.sr.ht/~pbatch/gest.

Gest is written as a literate program using worgle. The woven output of this program can be found at gest_program.

In addition to the program itself, there is also a very concise guide that aims to introduce the concepts of Gest, using examples written in sndkit via the LIL scripting language. This can be found at gest_guide.

Updates

Be sure to check out the TODO page for gest to see what is being planned for the future.

Below are entries from my personal zet related to gest:

2022-01-24 12:50:43: the new rephasor algorithm doesn't work. I found a bug when I tried slowing it down by a factor of 0.25.

2022-01-13 15:55:45: not only would a VM allow for more procedurally generated gestures, but it would also open the door for coordinating multiple gestures.

2022-01-13 15:54:17: working out a new algorithm for a rephasor that has the ability to stay somewhat synchronized with the external phasor. If that works, I can build a gesture synthesizer that is programmed with a VM, similar to (seqvm). That would be very cool.

2021-12-27 21:01:28: made a quick notation language for making gestures for (gest), inspired by what I've been using in scheme for things like (looptober_2021). Calling it (gestlang).

2021-09-16 10:04:21: scalars implemented in (gest) today, which required creating a thing called target actions. all this for gate signals, because I realized that for melodic sequencing, I wanted rests and silence. should be helpful for my #synthwavefromscratch project.

2021-09-05 12:19:05: metatargets implemented in (gest) today! I also created a nice little example that shifts between targets with varying weights, naturally warping the tempo of the sequence. A very chopin-y kind of compression and expansion of tempo, with very little notation required to get there.

2021-09-04 10:51:03: metatargets are the first of the metathings to come to (gest). metatargets allow multiple targets to be selected for a node. metabehaviors would allow different behaviors to be selected for a given target. metaramps would allow different subtrees to be selected for a given monoramp. I won't claim it will meet all my generative music needs, but wow would it be close if I could get these all in there working together in a nested way.

2021-09-04 10:39:59: this restructuring/refactoring effort of (gest) I have been doing has been a slow and careful process spanning a few days. there have been many things to consider, and those things ended up having considerations too. should be worth it in the end, because metatargets will be a thing, which will yield more interesting procedurally generated gestures in the future.

2021-08-31 21:05:22: one of the more interesting things in (gest) are the use of behaviors, which dictate how two targets interpolate between one another. there's a lot of fertile ground for exploration here. lots of (howyousay) energy.

2021-08-31 21:03:08: major bugfix with (gest) today. Now it is starting to feel useful. Next thing I am thinking about is how to incorporate more generative/procedural elements. Also, adding more behaviors.

2021-08-22 14:59:50: In scheming my next breathing card, I found myself wanting to be able to convert clocks into phasors to do frame-accurate video stuff and use (gest). I've put this off for a while because there's no great way to do it, but I think clkphs is about as good as it can get. Now I need to adjust gest to ignore negative conductor signal values and I should be good to go!

2021-07-22 12:09:28: wrote an initial notation system for composing gestures in (gest), loosely inspired by Tidal notation. It's currently implemented in scheme via (monolith). Feels promising.

2021-07-21 20:28:44: much debugging today in (gest), and what do I have to show for it? a melody that is a ripoff of rite of spring, being performed on a flat FM oscillator. baby steps in the right direction...

2021-07-20 21:02:59: ah. I sorta promised I wouldn't do it because (toys_not_tools), but I went and added exponential and bezier behavior to (gest). Now it does basically everything (libline) set out to do, but better. Goodbye libline, you were mighty in purpose and potential, but deeply flawed.

2021-07-19 19:54:35: some very initial (gest) bindings to (scheme) via monolith. should allow for more expressive notation soon.

2021-07-19 19:47:42: (gest) now has a (TODO) page: (gest_TODO).

2021-07-17 12:43:01: gest has been added to the (loom).

2021-07-17 12:30:06: guide pretty much written. things work well enough. migrating codebase to its own git repo. hoping to make an initial gest page for the (loom) soon, containing the program and the guide.

2021-07-15 18:36:40: new milestone in (gest): removed debug printf statements :)

2021-07-15 09:22:36: just merged all my test gestures together into one medley in (gest) and it is quite satisfying.

2021-07-14 22:25:33: some initial work on temporal weights in (gest). It's starting to tap into some (howyousay) energy. I already got my example working with two interpretations of how to shape the tempo fluctuations.

2021-07-14 16:13:44: I'm thinking about writing a supplemental literate program consisting of the program wrapped in a sndkit node, as well as a set of sample programs. This would all be published at the (loom) eventually.

2021-07-14 16:12:35: things are actually working now in (gest). woo! even the initial test examples I've made are inspiring.

2021-07-13 11:29:01: yeah, error accumulation is going to be something I'll need to focus on. I just turned on my naive solution and of course it doesn't work.

2021-07-13 11:27:09: lots of progress with my implementation. An initial ramptree with polyramps and targets with linear behavior mostly works. I'm running into some timing issues now, so there may need to be some better checking of time.

2021-07-09 20:18:54: hoping to spend this weekend working on (gest). there's a lot to think about. words and some code have been written already these past two days.

2021-06-26 18:09:51: wrote some initial words down for what will be the gest program. slowly figuring out the implementation details.

2021-06-24 09:25:31: gave the (gest) page an initial overview. it's a pretty good distillation of the concepts I have so far.

2021-05-29 11:46:12: I haven't figured out a convenient way to notate these gestures. Some sort of DSL probably. That's going to require some thought.

2021-05-29 11:44:51: these ramps would then be used as variables in sequencing generators for various interpolation methods to get from point A to point B.

2021-05-29 11:43:24: I have been basically imagining gestures being constructed out of a hierarchy of ramp signals generated from a phasor clock source. N number of phasor periods makes a single base ramp, which can be further subdivided into smaller ramps, which can be further subdivided, and so on.

2021-05-29 11:39:58: well, not arbitrary precision. it's floating point. but for rhythmic subdivision in music with room for microtimings, it's practically arbitrary precision.

2021-05-29 11:38:12: phasors are pretty awesome because they contain continuous time data. It's pretty trivial to subdivide a phasor, and even scale it too. It's also very easy to turn phasors into clock signals with arbitrary precision.

2021-05-29 11:34:55: I was stuck for a while thinking how one could synchronize line generators with clock signals. Then I realized you could have a clock signal be a phasor instead of a tick signal like metro.

2021-05-29 11:31:59: it has been several months since I updated this page, but I've been thinking about this project off and on.
