gesture sequencer updates, some code
I'm now able to sequence gestures using LIL, the scripting language included with sndkit.
The following LIL snippet uses two instances of Gest to control an FM oscillator. One controls pitch (via the function sequence), the other timbre (via the function modindex):
The meaningful thing to extract from this is that gestures are programmed using a set of low-level commands. These commands create phrases that take up a fixed number of beats, populate those phrases with Ramp Trees, and cap the leaf nodes with targets, each carrying a behavior that determines the interpolation method used to go from one target to the next.
Sets of simple low-level commands like the ones above are a programmer's best friend, because they lend themselves well to higher-level abstractions :)
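To make the target-and-behavior idea concrete, here's a minimal sketch of how two interpolation behaviors might evaluate between targets. The names here are invented for illustration; this is not Gest's actual API.

```c
#include <assert.h>

/* Invented names for illustration; not Gest's actual API. */
enum behavior { BEHAVIOR_STEP, BEHAVIOR_LINEAR };

/* Evaluate the signal between two targets a and b. pos is the position
 * inside the connecting ramp, normalized to [0, 1]. */
static double target_interp(double a, double b, double pos,
                            enum behavior bhv)
{
    switch (bhv) {
    case BEHAVIOR_STEP:
        return a;                 /* hold a until the next target */
    case BEHAVIOR_LINEAR:
        return a + (b - a) * pos; /* glide linearly toward b */
    }
    return a;
}
```

With step behavior a melody sounds quantized, like a traditional step sequencer; with linear behavior a note can glide continuously into the next one.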
It turns out you can plug in USB peripherals into the RG350 using a USB-C adapter (which I managed to have lying around). I started plugging various things in and checking dmesg to see what would show up.
Keyboards and mice show up, no surprises. It doesn't look like my Griffin or Grid gets detected (also no surprise). USB MIDI seems to be a no-go too (slightly surprising). My XP-PEN drawing tablet *does* show up using generic HID drivers (surprising!)
C programming question
How cursed is it to treat a "char *" as a generic pointer? I'm running into a situation where I may need to do something like this:
struct some_struct *data;
char *x;

data = allocate_this_somehow();
x = (char *)data;
Other than not using it in places where a string is expected, should I be aware of anything? Does it trigger crazy weird UB? Does the char pointer type do weird things that I should be aware of? etc. etc.
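For what it's worth, the C standard does bless this pattern: any object pointer may be converted to char * and back, and char * is allowed to alias any object for byte-wise access (it's the classic escape hatch from strict aliasing). A small self-contained check, with a made-up struct standing in for the real one:

```c
#include <assert.h>
#include <stdlib.h>

/* Hypothetical struct standing in for the real one. */
struct some_struct { int a; double b; };

/* Demonstrate round-tripping an object pointer through char *.
 * Returns 1 on success. */
static int char_ptr_roundtrip(void)
{
    struct some_struct *data = malloc(sizeof *data);
    if (data == NULL) return 0;
    data->a = 42;

    /* Converting to char * and back yields the original pointer. */
    char *x = (char *)data;
    struct some_struct *back = (struct some_struct *)x;

    int ok = (back == data) && (back->a == 42);
    free(data);
    return ok;
}
```

The main things to watch for: don't dereference the char * somewhere a NUL-terminated string is expected, and don't cast it back to the struct type at a misaligned offset after pointer arithmetic.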
Just merged all the gesture tests I made for myself into one medley, and I have to say the results are quite satisfying.
Again, this is just one gesture sequencer controlling only the pitch of a single FM oscillator. This is all step-sequenced, no human recorded performance. Very surprised with how natural and fluid it feels.
And it's all externally clocked so it plays well with others!
of course, this is just one interpretation of how to phrase it.
Instead of slowing down into the high note, one could speed up in anticipation before dramatically slowing down at the climax, more or less inverting the mass changes.
Sure, this version sounds a little unnatural, and it's not my favorite, but with a bit of tweaking it's on its way to being a valid interpretation.
This sort of thinking starts to get at the "hows" of computer-performed music and not the "whats", which is something I've been thinking deeply about:
Here's some temporal weight in action!
I'm tweaking the masses and inertias in some of the targets so that time compresses while reaching the high note, and expands on the following quarter note triplet.
Slightly exaggerated for dramatic effect!
The next thing I want to focus my attention on in my gesture sequencer is this idea of temporal weight. That is to say, the idea that every discrete target point in a gesture has the opportunity to influence the global tempo of the external conductor signal.
Every target can add or subtract temporal mass (how fast/slow the tempo is) as well as inertia (how quickly to react to changes in temporal mass).
This would allow a much more dynamic approach to tempo phrasing than anything I've seen before. Instead of having to draw an automation curve on a global tempo parameter for lyrical embellishments, I could just make the notes heavier.
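Here's a hypothetical sketch of how mass and inertia might map onto a conductor's tempo (the names and mappings are invented for illustration, not Gest's actual design): mass sets a target tempo, and inertia is a one-pole smoothing coefficient controlling how quickly the current tempo eases toward it.

```c
#include <assert.h>

typedef struct {
    double tempo; /* current tempo, in beats per second */
} conductor;

/* Hypothetical mapping: heavier targets imply a slower tempo.
 * Mass 1 leaves the base tempo untouched, mass 2 halves it. */
static double mass_to_tempo(double base_tempo, double mass)
{
    return base_tempo / mass;
}

/* inertia in [0,1): 0 snaps to the new tempo instantly, values near 1
 * react slowly, like a heavy flywheel. Called once per control tick. */
static void conductor_tick(conductor *c, double target_tempo,
                           double inertia)
{
    c->tempo = inertia * c->tempo + (1.0 - inertia) * target_tempo;
}
```

Each tick moves the tempo a fraction of the way toward the mass-implied target, so a heavy note drags the whole conductor down gradually rather than stepping instantly.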
This library implementing Pyin looks pretty nice, though I haven't tried it: https://github.com/xstreck1/LibPyin
If some kind soul familiar with licensing ins and outs could help me understand, I would appreciate it.
Can I include this library as a git submodule in my public domain project even though it's GPLv3?
Am I only allowed to use it if my project is also under GPLv3?
Maybe I can get permission for an exception from the authors?
The readme is very encouraging of its use, so hopefully the license won't be a major obstacle. 😕
Thanks for any insights. :)
Here it is with an FM oscillator instead of a sine wave. A bit more spectrum to play with.
Gesture sequencing continues!
After some bug fixing, polyramps can stack now!
This melodic phrase is a gesture that has eighth notes, quintuplet eighth notes, and quarter/eighth note triplets. It's on a loop, so I've programmed the last note to glissando back into the first note. This is possible because it's all a continuous audio-rate signal.
Keep in mind there is no predetermined grid here. All that's given is a phasor signal pulsing out beats like a conductor would (this is converted into a metronome heard in the recording). The sequencer is interpreting those beats and doing the subdivisions dynamically, just like a human performer would do.
gesture synthesizer updates
I added "step" behavior to make gestures sound more like a traditional step sequencer. Also makes it easier to debug.
I also implemented monoramp transformation.
So, if a polyramp takes a single 0-1 ramp and divides it into N equal steps, a monoramp takes N ramps and merges them into 1 ramp. From there, a polyramp can be applied to it, creating arbitrary divisions of rhythm.
This gesture example features two eighth notes, followed by a set of quarter-note triplets.
The quarter note triplets were made by creating a monoramp from 2 beats, then converting that monoramp into a polyramp of 3 beats.
I let the last note have linear behavior so it could dynamically gliss back into itself.
I found the issue. The good news is it was a mistake I made and not the fundamental problem with error accumulation I was dreading. That one can wait for another day.
Here's the gesture with a metronome attached to it. Note how this gesture is moving in time with the beat of the metronome.
In theory, I should be able to slow down the tempo and the gesture would automatically stay synchronized to it without having any prior knowledge about the tempo changes.
cards: tetrapolpo / artefatto_427
music: Mai Mai Mai - Upnos
I slowed down the external conductor signal, and it was gracious enough to expose some of the timing errors caused by round-off accumulation (I was waiting for those).
Now I gotta think about what I'm gonna do about that...
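Not the actual sequencer code, but the generic shape of the problem: a phasor that accumulates a per-beat increment drifts, because each addition rounds. Recomputing the phase from an integer beat count is one common way out.

```c
#include <math.h>

/* Accumulating: ten tenths don't sum to exactly 1.0 in floating point,
 * and the error grows with every beat. */
static double accumulate_tenths(int n)
{
    double phase = 0.0;
    int i;
    for (i = 0; i < n; i++)
        phase += 0.1;
    return phase;
}

/* Recomputing from an integer beat count rounds only once, so the
 * error stays bounded instead of accumulating. */
static double recompute_tenths(int n)
{
    return n * 0.1;
}
```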
Here's the gesture mapped to the frequency of a sine wave.
Ah. much better.
I teach computers how to sing.
Welcome to post.lurk.org, an instance for discussions around cultural freedom, experimental, new media art, net and computational culture, and things like that.