June

double densed uncounthest hour of allbleakest age with a bad of wind and a barrel of rain

double densed uncounthest hour of allbleakest age with a bad of wind and a barrel of rain is an in-progress piece for resonators and brass. I’m keeping a composition log here as I work on it.

There are sure to be many detours. Getting it in shape might involve:

Wednesday June 12th

I took astrid out for a test run with the a name for tomorrow fellers this weekend while I was in Milwaukee. It all actually kinda worked out! I have lots of small (and some large) tweaks I’d like to make.

In the engine, I’d like to expose the shared memory sampler to python instruments – probably via the ctx that gets passed to every callback. Then, I need to make sure play command params are still being passed in properly (I think I broke it a while back, but I should be able to reuse the parser for update messages) so I can use them to trigger the sampler… but I’ll need to play around with it in the instrument script a bit to know what feels right.
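To make that concrete, here's a hypothetical sketch of what reusing the update-message parser for play params might look like – parse_params and Ctx are made-up names for illustration, not astrid's actual API:

```python
# Hypothetical sketch: one key=value parser shared by play and update
# messages, so play params can reach the instrument callback via its ctx.
# parse_params and Ctx are illustrative stand-ins, not astrid's API.

def parse_params(msg: str) -> dict:
    """Parse 'name=value' tokens, coercing numeric values to float."""
    params = {}
    for token in msg.split():
        if "=" not in token:
            continue
        key, value = token.split("=", 1)
        try:
            params[key] = float(value)
        except ValueError:
            params[key] = value
    return params

class Ctx:
    """Stand-in for the ctx passed to every instrument callback."""
    def __init__(self, params, sampler=None):
        self.p = params
        self.sampler = sampler  # a shared memory sampler handle could live here

ctx = Ctx(parse_params("speed=1.5 bank=brass gate=1"))
assert ctx.p["speed"] == 1.5
assert ctx.p["bank"] == "brass"
```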

I was also craving some way to store and recall snapshots of parameter states and that turned out to be pretty straightforward to implement. Params are just stored with integer keys that correspond to an enum of all the params the instrument knows about, so storing the snapshot just loops over every param from 0 to N and (if the param exists in the LMDB session) writes the bytes of its value into a shared memory blob. Recall loops over the blob and writes the values back into the LMDB session.
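The store/recall loop described above might look something like this, with a plain dict standing in for the LMDB session and a bytearray standing in for the shared memory blob (sizes and names made up for illustration):

```python
# Sketch of the snapshot scheme: param keys are integers 0..N from the
# instrument's param enum; each value's bytes are written into a fixed
# slot in a blob. A dict stands in for the LMDB session here, and a
# bytearray for the shared memory blob. All params are floats for brevity.
import struct

NPARAMS = 4
SLOT = 4  # bytes per float param

def store_snapshot(session: dict) -> bytearray:
    blob = bytearray(NPARAMS * SLOT)
    for key in range(NPARAMS):
        if key in session:  # only params present in the session
            struct.pack_into("<f", blob, key * SLOT, session[key])
    return blob

def recall_snapshot(blob: bytearray, session: dict) -> None:
    for key in range(NPARAMS):
        session[key] = struct.unpack_from("<f", blob, key * SLOT)[0]

session = {0: 0.25, 1: 440.0, 2: 0.9}
snap = store_snapshot(session)
session[1] = 220.0          # wander somewhere totally far away...
recall_snapshot(snap, session)
assert session[1] == 440.0  # ...and recall brings it right back
```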

Being able to store & recall the param state of the instrument(s) is pretty exciting. J and I were talking about the freedom that would come from being able to dial in to a nice place, snapshot it, and feel no anxiety about taking it somewhere totally far away since the previous (or any) state is always just a recall command away.

I don’t think it’s worth trying to finish before the session in Texas, but it would be nice also to eventually implement a sampler / recording feature for param changes over time – and internal commands, too. Being able to store and replay some gesture coming in from the external controller, or a sequence of commands on the console could be very useful.

I also fixed the last memory leaks! It feels great to watch memory get reclaimed while I play. I was a bit worried it would turn into a giant project, but the problem ended up being exactly what I suspected: I just wasn’t munmapping some mmapped shared memory when sending buffers off to the mixer, so the calls to shm_unlink weren’t doing anything, since the kernel thought the segments were still actively in use.
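The same pattern shows up in Python's multiprocessing.shared_memory, where close() wraps the munmap and unlink() wraps shm_unlink – a small demo of why skipping the unmap step leaks:

```python
# Unlinking only removes the name; the kernel keeps the segment alive
# until every mapping is gone. close() (the munmap) is the step that
# was missing, so unlink() alone couldn't reclaim anything.
from multiprocessing import shared_memory

shm = shared_memory.SharedMemory(create=True, size=4096)
shm.buf[:5] = b"hello"                 # hand the buffer off to a consumer...
assert bytes(shm.buf[:5]) == b"hello"

shm.close()    # munmap: drop this process's mapping (the missing step)
shm.unlink()   # shm_unlink: now the kernel can actually reclaim it
```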

I’ve got another week here in Madison (I’m cat-sitting) to practice and tune the instrument scripts, then just under a week at home again to make any modifications to the hardware side of things before heading off to Texas…

I’m hoping Andrew has an acoustic guitar I can use as a resonator – that worked out well last time. I also kind of like the idea of not really fixing on one resonator, but trying out whatever’s around. Might grab some backup transducers and even see if I can fit a second amp in my bag when I’m home again…

Wednesday June 5th

I’m starting to count the days… I was hoping to be done with the plumbing-type instrument building by this weekend, and spend the next couple weeks before I go to Texas just practicing and developing the instrument script. I’m not too far off, but there are still some wildcards:

That said, thankfully I got MIDI control working again today, already after work! This morning I was struggling to get the python rtmidi callback to behave inside of instrument scripts. It seemed like the simplest path to just adapt one of the many python MIDI handlers I’d already written while I think about future adaptations, but callback messages were getting backed up somewhere in python, likely due to a threading problem. Python concurrency still confuses me. Maybe eventually I’ll spend enough time with cpython internals and the standard library source to understand the magic, but in the meantime I decided to try the ALSA API for the first time and add MIDI support in C, and it turned out to be super easy! No mysteries: I just added a new MIDI listener thread to astrid instruments and passed a pointer to the instrument struct into it. No crazy scoping issues or mysterious silences or throttled logging etc etc – it more or less just worked on the first try.
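The listener-thread shape can be sketched with the stdlib – a queue and a dict standing in for the ALSA read loop and the instrument struct, not astrid's actual code:

```python
# The MIDI thread only enqueues raw messages; the instrument loop drains
# the queue on its own schedule, so nothing backs up in a callback.
# Message values here are made up for illustration.
import queue
import threading

inbox: "queue.Queue[tuple[int, int, int]]" = queue.Queue()

def midi_listener(messages):
    # stands in for the blocking ALSA read loop in the C listener thread
    for status, cc, value in messages:
        inbox.put((status, cc, value))

state = {}  # stands in for params on the instrument struct
t = threading.Thread(target=midi_listener,
                     args=([(0xB0, 21, 64), (0xB0, 22, 127)],))
t.start()
t.join()

while not inbox.empty():
    status, cc, value = inbox.get()
    state[cc] = value / 127  # map the CC to a normalized param

assert state[22] == 1.0
```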

This also means I’m switching back to my bigger MIDI controller (the faderfox MX12 which I love – so many controls!) and that means I get to map waaaaay more params of the littlefield instruments to direct control. :-)

Tuesday June 4th

Almost something!

More of the pitch controls are wired up now, but I’m still finding my way into interfacing with them. In this recording the parameters of the littlefield instrument are being sequenced by littleseq, and I’m just toggling littleseq on and off and issuing a console command here & there.

One such command is mtrak (short for microphone pitch tracking, but chosen because it’s one letter off from Amtrak and I’m a dork), which toggles on a pitch tracker that follows mic input and maps the (slewed) frequency to half of the osc bank. I added a barebones port of librosa’s yin implementation to libpippi for just such an occasion a couple years ago, so it’s fun to finally be using the thing with a realtime instrument!
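The slewing is essentially a one-pole smoother on the tracked frequency, so the osc bank glides toward the mic pitch instead of jumping – a toy version, with the coefficient made up for illustration:

```python
# One-pole slew: each step moves a fraction of the remaining distance
# toward the target frequency reported by the pitch tracker.
def slew(prev: float, target: float, coeff: float = 0.1) -> float:
    return prev + coeff * (target - prev)

freq = 220.0
for tracked in [440.0] * 50:  # yin reports 440 Hz for a stretch
    freq = slew(freq, tracked)

assert abs(freq - 440.0) < 2.0  # glided most of the way there
```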

Other observations:

Speaking of rabbits, the baby bunnies around here are already looking like teenagers. One of them hopped right up to me while I was working on astrid in the park this morning! Cute, lanky little survivors.

Sunday June 2nd

Oops, it’s June already!

A couple days ago I said:

[Sending params as strings] simplifies the daisy firmware concerns a bit, too. (Even tho it’s more annoying to work with strings than just memcpy some bytes into a field, that’s OK.)

Which made me feel sheepish today since I could not figure out what was going wrong with the daisy firmware when I adjusted it to send strings with printf encoded floats instead of writing the bytes of the float into a buffer… I’m not the only one who lost half a day to this, it seems! :-)

Anyway, after flailing around I started to wonder if printf had some special behavior for floats when running on an stm32. Floats aren’t always super well supported on microcontrollers… but in this case the different behavior in printf was just about keeping firmware blob sizes down, so it makes sense that the default configuration strips float support out – not a super common need, I’d imagine. I ended up finding that post linked above, which shared that updating the linker flags with LDFLAGS += -u _printf_float re-enables printf float support!
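For reference, the two param encodings in question, sketched in Python as a stand-in for the firmware side:

```python
# Raw float bytes (what memcpy-ing the float into a buffer produces)
# vs. printf-style string formatting (which is what needs float support
# in the firmware's libc). Values here are made up for illustration.
import struct

value = 440.5

raw = struct.pack("<f", value)   # 4 bytes, like memcpy of a float
text = b"%0.4f" % value          # printf-encoded, human readable

assert len(raw) == 4
assert text == b"440.5000"
assert struct.unpack("<f", raw)[0] == 440.5
```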

It’s pretty exciting to have a few controls mapped out, running alongside the littleseq python instrument which is also sequencing the parameters of the littlefield C instrument. (Not the most original names, they’re named after the town in Texas where I plan to use them in an ensemble context for the first time.)

It’s fun to have a workable – how long has it been this time? – combination of command inputs, live coding, microphones and knobs to twiddle going again. Interacting with littleseq feels good, but I also need to figure out how to make good use of the realtime controls I have available via the daisy petal I’m using for that purpose. It has:

And of course audio inputs and outputs I don’t plan to use for this… though maybe some audio-reactive controls like piezo triggers would be cool to try to sort out if there’s time?

I’m coming around to the idea of keeping all the realtime controls to the microphone/exciter feedback pairs and the various controls available on the daisy petal. I want to map every parameter to physical controls! There are so many parameters though… (LMDB is also still showing no signs at all of causing problems handling them in the audio thread!) While I don’t really love live-coding in performance, I don’t really mind live-tweaking… so I think if I build up littleseq so that I can easily enable and disable features and groups of things, and tweak the algorithms for controlling them now and then, that opens up modulating a lot more aspects of the sound in different configurations.

Control mapping and parametrization is always tough.

