New audio piece: "Harmonic Protocol".
A 4-second stereo feedback loop, with the left and right channels blended by a rotation matrix.
Inside the loop, compute RMS energy per semitone per channel via a bank of biquad bandpass filters (Q = 17.310) covering MIDI notes 24 through 96, accumulated modulo 12.
Inside the loop, scale each individual semitone by the accumulated (modulo-12) energy of the pitch class 7 semitones away (pick a direction).
Inside the loop, apply strong dynamic range compression to normalize level to ~1.
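The analysis step above could be sketched like this — a minimal pure-Python sketch, assuming a 48 kHz sample rate and the constant-0 dB-peak-gain bandpass formulas from the Audio EQ Cookbook (the actual filter design in the piece may differ):

```python
import math

SR = 48000   # assumed sample rate
Q = 17.310   # filter quality factor from the post

def bandpass(midi_note, sr=SR, q=Q):
    """Normalized biquad bandpass coefficients (Audio EQ Cookbook,
    constant 0 dB peak gain) centred on one MIDI note."""
    f0 = 440.0 * 2.0 ** ((midi_note - 69) / 12.0)
    w0 = 2.0 * math.pi * f0 / sr
    alpha = math.sin(w0) / (2.0 * q)
    a0 = 1.0 + alpha
    return (alpha / a0, 0.0, -alpha / a0,
            -2.0 * math.cos(w0) / a0, (1.0 - alpha) / a0)

def filter_rms(x, coeffs):
    """Run one biquad over a block of samples and return the RMS."""
    b0, b1, b2, a1, a2 = coeffs
    x1 = x2 = y1 = y2 = 0.0
    acc = 0.0
    for s in x:
        y = b0 * s + b1 * x1 + b2 * x2 - a1 * y1 - a2 * y2
        x1, x2, y1, y2 = s, x1, y, y1
        acc += y * y
    return math.sqrt(acc / len(x))

def semitone_energy(x, lo=24, hi=96):
    """RMS energy per pitch class, accumulated modulo 12, one channel."""
    pc = [0.0] * 12
    for note in range(lo, hi + 1):
        pc[note % 12] += filter_rms(x, bandpass(note))
    return pc

# sanity check: a 440 Hz tone should concentrate energy in class 9 (A)
tone = [math.sin(2.0 * math.pi * 440.0 * i / SR) for i in range(SR // 10)]
energy = semitone_energy(tone)
```

The 12 per-pitch-class energies would then drive the scaling step, each semitone scaled by the accumulated energy 7 semitones away.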
link to stroboscopic youtube video
However, trying the same trick with an #ogg #vorbis input fails miserably: the frame rate drops from 40+ to 3 after a few minutes of audio-file time :( I conjecture (based on #top and #strace logs) that every small backwards seek makes it seek to the start of the file and re-decode everything up to that point. Unsustainable.
Workaround: #ffmpeg to decompress beforehand.
link to stroboscopic youtube video
Another Fridge (Chladni plate)
One channel of the stereo audio is analysed with a 4096-point FFT per video frame. The top 5 sinusoidal peaks are then extracted, using interpolation for sub-bin accuracy. These are fed into an algorithm I found online, adapted to use a complex phasor for each wavelength instead of just a real sinusoid. I don't know whether it accurately represents the physics, but it looks interesting enough.
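The peak-extraction step might look something like this — a sketch assuming a Hann window and log-parabolic interpolation between bins (the piece may use a different window or interpolation scheme):

```python
import cmath, math

def fft(x):
    """Recursive radix-2 FFT (length must be a power of two)."""
    n = len(x)
    if n == 1:
        return [complex(x[0])]
    even, odd = fft(x[0::2]), fft(x[1::2])
    out = [0j] * n
    for k in range(n // 2):
        t = cmath.exp(-2j * math.pi * k / n) * odd[k]
        out[k] = even[k] + t
        out[k + n // 2] = even[k] - t
    return out

def top_peaks(x, n_peaks=5, sr=48000):
    """Strongest sinusoidal peaks in one frame, with parabolic
    interpolation on log magnitudes for sub-bin frequency accuracy."""
    n = len(x)
    # Hann window reduces leakage and suits parabolic interpolation
    w = [0.5 - 0.5 * math.cos(2 * math.pi * i / n) for i in range(n)]
    spec = fft([s * c for s, c in zip(x, w)])
    mag = [abs(c) for c in spec[: n // 2]]
    logm = [math.log(m + 1e-12) for m in mag]
    peaks = []
    for k in range(1, n // 2 - 1):
        if mag[k] > mag[k - 1] and mag[k] >= mag[k + 1]:  # local maximum
            a, b, c = logm[k - 1], logm[k], logm[k + 1]
            delta = 0.5 * (a - c) / (a - 2 * b + c)  # sub-bin offset
            peaks.append((mag[k], (k + delta) * sr / n))
    peaks.sort(reverse=True)  # strongest first
    return [f for _, f in peaks[:n_peaks]]

# two test partials; the 1000.5 Hz one falls between FFT bins
sr, n = 48000, 4096
sig = [math.sin(2 * math.pi * 440.0 * i / sr)
       + 0.5 * math.sin(2 * math.pi * 1000.5 * i / sr) for i in range(n)]
freqs = top_peaks(sig, n_peaks=2, sr=sr)
```

With a 4096-point frame at 48 kHz the raw bin width is about 11.7 Hz, so the interpolation matters for getting frequencies like 1000.5 Hz right.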
Implemented in #GNU #Octave using its #sparse #matrix eigensystem solver. I used a 5x5 kernel for the operator, obtained by convolving the 3x3 Laplacian kernel with itself; not 100% sure this is the correct way to go about it, but the results look reasonable-ish.
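The 5x5 kernel construction is easy to check: convolving the 3x3 five-point Laplacian with itself gives the standard discrete biharmonic stencil, which is what a plate equation wants. A sketch in Python rather than Octave:

```python
def conv2_full(a, b):
    """Full 2-D convolution of two small kernels (lists of lists)."""
    ra, ca, rb, cb = len(a), len(a[0]), len(b), len(b[0])
    out = [[0.0] * (ca + cb - 1) for _ in range(ra + rb - 1)]
    for i in range(ra):
        for j in range(ca):
            for k in range(rb):
                for l in range(cb):
                    out[i + k][j + l] += a[i][j] * b[k][l]
    return out

# 3x3 five-point Laplacian; convolved with itself it yields the
# 5x5 discrete biharmonic stencil (centre value 20)
lap3 = [[0.0,  1.0, 0.0],
        [1.0, -4.0, 1.0],
        [0.0,  1.0, 0.0]]
lap5 = conv2_full(lap3, lap3)
```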
A quick survey for people who came to algorave elephant https://forms.gle/8svdffdNgnm2FLZA9
> Algorave+friends return to Corsica Studios for a two room #solstice #party featuring: Lil Data (PC Music) // Heavy lifting (Pickled Discs) x Graham Dunning (Fractal Meat) // Miri Kat (Establishment) // Deerful // Hard On Yarn Sourdonk Communion (Hmurd x peb) // Class Compliant Audio Interfaces x Hellocatfood (Computer Club/Keysound) // Digital Selves // Mathr // xname // BITPRINT // Deep Vain // Hortense // Tsun Winston Yeung // +777000 // Coral Manton // Rumblesan + more TBA
the bot works by iteratively zooming in: given a view, it computes a bunch of randomly zoomed-in pictures and replaces the view with the candidate that scored best.
currently it computes 16 zooms at each iteration and does 10 iterations. ideally the score increases gradually, but that doesn't always happen.
the final iteration's highest-scoring zoom is rendered bigger.
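the zoom loop can be sketched like so (the view representation, the per-step zoom factor, and the toy score function are my assumptions, standing in for the fractal rendering and fitness):

```python
import random

def zoom_search(view, score, n_candidates=16, n_iters=10, seed=1):
    """Given a view (cx, cy, size), repeatedly sample random zoomed-in
    sub-views and keep the one the score function likes best."""
    rng = random.Random(seed)
    best = view
    scores = []
    for _ in range(n_iters):
        cx, cy, size = best
        new_size = size * 0.5  # zoom factor per step (an assumption)
        cands = [(cx + (rng.random() - 0.5) * (size - new_size),
                  cy + (rng.random() - 0.5) * (size - new_size),
                  new_size) for _ in range(n_candidates)]
        best = max(cands, key=score)
        scores.append(score(best))
    return best, scores

# toy score: prefer views centred near (0.3, 0.4) — in the bot this
# would be the box-dimension fitness described below
toy = lambda v: -((v[0] - 0.3) ** 2 + (v[1] - 0.4) ** 2)
final, scores = zoom_search((0.0, 0.0, 4.0), toy)
```

greedy selection like this explains why the score doesn't always increase: each iteration can only pick among its own random candidates, all strictly inside the previous best view.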
the main problem is devising a fitness function; I came up with this:
1. for each pixel in the fractal, compute the local box dimension of its neighbourhood: use the gray value of each pixel as its measure, take square neighbourhoods of radius 1, 3, 7, 15, ..., and use simple linear regression to get the slope of the log(measure) vs log(neighbourhood size) graph.
2. compute a histogram of all these dimensions (I simply sorted the array), then take as the fitness metric the difference between the values 25% and 75% of the way through the sorted array: this is typically the width of the central bulge in the histogram.
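the two steps can be sketched in pure Python (the tiny measure offset and the border clipping are my choices; a constant test image comes out with interior dimension exactly 2, since the measure grows like the area):

```python
import math

def local_dimension(img, x, y, radii=(1, 3, 7, 15)):
    """Step 1: slope of log(measure) vs log(neighbourhood size) at one
    pixel, using gray values as the measure, via linear regression."""
    h, w = len(img), len(img[0])
    ls, lm = [], []
    for r in radii:
        m = sum(img[i][j]
                for i in range(max(0, y - r), min(h, y + r + 1))
                for j in range(max(0, x - r), min(w, x + r + 1)))
        ls.append(math.log(2 * r + 1))
        lm.append(math.log(m + 1e-12))  # offset avoids log(0)
    n = len(ls)
    mx, my = sum(ls) / n, sum(lm) / n
    return (sum((a - mx) * (b - my) for a, b in zip(ls, lm))
            / sum((a - mx) ** 2 for a in ls))

def fitness(img):
    """Step 2: the difference between the values 25% and 75% of the way
    through the sorted dimension array (width of the central bulge)."""
    h, w = len(img), len(img[0])
    dims = sorted(local_dimension(img, x, y)
                  for y in range(h) for x in range(w))
    return dims[3 * len(dims) // 4] - dims[len(dims) // 4]

# constant image: interior neighbourhood measure grows like area,
# so the regression slope there is exactly 2
flat = [[1.0] * 40 for _ in range(40)]
d_interior = local_dimension(flat, 20, 20)
f_flat = fitness([[1.0] * 24 for _ in range(24)])
```

on a constant image only the border pixels (whose clipped neighbourhoods grow slower than area) deviate from 2, so the 25%–75% width stays small — a fractal view with a rich mix of local dimensions scores higher.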
I came up with this after skimming "Multifractal-based Image Analysis with applications in Medical Imaging", a master's thesis by Ethel Nilsson http://www8.cs.umu.se/education/examina/Rapporter/EthelNilsson.pdf — viewing the dimension image in geeqie with the histogram overlaid was interesting. also inspired by https://mrob.com/pub/muency/deltahausdorffdimension.html