@raphael this is incredible!
trying to fight my urge to start playing with it immediately vs. watch the rest of the talks.
A few years ago I looked at examples and the source code of IanniX.
Before that I had seen a visualization with sound at the ZKM in Karlsruhe that fascinated me. What fascinated me was the way the geometry triggers the sound patterns. I think it had been created with IanniX.
Some explorations in case they are interesting:
-- pixelsynth (additive synth based on drawing): https://ojack.xyz/PIXELSYNTH/
-- pixel colors to drum triggers: https://www.instagram.com/p/B1ccefNHO8Q/
-- pixel colors to arpeggiator sequence: https://www.instagram.com/p/B2BXywzHCxa/
-- spectral synth: https://www.instagram.com/p/B4VSiVTp5_D/
-- webcam "sequencer": https://www.youtube.com/watch?v=n6Vu7BnYdnw
In all of these the sound is generated based on pixel colors, and I manipulate the visuals live in order to change the sound.
I always work from the visuals; the sound usually gets short shrift in my animations. I have made a few attempts in the direction of color to sound. This one, for example:
Color areas are sampled in real time and each detected color is sent to SuperCollider, which creates an "ambient" sound from the image material.
But IanniX has a different approach. It maps geometry and movement to sound, which is closer to the idea of dancers triggering the sound.
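As a minimal sketch of the color-to-sound idea described above (my own toy version, not the actual patch, and assuming RGB pixel values; the OSC message to SuperCollider would be sent separately), one could map a sampled color's hue to a pitch and its brightness to an amplitude:

```python
import colorsys

def color_to_freq(r, g, b, base_midi=48, note_range=24):
    """Map a sampled RGB pixel (0-255 per channel) to a frequency.
    Hue picks the pitch within note_range semitones above base_midi;
    brightness (HSV value) is returned as an amplitude in 0..1."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    midi = base_midi + round(h * note_range)
    freq = 440.0 * 2 ** ((midi - 69) / 12)  # MIDI note to Hz
    return freq, v

# A pure red pixel has hue 0, so it lands on the base note (MIDI 48, ~130.8 Hz)
freq, amp = color_to_freq(255, 0, 0)
```

The resulting (freq, amp) pair is what one would then forward to SuperCollider, e.g. as an OSC message, to drive a synth voice per sampled color area.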
@ojack @kandid Amazing demos, I am gathering stuff here https://gitlab.com/raphaelbastide/cascade/-/issues/54