Lightning talk about Cascade in a few hours (15 UTC): https://hybrid-livecode.pubpub.org/workshop2021
@ojack Thank you very much! Let me know your thoughts if you play with it.
A few years ago I looked at examples and the source code of IanniX.
https://www.iannix.org/en/whatisiannix/
Before that I had seen a visualization with sound at the ZKM in Karlsruhe that fascinated me: the way the geometry triggers the sound patterns. I think it had been created with IanniX.
(1/2)
@kandid @raphael oooh I haven't tried IanniX, but love playing with ways that visuals trigger sound. I am very visual and it helps me understand sound better to be able to see it in different ways.
Some explorations in case they are interesting:
-- pixelsynth (additive synth based on drawing): https://ojack.xyz/PIXELSYNTH/
-- pixel colors to drum triggers: https://www.instagram.com/p/B1ccefNHO8Q/
-- pixel colors to arpeggiator sequence: https://www.instagram.com/p/B2BXywzHCxa/
-- spectral synth: https://www.instagram.com/p/B4VSiVTp5_D/
-- webcam "sequencer": https://www.youtube.com/watch?v=n6Vu7BnYdnw
In all of these the sound is generated based on pixel colors, and I manipulate the visuals live in order to change the sound.
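A minimal sketch of that pixels-to-sound idea, assuming a browser canvas and the Web Audio API; the mapping from color channels to pitch and loudness is just illustrative, not how any of the tools above actually work:

```ts
// Map the average color of a <canvas> to an oscillator, polled ~10x per second.
const canvas = document.querySelector('canvas') as HTMLCanvasElement;
const ctx = canvas.getContext('2d')!;
const audio = new AudioContext(); // browsers require a user gesture before audio starts
const osc = audio.createOscillator();
const gain = audio.createGain();
osc.connect(gain).connect(audio.destination);
osc.start();

function pixelsToSound() {
  const { data } = ctx.getImageData(0, 0, canvas.width, canvas.height);
  let r = 0, g = 0;
  for (let i = 0; i < data.length; i += 4) {
    r += data[i];     // red channel
    g += data[i + 1]; // green channel
  }
  const n = data.length / 4; // pixel count
  osc.frequency.value = 110 + (r / n) * 4;  // red drives pitch, ~110–1130 Hz
  gain.gain.value = ((g / n) / 255) * 0.5;  // green drives loudness, 0–0.5
}
setInterval(pixelsToSound, 100);
```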
I always work from the visuals; the sound usually gets short shrift in my animations. I have made a few attempts in the direction of color to sound. This one, for example:
http://digital-defect.org/en/post/color2sound-mixer/
There, color areas are sampled in real time and the detected color is sent to SuperCollider, which creates an "ambient" sound from the image material.
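A rough sketch of that color-to-SuperCollider bridge, assuming the node-osc package and sclang listening on its default port 57120; the /color2sound address and the synth that would receive it are hypothetical:

```ts
import { Client } from 'node-osc';

// sclang listens on UDP port 57120 by default.
const sc = new Client('127.0.0.1', 57120);

// Send one sampled color (0–255 per channel) as normalized floats;
// the SuperCollider side would map them to synth parameters.
function sendColor(r: number, g: number, b: number) {
  sc.send('/color2sound', r / 255, g / 255, b / 255, (err) => {
    if (err) console.error(err);
  });
}

sendColor(200, 120, 40); // e.g. a warm orange region
```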
But IanniX has a different approach: it maps geometry and movement to sound. It's closer to the idea of dancers triggering the sound.
@ojack @kandid Amazing demos! I am gathering stuff here: https://gitlab.com/raphaelbastide/cascade/-/issues/54
@raphael this is incredible!
trying to fight my urge to start playing with it immediately vs. watch the rest of the talks.