Whenever I work on adding features to my MIDI-controller-to-OSC PD patches, I ask myself why I keep coming back to this puzzle game.
I love and hate PD (a little bit). It's so good for building stuff on the fly and extremely useful live. But would some sort of interpreted-language alternative be a better long-term solution?
@supermedia_art i dream of a kind of hybrid visual/text interface for this. i have a beautiful interaction model crystalised in my mind but not enough skill to build it.
maybe if i make a high fidelity enough animatic someone else will build it.
@supermedia_art it’s a bit hard to explain with words since it’s so much about the flow of keyboard operation, a couple of gimmicks, and graphviz inspired syntax.
say you were to launch a blank canvas and type
a -> b -> c
a -> d
a -> e
b -> g
it would automatically generate the patches with those names and connect them as you would expect.
wait there aren’t patches with those names!
then a sort of intellisense/autocomplete widget would help out with that.
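as a rough illustration of how little machinery that edge syntax needs, here's a minimal sketch of parsing those graphviz-inspired lines into connection pairs. this is purely hypothetical (not part of PD or any real tool), and the function name is mine:

```python
def parse_edges(text):
    """Parse graphviz-style 'a -> b -> c' lines into a list of
    (source, target) connection pairs."""
    edges = []
    for line in text.splitlines():
        # split each line on '->' and connect consecutive names
        names = [n.strip() for n in line.split("->") if n.strip()]
        for src, dst in zip(names, names[1:]):
            edges.append((src, dst))
    return edges

spec = """\
a -> b -> c
a -> d
a -> e
b -> g
"""
print(parse_edges(spec))
# [('a', 'b'), ('b', 'c'), ('a', 'd'), ('a', 'e'), ('b', 'g')]
```

each pair would then become a patch connection on the canvas, with unknown names routed through the autocomplete widget first.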
once something is on the screen, you'd get a dual cursor. selecting a visual patch would create a text selection of all the relevant parts of the text; moving the cursor in the text moves the visual selection and the view to the relevant place in the visual interface.
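the two-way selection could rest on something as simple as an index from node names to their character ranges in the text. a hypothetical sketch (function names and the word-based tokenizing are my assumptions, not anything that exists):

```python
import re

def index_node_spans(text):
    """Map each node name to every (start, end) character range where it
    occurs, so selecting a node in the visual view can highlight all of
    its textual occurrences, and vice versa."""
    spans = {}
    for m in re.finditer(r"\w+", text):
        spans.setdefault(m.group(), []).append((m.start(), m.end()))
    return spans

def node_at_cursor(text, cursor):
    """Return the node name whose text span contains the cursor, if any,
    so moving the text cursor can drive the visual selection."""
    for name, ranges in index_node_spans(text).items():
        if any(start <= cursor < end for start, end in ranges):
            return name
    return None

spec = "a -> b -> c\na -> d\n"
print(node_at_cursor(spec, 5))  # cursor sitting on 'b' -> 'b'
```

the reverse direction falls out of the same index: given a selected visual patch, look up its name and select every range in the list.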
@supermedia_art the text itself wouldn't be "plain" text exactly, but richly interactive in certain subtle ways.
holding down a meta key switches to a kind of command mode where the position of the cursor becomes less fixed and more fluid. text-like commands like search and replace can be issued from command mode; the command is executed as you let go of the meta key.