when your gestural livecoding language slowly actually starts to somewhat work...
I'll perform with it at https://live-interfaces.github.io/liveinterfaces2020/ on Wednesday in Trondheim! (so still heaps of time to improve further)
@kf Yes, using Dynamic Time Warping (with null rejection) from Nick Gillian's Gesture Recognition Toolkit. Input comes from two 9 d.o.f. sensors on the wrists; there's still some manual filtering to confirm each detection.
@nescivi so basically you’re mapping a detected gesture directly to a live coding command? If so, how many are you able to work with at a time?
@kf basically it's now a contest between how many gestures the computer can keep apart and how many I can remember.
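The approach described in the thread — matching an incoming sensor sequence against trained gesture templates with DTW, and rejecting anything that isn't close enough to any template — can be sketched in a few lines. This is a simplified pure-Python stand-in, not the GRT implementation itself; the function names, the 1-D sequences, and the fixed rejection threshold are all illustrative (GRT works on multi-dimensional time series and learns its null-rejection threshold from the training data):

```python
def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two 1-D sequences."""
    n, m = len(a), len(b)
    inf = float("inf")
    # cost[i][j] = minimal accumulated distance aligning a[:i] with b[:j]
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            # step from a match, an insertion, or a deletion
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]


def classify(sample, templates, reject_threshold):
    """Nearest-template classification with null rejection:
    return the label of the closest template, or None when even the
    best match is farther than reject_threshold (the 'null' class)."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        d = dtw_distance(sample, template)
        if d < best_dist:
            best_label, best_dist = label, d
    if best_dist > reject_threshold:
        return None  # null rejection: no gesture recognized
    return best_label


# Hypothetical one-axis "gesture" templates for illustration.
templates = {
    "wave": [0, 1, 0, 1, 0],
    "push": [0, 2, 4, 2, 0],
}
print(classify([0, 1, 0, 1, 0], templates, reject_threshold=1.0))  # → wave
print(classify([9, 9, 9, 9, 9], templates, reject_threshold=1.0))  # → None
```

The null-rejection step is what makes this usable on stage: without it, every stray arm movement would be forced into the nearest gesture class and trigger a command.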