
when your gestural livecoding language slowly actually starts to somewhat work...


I'll perform with it at live-interfaces.github.io/live on Wednesday in Trondheim! (so still heaps of time to improve further)

@kf Yes, using Dynamic Time Warping (with null rejection) from Nick Gillian's Gesture Recognition Toolkit. Using two 9-DOF sensors on the wrists as input, plus some manual filtering to confirm the detections.
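(For anyone curious how "DTW with null rejection" works in principle: a minimal sketch, not GRT itself, assuming each gesture is recorded as a sequence of sensor frames and each class has one template and a hand-set rejection threshold — GRT learns these thresholds from training data instead.)

```python
# Minimal sketch of DTW classification with null rejection.
# "templates" and "thresholds" are hypothetical stand-ins for what
# GRT derives from training data; frames are feature vectors
# (e.g. 9-DOF sensor readings, simplified here).
import math

def dtw_distance(a, b):
    """Classic dynamic-programming DTW distance between two sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])  # Euclidean frame distance
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

def classify(sample, templates, thresholds):
    """Return the best-matching gesture label, or None (null rejection)
    when the sample is farther than that class's rejection threshold."""
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        d = dtw_distance(sample, template)
        if d < best_dist:
            best_label, best_dist = label, d
    if best_label is not None and best_dist <= thresholds[best_label]:
        return best_label
    return None  # rejected: movement matched no known gesture
```

Null rejection is what keeps ordinary arm movement from constantly firing commands: a sample only counts as a gesture when its DTW distance to the winning template is below that class's threshold.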

@nescivi so basically you’re mapping each detected gesture directly to a live coding command? If so, how many are you able to work with at a time?

@kf How many gestures in the databank? This rehearsal I got up to 14 different classes of gestures.

@kf basically it's now a contest between how many gestures the computer can keep apart and how many I can remember.

post.lurk.org