what are the kinects/leap motions of nowadays (stuff for body/limb tracking)? do people still play around with kinects/leap motions and are there better/more open solutions these days? it's been awhile since i've worked with any motion tracking stuff (5+ yrs)...
@mrufrufin I think Kinects are still a thing. I don't know much about open source solutions
@Pulsaare @mrufrufin It's all still very much alive; the community just fragmented when Apple bought PrimeSense, maker of the Kinect 1 hardware. The dev tools from the Kinect 1 era are still freely (and legally) distributed and work on modern OSes--afaik thanks to continued maintenance of PrimeSense's OpenNI (the open-source depth map part) and a prior agreement with Asus for NITE (the closed-source skeleton tracking part) to drive Asus' Xtion Kinect clones. The problem driving fragmentation is that no post-Apple newcomer can make new Kinect 1 clones with skeleton tracking, so the next generation of depth cameras all shipped their own separate SDKs
@Pulsaare @mrufrufin Present state of things, there's the Kinect 1 clone world (MS, Asus, and PrimeSense's first-party cameras), and a remarkable community effort in the Processing library SimpleOpenNI that can drive practically every camera from that era, on Mac/Win/Linux. If you have that older hardware, it would be my first choice to start with. There are also a lot of openFrameworks addons, although those tend to be specific to one combination of camera and os, so they require some research. There are some real gems in that category though, like an addon for the Asus Xtion and Raspberry Pi
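To give a feel for how little code SimpleOpenNI needs: a minimal Processing sketch that shows the depth map and reads one skeleton joint looks roughly like this. This is a sketch from memory of the SimpleOpenNI 1.96-era API, and it assumes a Kinect 1-era camera is plugged in:

```
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);
  context.enableDepth();   // OpenNI depth map stream
  context.enableUser();    // NITE skeleton tracking
}

void draw() {
  context.update();
  image(context.depthImage(), 0, 0);
  for (int uid : context.getUsers()) {
    if (context.isTrackingSkeleton(uid)) {
      PVector head = new PVector();
      context.getJointPositionSkeleton(uid, SimpleOpenNI.SKEL_HEAD, head);
      // head now holds the 3D head position in real-world mm
    }
  }
}

// later SimpleOpenNI versions are calibration-free:
// tracking starts as soon as a user is detected
void onNewUser(SimpleOpenNI curContext, int userId) {
  curContext.startTrackingSkeleton(userId);
}
```

The same sketch drives any of the Kinect 1-clone cameras the library supports, which is what makes it a good starting point with that hardware.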
@Pulsaare @mrufrufin then there's the sort of 1.5 generation, Orbbec and Occipital Structure cameras. They share the OpenNI depth map software with the first generation, but not the skeleton tracking. These two are portable, running on Android and iOS respectively, so there are some interesting specific applications people have made for them...but the communities are rather small, so there's not a lot of help to be had
@Pulsaare @mrufrufin Then there's the second generation--Intel's RealSense line and Microsoft's newer first-party Kinects. Both of these are well supported in Unity and Unreal, as well as Processing/oF. RealSense is the budget option; the depth maps are low quality but they work on Raspberry Pi, and the SDK is forward-compatible with Intel's newest third-gen lidar camera. (There's Apple's new lidar too, of course--but I expect that to be thoroughly locked down, so possibly interesting to run retail software on, but not as much for development.) Microsoft's Kinect 2 and Azure Kinect offerings are the current high end in terms of consumer depth camera quality, but mostly work only with Windows and Linux desktops (and, intriguingly, the Nvidia Jetson Nano). MS has always had the best skeleton tracking of the bunch imo.
@n1ckfg tyvm for the leads! skeleton tracking would prob be preferred. i never invested in hardware myself and i might be working with another person remotely who i don't think has hardware either (linux on my end and mac on their end). i worked with the original kinect on macs (2013/14 i think?) and i remember getting everything working being somewhat painful but doable, and that situation getting worse and worse over time...
@celesteh @Pulsaare @mrufrufin No skeleton tracking with that afaik. There's NITE for the Kinect 1 clones, and Microsoft's skeleton tracking for their first-party stuff. Nuitrack sells a general purpose skeleton tracking solution for almost every camera and platform, but they make you license it per device. Various mocap apps include their own tracking, but those are mostly non-realtime. A bunch of true open-source mocap solutions have cropped up recently, but these are either 2D (XY points only) or non-realtime. Apple I think now has a true 3D skeleton tracking system for their thing, but I'm not sure what they'll allow you to do with it