Grasshopper — algorithmic modeling for Rhino
We have finally cleaned up, documented, and commented our code enough to release our depth-as-touch interaction for the Microsoft Kinect into the public domain as free, open-source, copyleft, GPL-licensed code. Since our demo interaction was built around our favorite software (Grasshopper), we thought it best to announce it here. The source is written in C/C++ and linked against Microsoft's Kinect for PC SDK. We've provided the Visual Studio solution files and our code, but you will have to link against Microsoft's SDK yourself if you want to make changes. We have, however, released builds of the SurfaceCalibration, ExtentsCalibration, and MissionMode executables...so if you have a Kinect, a projector, Grasshopper, and a finger (it helps to have between 2 and 10!), you can test it out right away.
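For readers curious what "depth-as-touch" boils down to: the usual approach is to capture a background depth map of the projection surface during calibration, then flag live pixels that sit in a thin band just above that surface. This is only a minimal sketch of that idea, not the released LMN code; the function name, the millimeter thresholds, and the 0-means-no-reading convention are all assumptions for illustration.

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Hypothetical sketch: given a background depth map captured during surface
// calibration and a live depth frame (both in millimeters, 0 = no reading),
// mark pixels whose depth lies within a thin "touch band" above the surface.
std::vector<bool> detectTouch(const std::vector<uint16_t>& background,
                              const std::vector<uint16_t>& frame,
                              uint16_t minOffsetMm,   // e.g. 3 mm above surface
                              uint16_t maxOffsetMm)   // e.g. 15 mm above surface
{
    std::vector<bool> touched(frame.size(), false);
    for (std::size_t i = 0; i < frame.size(); ++i) {
        const uint16_t bg = background[i];
        const uint16_t d  = frame[i];
        if (bg == 0 || d == 0) continue;   // skip invalid readings
        if (d >= bg) continue;             // at or behind the surface
        const uint16_t offset = static_cast<uint16_t>(bg - d); // height above surface
        if (offset >= minOffsetMm && offset <= maxOffsetMm)
            touched[i] = true;             // inside the touch band: a finger press
    }
    return touched;
}
```

In a real pipeline the touched pixels would then be clustered into blobs (the "particle analysis" step), tracked across frames, and emitted as touch events; the compiled-code speed matters because all of this runs per frame on a full depth buffer.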
The source and builds can be downloaded here...
http://lmnts.lmnarchitects.com/interaction/grasshopper-canvas-with-...
The code walkthrough is here...
http://lmnts.lmnarchitects.com/interaction/grasshopper-canvas-with-...
Happy Holidays and New Year!
Thanks Andy!
It would be great to see RGBD sensors get cheaper and more modular...that's likely in the nearish term, right? We can imagine plugging a little RGBD sensor into an Arduino, grabbing the depth buffer, doing the CV and particle analysis, and then passing events to the OS or application. It might even fit nicely into Firefly that way(?). The more accessible, the better. At the moment, we'd be worried about speed...that's the main reason we implemented this as compiled code.
Very nice work. Congrats.