We have been working on a method of interacting with the Grasshopper canvas using intuitive, multi-touch gestures registered by the RGBD camera on the Microsoft Kinect. The Kinect's depth resolution is sufficient to detect touch events on an arbitrary surface, so we've recreated a giant canvas using a projector, a mirror, and a pane of tinted glass. So far, we've implemented the basic canvas navigation features by sending mouse and keyboard events to Grasshopper...these are just the first in a host of potential gestures...
Read more at:
http://lmnts.lmnarchitects.com/interaction/grasshopper-canvas-meet-...
Ernest-
Thanks for asking! It's not finished.
We fully intend to release the source Kinect touch project as soon as it's ready (perhaps a matter of weeks now). It's not implemented in Grasshopper, but rather as a stand-alone Windows 7 application that will allow you to track touch events on surfaces with the Kinect. It's written in C and C++ and depends on the Microsoft Kinect SDK. Our implementation simply translates gestures into Windows mouse and keyboard events. In our demo, we tailored this to Grasshopper. We have put considerable work into the calibration part, and we want it well documented with good examples before we release it. Grasshopper is only one possible application...but perhaps the coolest we could think of. (Everyone out there knows that multi-touch is great for resizing photos and navigating maps, but collaborative definition review is a very promising application - in our humble opinion.) That said, the application is architected in such a way as to be hackable for any end use. We look forward to releasing it so that the broader community can help improve it.
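Injecting gestures as ordinary mouse events is what makes the approach application-agnostic. A minimal sketch of that step, assuming the Win32 `SendInput` route (the helper names and the camera-to-screen mapping are ours, not from the project):

```cpp
#include <cassert>
#include <cstdint>
#ifdef _WIN32
#include <windows.h>
#endif

struct NormPoint { int x; int y; };

// Map a touch point from camera pixel coordinates to the 0..65535
// normalized range that Win32 absolute mouse events use: with the
// MOUSEEVENTF_ABSOLUTE flag, (0,0)..(65535,65535) spans the primary screen.
NormPoint toAbsoluteMouseCoords(int px, int py, int camW, int camH) {
    return { px * 65535 / (camW - 1), py * 65535 / (camH - 1) };
}

#ifdef _WIN32
// Inject the touch as an absolute mouse move; a real gesture engine would
// also emit button-down/up events for taps and drags, and keyboard events
// (e.g. modifier keys) for multi-finger gestures.
void sendTouchAsMouseMove(NormPoint p) {
    INPUT in = {};
    in.type = INPUT_MOUSE;
    in.mi.dx = p.x;
    in.mi.dy = p.y;
    in.mi.dwFlags = MOUSEEVENTF_MOVE | MOUSEEVENTF_ABSOLUTE;
    SendInput(1, &in, sizeof(INPUT));
}
#endif
```

Because the target application only ever sees standard input events, nothing in this layer is Grasshopper-specific, which is presumably why swapping in another program is mostly a matter of calibration and gesture profiles.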
Hi there
I'm new to this kind of component programming for Grasshopper, and to Grasshopper itself. I need to develop a component that repeatedly requests data from a tracking device over time. I was wondering whether you would share that Kinect-linking know-how with the community, so that we can do something like this.
Regards
Ernest
Taz-
Perceptive Pixel's system is awesome. We saw one at Siggraph a few years ago.
82" would be awesome, but the 27"-er would probably do:
http://www.perceptivepixel.com/27-lcd-multi-touch-display/
If anyone can get Perceptive Pixel to give a price quote, that would be amazing. Otherwise, we'll probably have to make do with our $120 Kinect.
Thanks for the kind words Tuan and Nathan!
Matt: we do have a second projector hooked up, projecting on the nearby wall (we didn't have it turned on during the video). Yeah, it's definitely intended as a teaching tool, and is thought of as being most useful for meetings, coordination, and collaboration.
We are working on code that handles touch calibration and can load application profiles...so no, there's nothing preventing us from using it with Ableton, etc., except that we tend to focus on (mostly) architecture-related projects...mostly :). Obviously, large-scale multi-touch is nothing new - Perceptive Pixel's displays are awesome - but the Kinect lets you do this on a limited budget...plus you can spill coffee on a piece of glass without doing much damage.
Amazing work! With another projector to show the code, this could make a great teaching tool.
Would the code linking the Kinect to GH be simple enough to adapt to other software such as Ableton Live, so it could be used as a DJ setup, replacing the GH canvas with the Ableton control panels?