Off the Monitors and Onto the Walls

From the same person who claimed the Adafruit bounty comes an example of using the open-source Linux driver to take the Kinect's data back off the computer. I doubt many of us have a spare laser setup sitting around to replicate his video, but this is the first example I have seen of Kinect motion tracking that isn't just data on a monitor. Instead of using the RGB (video) feed like a normal camera, he uses the depth feed to isolate the box; this lets it work the same under any lighting conditions, including complete darkness, and keeps the projected output image from interfering with the input.
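
To make the lighting-independence concrete, here is a minimal sketch (not marcan42's actual code) of pulling a depth frame through the OpenKinect libfreenect driver's Python bindings and isolating a nearby object purely by depth. The threshold window and function names are assumptions about a typical setup:

```python
# Sketch only: isolate a hand-held box from the Kinect depth feed.
# Assumes the libfreenect Python bindings (import freenect) and NumPy.
import freenect
import numpy as np

def grab_object_mask(near=400, far=800):
    # sync_get_depth() returns (depth, timestamp); by default depth holds
    # raw 11-bit disparity values, not millimetres, so near/far are in
    # raw units (values here are guesses for an object about arm's length away).
    depth, _timestamp = freenect.sync_get_depth()
    # Keep only pixels inside the depth window. Room lighting never enters
    # into this, which is why it works even in complete darkness.
    mask = np.logical_and(depth > near, depth < far)
    return (mask * 255).astype(np.uint8)
```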

marcan42 says:
What I did was rig it to track contours on the depth image, and attempt to pick out a rectangular object.

Then, by using the detected location of the corners, I can apply it as a perspective transform to my laser projector. The end result is that the cardboard box I’m holding becomes a “virtual screen” that is tracked by the laser projection in real time and in perspective.
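
His code isn't shown, but the pipeline he describes maps cleanly onto standard OpenCV calls. Here is a rough sketch under that assumption, building on the depth mask above; corner ordering and temporal filtering are glossed over:

```python
# Sketch of the described pipeline: contours -> 4-corner polygon ->
# perspective transform for the projector. Not marcan42's actual code.
import cv2
import numpy as np

def find_box_corners(mask):
    # Trace contours in the binary depth mask and keep the largest blob.
    contours, _hierarchy = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                            cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    blob = max(contours, key=cv2.contourArea)
    # Collapse the contour to a polygon; a cardboard box face should reduce
    # to exactly four points. (A real tracker would also need to sort the
    # corners into a consistent order between frames.)
    poly = cv2.approxPolyDP(blob, 0.02 * cv2.arcLength(blob, True), True)
    return poly.reshape(4, 2).astype(np.float32) if len(poly) == 4 else None

def map_points_to_box(points, corners):
    # A vector laser projector draws a list of points, so you transform the
    # point coordinates themselves rather than warping a raster image:
    # build the homography from the unit square to the detected corners
    # and push each point of the projected figure through it.
    unit_square = np.float32([[0, 0], [1, 0], [1, 1], [0, 1]])
    homography = cv2.getPerspectiveTransform(unit_square, corners)
    pts = np.asarray(points, np.float32).reshape(-1, 1, 2)
    return cv2.perspectiveTransform(pts, homography).reshape(-1, 2)
```

Run per frame, the laser image stays pinned to the box in perspective as it moves, which is exactly the "virtual screen" effect in the video.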

Following the hobby craze for using the Kinect in robotics projects, this is a promising lead thanks to its object tracking and real-time response. Otherwise, go buy a laser projector and use this at your next party; with a few tweaks you should be able to project different images on different people, and then the craziness begins.
