CES highlights from a tangible interface guy
I’m excited to see more interactive tables that support tracking of multiple tangible objects enter the mainstream. One of the highlights is the Lenovo Horizon 27, an all-in-one computer designed to lay flat on a table.
It comes with several objects that can be tracked on its surface. The tracking appears to be capacitive, using a distinct electrode pattern on each object to identify its position and orientation. The electrodes look like they are made of conductive fabric.
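As a rough sketch of how pattern-based capacitive tracking can work: if each object has three conductive feet arranged in an isosceles triangle, the touchscreen reports three contact points, and the object's pose falls out of simple geometry. The layout below is hypothetical, not the Horizon's actual electrode pattern:

```python
import math

def object_pose(p1, p2, p3):
    """Estimate the position and heading of a tangible object from the
    three contact points its conductive feet produce on a touchscreen.

    Assumes a hypothetical isosceles layout: the unique 'apex' foot is
    the one farthest from the midpoint of the other two, and the object
    points from that midpoint toward the apex."""
    pts = [p1, p2, p3]

    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    # Apex = point farthest from the midpoint of the remaining two points.
    apex, base_mid, best_d = None, None, -1.0
    for i in range(3):
        others = [pts[j] for j in range(3) if j != i]
        mid = ((others[0][0] + others[1][0]) / 2,
               (others[0][1] + others[1][1]) / 2)
        d = dist(pts[i], mid)
        if d > best_d:
            apex, base_mid, best_d = pts[i], mid, d

    # Position = centroid of the three feet; heading = base midpoint -> apex.
    cx = sum(p[0] for p in pts) / 3
    cy = sum(p[1] for p in pts) / 3
    angle = math.degrees(math.atan2(apex[1] - base_mid[1],
                                    apex[0] - base_mid[0]))
    return (cx, cy), angle
```

Because the triangle is asymmetric, the same three points always resolve to one unambiguous orientation; different objects could use differently proportioned triangles as IDs.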
There was also an LCD that uses an IR camera behind the panel to track fingers, objects with embedded LEDs, and glyphs. It runs about $20k per screen.
Did I mention there were robots?
Here are some experiments I did with the Kinect. By selecting three separate points in space, you can define a plane to use as an interaction surface. The Kinect is much more flexible than other camera-based approaches for this, in terms of lighting conditions, camera alignment, and so on.
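The plane-fitting step here is plain vector math: the cross product of two edge vectors gives the plane normal, and the signed distance of any depth point to that plane tells you how close a hand is to the interaction surface. A minimal sketch, with a touch threshold chosen arbitrarily rather than taken from the project:

```python
import math

def plane_from_points(a, b, c):
    """Build a plane (unit normal n, offset d) from three 3-D points,
    so that a point p lies on the plane when dot(n, p) + d == 0."""
    ab = [b[i] - a[i] for i in range(3)]
    ac = [c[i] - a[i] for i in range(3)]
    # Cross product of the two edges gives the plane normal.
    n = [ab[1] * ac[2] - ab[2] * ac[1],
         ab[2] * ac[0] - ab[0] * ac[2],
         ab[0] * ac[1] - ab[1] * ac[0]]
    length = math.sqrt(sum(x * x for x in n))
    n = [x / length for x in n]
    d = -sum(n[i] * a[i] for i in range(3))
    return n, d

def distance_to_plane(p, n, d):
    """Signed distance from point p to the plane (n, d)."""
    return sum(n[i] * p[i] for i in range(3)) + d

def is_touching(p, n, d, threshold=0.02):
    """Treat a depth point within 2 cm of the plane as a touch
    (the threshold is an illustrative guess, not a measured value)."""
    return abs(distance_to_plane(p, n, d)) < threshold
```

In practice you would run this test over the Kinect's depth points each frame and cluster the ones near the plane into touch points.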
I’m very excited to share some video from the Create a Chemical Reaction Exhibit at the Museum of Science and Industry. Thanks to everyone who helped make this project a reality!
Software: Recording audio and video in sync
It turns out that writing software to record audio and video in sync is not as easy as you might expect. I had to do this for a recent project, and below are some of the issues I encountered, with solutions. I was working on Mac OS X (Snow Leopard) using openFrameworks. I used the openFrameworks movieGrabberExample as a starting point, along with ofxQTVideoSaver to store the video. There are some useful posts on the openFrameworks forums.
This is a quick test of a LIDAR system for interactive applications. We developed software to use a laser scanner to track people and objects moving across a surface. It can track an area of about 15 meters on a side at an update rate of 40 Hz. The system handles different lighting conditions very well, and any kind of display can be used with the sensor.
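The basic pipeline for this kind of LIDAR tracking can be sketched as: convert each scan from polar to Cartesian coordinates, subtract a learned background scan, and group the remaining beams into blobs, one per person or object. A minimal sketch, with margins and gap sizes chosen arbitrarily rather than taken from our system:

```python
import math

def scan_to_points(ranges, fov_deg=180.0):
    """Convert a laser scan (ranges in meters, evenly spaced across the
    field of view) into (x, y) points in the sensor frame."""
    n = len(ranges)
    pts = []
    for i, r in enumerate(ranges):
        theta = math.radians(-fov_deg / 2 + fov_deg * i / (n - 1))
        pts.append((r * math.cos(theta), r * math.sin(theta)))
    return pts

def foreground(ranges, background, margin=0.1):
    """Indices of beams returning significantly closer than the learned
    background, i.e. something has entered the tracked area."""
    return [i for i, (r, b) in enumerate(zip(ranges, background))
            if b - r > margin]

def cluster(indices, gap=2):
    """Group adjacent foreground beams into blobs; a new blob starts
    whenever the beam index jumps by more than `gap`."""
    blobs = []
    for i in indices:
        if blobs and i - blobs[-1][-1] <= gap:
            blobs[-1].append(i)
        else:
            blobs.append([i])
    return blobs
```

Because the scanner provides its own illumination, this pipeline is indifferent to ambient light, which is why the approach pairs well with projection or any other display.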