greenhouse
a creative coding toolkit for spatial interfaces

Getting started with a 3d-sensor
(Microsoft Kinect for Xbox or OpenNI-compliant sensor)

Greenhouse applications can take input from a Microsoft Kinect for Xbox 360 or an OpenNI-compliant sensor. But they don’t actually take the input directly from the device; first the data is processed by a small chain of programs we call the Hantenna Pipeline. (On Mac, we install this as a standalone application called hantenna-pipeline.app, found in /Applications/Greenhouse.)

The Hantenna Pipeline sends the depth sensor data through a few stages of processing, the end result of which is reasonably reliable hand pose information. This is what your application actually uses.

Note: Microsoft Kinect for Windows is not supported

Mac OS X

Configure the Hantenna Pipeline

The Hantenna Pipeline can be configured to understand where your sensor is located and how it is oriented. Or you can leave the defaults in place.

The default configuration assumes that the device is sitting at the center of your display, pointing straight out. It also assumes that your display is at the default location of 1000mm up and 700mm in front of an arbitrary origin point in the room. See (A) in the diagram below, and see the Spatial Considerations Tutorial for help understanding your room setup.

sensor configurations

To change the configuration, open /etc/oblong/hantenna.protein in your favorite text editor. It should look like this:

!<tag:oblong.com,2009:slaw/protein>
descrips:
- hantenna-pipeline
- config
ingests:
  openni:
    { openni-flipx: 0,
      openni-flipy: 0,
      openni-position: "0,1000,-700",
      openni-angle: 0,
      openni-maxz: 65535
    }
  kinect:
    { kinect-flipx: 0,
      kinect-flipy: 0,
      kinect-position: "0,1000,-700",
      kinect-angle: 0
    }
  ghost:
    { ghost-gui: 1
    }

Note: You only need to worry about configuration if you move the sensor or the defaults aren’t working well for you. After changing the configuration, you’ll want to restart the Hantenna Pipeline.

If using a Microsoft Kinect for Xbox

You will be editing the “kinect” section of /etc/oblong/hantenna.protein

First, we want users to interact with hantenna applications in a way that makes them feel they are gesturing directly at the application/screen rather than at the device itself. To do this, we need to set the location of your sensor in the room, in millimeters, offset from your origin. [See the Spatial Considerations Tutorial for understanding the room and your spatial setup.] It’s probably easiest to calculate this position from your monitor’s location. Hint: you’ll want a ruler. These values go in the “kinect-position” field in the following string format:

“[over x],[over y],[over z]”

For example, if your camera is 300mm to the left, at the bottom of a 15” screen, and a little bit in front (similar to figure (D) above), and your screen is at the default location of (0, 1000, -700), this value might read “-300,809.5,-630”.
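The arithmetic above can be sketched as follows. This is a minimal sketch, not part of the toolkit: the 300mm, 190.5mm, and 70mm offsets are assumed values taken from the example, with the screen center at the default (0, 1000, -700).

```python
# Assumed screen-center location (the default from hantenna.protein).
screen_x, screen_y, screen_z = 0.0, 1000.0, -700.0

# Assumed sensor offsets relative to the screen center, in millimeters.
sensor_x = screen_x - 300.0   # 300mm to the left of screen center
sensor_y = screen_y - 190.5   # down to the bottom edge of the screen
sensor_z = screen_z + 70.0    # a little bit in front, toward the viewer

# Build the "[over x],[over y],[over z]" string; %g-style formatting
# drops trailing ".0" so whole numbers print without a decimal point.
position = ",".join(f"{v:g}" for v in (sensor_x, sensor_y, sensor_z))
print(position)  # -300,809.5,-630
```

The resulting string is what you would paste into the “kinect-position” field.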

Next, determine the approximate angle of the Kinect relative to its base. If the sensor points up, the angle is positive; if it points down, the angle is negative. Enter this angle in the “kinect-angle” field, replacing the 0. See figure (E) above. Note that changing this value will also move the Kinect tilt motor.

If your sensor is sitting upright as in figure (A) in the diagram above, then you are finished. However, if your sensor is bottom-mounted as in figure (C), you will want to change the 0’s next to “kinect-flipx” and “kinect-flipy” to 1’s. This will inform the Hantenna Pipeline to invert the image in both directions before detecting gestures.

Finally, if you would prefer that the ghost gui (the popup window which shows the gesture detections) not display when running the pipeline, change the 1 next to “ghost-gui” to 0.
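Putting the steps above together, a hypothetical edited “kinect” and “ghost” section for the figure (D) setup might look like the following. The position string comes from the earlier example; the 5-degree upward tilt is an assumed value for illustration, and the ghost gui is hidden:

```yaml
  kinect:
    { kinect-flipx: 0,
      kinect-flipy: 0,
      kinect-position: "-300,809.5,-630",
      kinect-angle: 5
    }
  ghost:
    { ghost-gui: 0
    }
```

The rest of /etc/oblong/hantenna.protein (the descrips and the “openni” section) stays as shipped.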

If using an OpenNI-compliant device

You will be editing the “openni” section of /etc/oblong/hantenna.protein

First, we want users to interact with hantenna applications in a way that makes them feel they are gesturing directly at the application/screen rather than at the device itself. To do this, we need to set the location of your sensor in the room, in millimeters, offset from your origin. [See the Spatial Considerations Tutorial for understanding the room and your spatial setup.] It’s probably easiest to calculate this position from your monitor’s location. Hint: you’ll want a ruler. These values go in the “openni-position” field in the following string format:

“[over x],[over y],[over z]”

For example, if your camera is 300mm to the left, at the bottom of a 15” screen, and a little bit in front (similar to figure (D) above), and your screen is at the default location of (0, 1000, -700), this value might read “-300,809.5,-630”.

Next, determine the approximate angle of the device relative to its base. If the sensor points up, the angle is positive; if it points down, the angle is negative. Enter this angle in the “openni-angle” field, replacing the 0. See figure (E) above.

If your sensor is sitting upright as in figure (A) in the diagram above, then you are finished. However, if your sensor is bottom-mounted as in figure (C), you will want to change the 0’s next to “openni-flipx” and “openni-flipy” to 1’s. This will inform the Hantenna Pipeline to invert the image in both directions before detecting gestures.

Optionally, you may lower the default value next to “openni-maxz” to help filter out background noise.

Finally, if you would prefer that the ghost gui (the popup window which shows the gesture detections) not display when running the pipeline, change the 1 next to “ghost-gui” to 0.
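For reference, a hypothetical edited “openni” section for a bottom-mounted sensor (figure (C)) at the default location might look like the following. The 3000 (millimeter) value for “openni-maxz” is an assumed example of cutting the depth range to filter background noise, not a recommended setting:

```yaml
  openni:
    { openni-flipx: 1,
      openni-flipy: 1,
      openni-position: "0,1000,-700",
      openni-angle: 0,
      openni-maxz: 3000
    }
```

Remember to restart the Hantenna Pipeline after saving your changes.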

Run the Hantenna Pipeline

To run the Hantenna Pipeline, open up the Greenhouse folder located in your Applications folder. Then double-click the hantenna-pipeline application.

Greenhouse applications

A green console will appear along with a gui showing positive gesture matches (if you configured the pipeline to display this gui in the first step).

Hantenna and Ghost GUI

Run your gestural Greenhouse application

Let’s start with the radiology sample.

With Xcode:

Open the Xcode project file (ending in .xcodeproj) located in /opt/oblong/greenhouse/samples/radiology, and press the Play button.

With a Makefile:

Using a terminal, navigate to the radiology project directory, make and run:

$ cd /opt/oblong/greenhouse/samples/radiology
$ make
$ ./radiology

Interact!

Make an open palm gesture toward the sensor, and then close the hand into a fist. With the fist clenched, move it slowly toward and away from the sensor. The application will show successive slices of the body as your hand moves back and forth.

If you have trouble, come visit the Greenhouse SDK Google Group.

Stopping the hantenna pipeline

To stop all the pipeline processes, quit the hantenna-pipeline app like any other Mac app, or press the Cancel button in the hantenna-pipeline window.

If you stop the pipeline while your application is still running, nothing bad will happen, but your application will no longer receive sensor input. To get sensor input again, just restart the hantenna pipeline.