Real-time interaction with Kinect & Blender py-nodes


Recently Z25 (z25.org) and the Dutch School of Arts (HKU) approached us to do research on real-time interaction using a Kinect and Blender for interactive theater. The idea was that, based on the data from the Kinect, a director or editor can influence the play as it is being performed. For example, if a performing artist raises their hand, a spotlight should appear. A director or editor should be able to adjust this behaviour in real time.

Choosing the appropriate Kinect library
As we are not displaying the Kinect data, but only using it to influence a projected animation, we are only interested in basic skeleton data: just the x, y, z coordinates of the head, shoulders, arms and legs. For this experiment we limited the data to the head and arms alone.
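To make that concrete, here is a small Python sketch of the kind of per-frame joint data we work with, together with a simple "arm up" test. The names and numbers below are purely illustrative, not the project's actual data format:

    # Illustrative only: one frame of the joint data we care about, assuming
    # each joint is an (x, y, z) tuple with the y axis pointing up.
    frame = {
        'head':           (0.0,  0.4, 2.1),
        'left_shoulder': (-0.2,  0.2, 2.1),
        'right_shoulder': (0.2,  0.2, 2.1),
        'left_hand':     (-0.3, -0.1, 2.0),
        'right_hand':    (0.25,  0.6, 2.0),   # raised above the head
    }

    def arm_up(joints):
        """True when either hand is higher than the head."""
        head_y = joints['head'][1]
        return joints['left_hand'][1] > head_y or joints['right_hand'][1] > head_y

    print(arm_up(frame))  # True for the example frame above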

According to Microsoft, the Windows SDK is the only library that is supported and mature enough to develop software for the Kinect. But of course we went looking for open source solutions, and found a few.
At the base level:

  • libfreenect: no skeleton data available yet.
  • GFreenect: just a wrapper around libfreenect.
  • OpenNI: the most advanced of all the libraries, with complete skeleton data available. You only need to register your Kinect to acquire a free licence.
  • Skeltrack: only head, shoulder and arm coordinates available, but fully open source. If you miss something you can always add it yourself.

Middleware, for higher-level NUI support:

  • NiTE (runs on top of OpenNI)
  • NiMate (runs on top of OpenNI)

Both require registration for licensing (as does OpenNI itself).

For our research Skeltrack seemed suitable, and it doesn't require any licensing. Skeltrack does depend on quite a few libraries, which have to be added to Blender (compilation and linking). More info on Skeltrack can be found at https://github.com/joaquimrocha/Skeltrack .

Blender animation or BGE
At first the node system of the BGE seemed very useful as a presentation UI. But given the scope, a game engine is overkill for this project: there is no game aspect here, and the only interaction comes from the Kinect. Blender py-nodes offered a proper node-based solution that could be extended to the animation system, so we went for that.
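For readers unfamiliar with py-nodes: they let an add-on define its own node tree and node types entirely in Python, following Blender's custom-nodes template. A minimal sketch of registering such a tree type (the identifiers below are our own, not the project's):

    import bpy

    class KinectNodeTree(bpy.types.NodeTree):
        """A custom node tree type that shows up in Blender's node editor."""
        bl_idname = 'KinectTreeType'
        bl_label = 'Kinect Node Tree'
        bl_icon = 'NODETREE'

    def register():
        bpy.utils.register_class(KinectNodeTree)

    def unregister():
        bpy.utils.unregister_class(KinectNodeTree)

    if __name__ == "__main__":
        register()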

So how does it work?
The Kinect data comes in as a constant stream of coordinates. These coordinates are made available to Python via RNA. We created a blend file and wrote a few custom py-nodes, such as "Kinect - arm up", sub, add and multiply. Then we created a custom property, named "emission", that drives the light intensity of the lamp. In the node editor a director can modify the setup, even during a performance.
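As a rough sketch of what such a setup could look like (this is our own illustration, not the actual project code: the node class, the get_kinect_joints() placeholder and the object name "Lamp" are all assumptions), a custom py-node can expose the "arm up" condition as a float output, while a driver copies a custom "emission" property into the lamp's energy:

    import bpy

    def get_kinect_joints():
        # Placeholder: in the real setup the coordinates arrive from the
        # Kinect tracker and are exposed to Python via RNA.
        return {'head': (0.0, 0.4, 2.1), 'right_hand': (0.25, 0.6, 2.0)}

    class KinectArmUpNode(bpy.types.Node):
        """Outputs 1.0 while the performer's hand is above the head, else 0.0."""
        bl_idname = 'KinectArmUpNodeType'
        bl_label = 'Kinect - arm up'

        def init(self, context):
            self.outputs.new('NodeSocketFloat', "Value")

        def update(self):
            joints = get_kinect_joints()
            up = joints['right_hand'][1] > joints['head'][1]
            self.outputs["Value"].default_value = 1.0 if up else 0.0

    def setup_emission_driver(lamp_object_name="Lamp"):
        """Add an 'emission' custom property and drive the lamp energy with it."""
        obj = bpy.data.objects[lamp_object_name]
        obj["emission"] = 1.0
        fcurve = obj.data.driver_add("energy")       # driver on the lamp's energy
        var = fcurve.driver.variables.new()          # single-property variable
        var.name = "emission"
        var.targets[0].id = obj
        var.targets[0].data_path = '["emission"]'
        fcurve.driver.expression = "emission"

    bpy.utils.register_class(KinectArmUpNode)

In the real setup the node's output is routed through the node tree, so the director can insert sub, add and multiply nodes in between to shape how the lamp responds.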
