I’ve been wanting to experiment with skeleton tracking on the Microsoft Kinect forever. Finally, that time has arrived. And, boy! There are many ways you can do this. I did a lot of research at the beginning trying to understand how all the pieces fit together, which frameworks/SDKs to use, etc. This post is more of a journal documenting my journey with the Kinect. It is in no way didactic. Since I’m learning about this myself, feel free to comment or suggest a better way to do things.
Somewhat confused at first, I thought I should start with a list of requirements. Here is what I’m looking for in a platform:
- Open source
- Runs on Mac (and maybe Windows too)
- Widely adopted (which also implies a large community and easy-to-get support)
- Gives access to low-level APIs
- Lends itself to interactive wall projection
There are five elements you will need for development:
- a development environment
- a programming language
- a Kinect driver or library
- a driver/library wrapper
- the Kinect itself
First, let’s start with the development environment, i.e. the place where you actually do the coding.
Xcode is a development environment for creating apps for Mac or iOS devices. It is now free and available for download via the Mac App Store, whereas before you had to register as part of the Apple Developer Program. With the help of openFrameworks, you can do Kinect development in C++ in Xcode. The downside is that it only runs on Mac.
Processing is an open source IDE running on its own language, also called Processing. It is sketching software that is fun and easy to learn. It runs on Mac, Windows, and Linux.
Visual Studio will get the job done for Windows users. You can use it either with the official Kinect SDK or with openFrameworks. Now is also a good time to point out that I’m primarily focusing this post on Mac users.
If you are a .NET developer, an application developer, or a hardcore developer in general, you’re probably more familiar with C++ or feel like it’s adoptable. You can program C++ Kinect applications with openFrameworks in Xcode or Visual Studio. Or (I believe) you can use C++ to work with the Kinect SDK directly in Visual Studio. That is as far as I’ll talk about the Kinect SDK.
If you’re brand new to programming, I suggest Processing. Like I mentioned above, it’s fun and easy to use. You can hit the ground running in no time. Its format reminds me of Java or ActionScript programming.
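To give you a feel for that format, here is a minimal Processing sketch (no Kinect involved): `setup()` runs once and `draw()` loops forever, much like method definitions in Java.

```
void setup() {
  size(400, 400);  // runs once: open a 400x400 window
}

void draw() {
  // runs every frame: draw a circle that follows the mouse
  ellipse(mouseX, mouseY, 20, 20);
}
```

Paste it into the Processing IDE and hit Run, and you get a window immediately. That instant feedback is a big part of why it’s so beginner-friendly.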
The libraries or drivers are the part that actually lets your code talk to the Kinect at a high level. The two popular open source libraries are OpenKinect and OpenNI. The extent of my knowledge about OpenKinect is limited: it is made by a group of people who are passionate about reverse-engineering the driver. OpenNI is developed by PrimeSense, the company behind the Kinect’s depth sensor technology, according to this post on StackOverflow. The post also goes into more detail comparing the two libraries, if you’re interested.
Both Mastering openFrameworks: Creative Coding Demystified and Making Things See focus on OpenNI. Plus, I get the sense that OpenNI’s community is a bit larger, so it’s easier to find help online. That is the library I landed on.
Wrappers allow us to communicate with the Kinect library at an even higher level, which makes our lives easier. OpenNI wrappers are listed here. SimpleOpenNI is the OpenNI wrapper for Processing, while ofxOpenNI is the wrapper for openFrameworks. Both of these wrappers need to be downloaded and installed separately.
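To show what I mean by “higher level,” here is a sketch of what getting depth data out of the Kinect looks like with SimpleOpenNI in Processing. This is only a rough sketch based on my reading so far; it assumes SimpleOpenNI is installed in your Processing libraries folder and a Kinect is plugged in.

```
import SimpleOpenNI.*;

SimpleOpenNI context;

void setup() {
  size(640, 480);
  context = new SimpleOpenNI(this);  // connect to the Kinect via OpenNI
  context.enableDepth();             // ask it to stream depth data
}

void draw() {
  context.update();                   // grab the latest frame from the sensor
  image(context.depthImage(), 0, 0);  // draw the depth map as a grayscale image
}
```

A dozen lines to get a live depth image on screen; the wrapper hides all the driver plumbing, which is exactly the point.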
NOTE: openFrameworks libraries/wrappers are called addons. The openFrameworks master download comes with an addon called ofxKinect, which runs off of OpenKinect and OpenCV (an image processing library). It will provide you with depth data, but not at the skeleton level as far as I know.
There are a few Kinect models out there. Theo Watson wrote a very nice description of device compatibility. Check it out here for the most recent updates. In the meantime, here is his note as I copied it on 02.06.2014.
NOTE: Xbox Kinect models > 1414 (1473, etc) or Kinect4Windows devices that have been plugged into an XBox will not work with ofxKinect on Mac OSX
If you have an Xbox Kinect model 1473+ or Kinect4Windows device and want to use it with ofxKinect on OSX, DO NOT plug it into an XBox! If you do, it will receive a firmware update which will cause it to freeze after about 20 secs with libfreenect/ofxKinect on OSX. Linux and Windows are fine. The fix is in the works.
In the meantime, we suggest you get the original Xbox Kinect model 1414 and keep those new toys away from an XBox …
So, after combing through multiple blogs and books, I landed on coding in C++ with openFrameworks, the OpenNI library, and the ofxOpenNI wrapper. I will follow up with part 2 on how to get started in Xcode.
Where to go from here?
I recommend getting a book, depending on which programming language you are comfortable with. If it’s Processing, I suggest Making Things See. If it’s openFrameworks, go with Mastering openFrameworks: Creative Coding Demystified. There are some Q&As on StackOverflow in case you need help with your code. Oh, and stay tuned for my part 2 post.