Coming soon: Kinect for Windows SDK beta

Greetings! In case you missed it, here's the news shared in a press release from MIX11 yesterday:

Today at MIX, Microsoft detailed some of the features in the Kinect for Windows SDK beta from Microsoft Research coming in the spring, including the following:

  • Robust skeletal tracking, with high-performance capabilities that track the skeletons of one or two people moving within the Kinect field of view
  • Advanced audio capabilities, including a four-element microphone array with sophisticated acoustic noise and echo cancellation for great audio, beamforming to identify the current sound source, and integration with the Windows speech recognition API
  • XYZ depth camera, providing access to the standard color camera stream along with depth data that indicates the distance of an object from the Kinect camera (see the sketch after this list)

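The SDK itself isn't out yet, so there's no real API to show here. Purely as an illustration of what per-pixel depth data makes possible, below is a small, self-contained C++ sketch; it assumes a depth frame arrives as one 16-bit distance-in-millimetres value per pixel (an assumption, not a documented format) and simply picks out everything closer than one metre.

    // Illustrative only: the hypothetical frame layout below stands in for
    // whatever the Kinect for Windows SDK beta will actually expose.
    #include <cstdint>
    #include <iostream>
    #include <vector>

    constexpr int kWidth  = 320;   // assumed depth stream resolution
    constexpr int kHeight = 240;

    // Build a foreground mask: true for pixels closer than cutoffMm.
    std::vector<bool> SegmentNearObjects(const std::vector<uint16_t>& depthMm,
                                         uint16_t cutoffMm)
    {
        std::vector<bool> mask(depthMm.size(), false);
        for (size_t i = 0; i < depthMm.size(); ++i) {
            // Treat a depth of 0 as "unknown" and skip it.
            if (depthMm[i] > 0 && depthMm[i] < cutoffMm)
                mask[i] = true;
        }
        return mask;
    }

    int main()
    {
        // Fake frame standing in for data the SDK's depth stream would supply.
        std::vector<uint16_t> depthMm(kWidth * kHeight, 2000); // everything at 2 m
        depthMm[kWidth * 120 + 160] = 800;                     // one nearby pixel

        std::vector<bool> nearMask = SegmentNearObjects(depthMm, 1000);
        size_t nearPixels = 0;
        for (bool isNear : nearMask)
            if (isNear) ++nearPixels;
        std::cout << "Pixels closer than 1 m: " << nearPixels << "\n";
    }

The same pattern (walk the frame, convert raw values to distances, act on a threshold) is the basis for simple background-removal and hand-proximity experiments once the real depth stream is available.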
Developers can sign up to be notified of the release at https://research.microsoft.com/kinectsdk.

Here’s some more info from the Microsoft Research page:

With this SDK, you’ll be able to take advantage of:

  • The latest advances in audio processing, which include a four-element microphone array with sophisticated acoustic noise and echo cancellation for crystal clear audio.
  • Sound source localization for beamforming, which enables the determination of a sound’s spatial location, enhancing reliability when integrated with the Microsoft speech recognition API (a simplified, SDK-independent illustration of the idea closes out this post).
  • Depth data, which provides the distance of an object from the Kinect camera, as well as the raw audio and image data, which together open up opportunities for creating richer natural user interface experiences.
  • High-performance, robust skeletal tracking capabilities for determining the body positions of one or two people moving within the Kinect field of view (a rough sketch of working with this kind of data follows this list).
  • Documentation for the APIs and a description of the SDK architecture.
  • Sample code that demonstrates how to use the functionality in the SDK.

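Again, the API isn't public yet, so the types below are stand-ins rather than the SDK's real ones. Assuming skeletal tracking ultimately hands back per-joint 3D positions for each tracked person (an assumption based on the description above), a trivial gesture check might look something like this in C++:

    // Illustrative only: Joint and Skeleton are hypothetical stand-ins for
    // whatever the SDK's skeletal tracking API will actually provide.
    #include <array>
    #include <iostream>

    struct Joint {
        float x, y, z;          // metres, camera space; y assumed to increase upward
    };

    struct Skeleton {
        bool  tracked = false;  // the SDK tracks one or two people at a time
        Joint head{};
        Joint rightHand{};
    };

    // A trivial "hand raised" check: right hand noticeably above the head.
    bool IsRightHandRaised(const Skeleton& s)
    {
        return s.tracked && (s.rightHand.y > s.head.y + 0.10f);
    }

    int main()
    {
        std::array<Skeleton, 2> people{};              // up to two tracked skeletons
        people[0].tracked   = true;
        people[0].head      = {0.0f, 0.60f, 2.0f};     // fake data for the example
        people[0].rightHand = {0.2f, 0.85f, 1.9f};

        for (size_t i = 0; i < people.size(); ++i)
            if (IsRightHandRaised(people[i]))
                std::cout << "Person " << i << " raised a hand\n";
    }

Even something this simple, driven by real tracked joints instead of fake data, is enough to start experimenting with gesture-driven UI.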
This SDK is intended for non-commercial use, to enable experimentation in the world of natural user interface experiences. New state-of-the-art features are planned for future releases and will continue to provide new ways to experiment.
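Finally, since sound source localization is called out twice above, here's a simplified, SDK-independent illustration of the underlying idea: with two microphones a known distance apart, the time difference of arrival between them pins down the direction of a far-field sound source. (The real SDK handles all of this internally across its four-element array; the function and numbers below are purely hypothetical.)

    // Far-field direction-of-arrival estimate from a two-microphone pair.
    #include <cmath>
    #include <iostream>

    // tdoaSeconds: time difference of arrival between the two microphones.
    // micSpacingMeters: distance between the microphones.
    // Returns the source angle in radians from straight ahead.
    double EstimateSourceAngle(double tdoaSeconds, double micSpacingMeters)
    {
        const double speedOfSound = 343.0;                        // m/s, room temperature
        double s = speedOfSound * tdoaSeconds / micSpacingMeters;
        if (s >  1.0) s =  1.0;                                   // clamp so measurement noise
        if (s < -1.0) s = -1.0;                                   // can't push asin out of range
        return std::asin(s);
    }

    int main()
    {
        // Example: mics 0.1 m apart, sound arrives 0.15 ms earlier at one of them.
        double angle = EstimateSourceAngle(0.00015, 0.1);
        std::cout << "Estimated source angle: "
                  << angle * 180.0 / 3.14159265358979 << " degrees\n";
    }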