Kinect for Windows announces new version of SDK coming March 18

Today at Engadget Expand, I announced that Kinect for Windows SDK 1.7 will be made available this coming Monday, March 18. This is our most significant update to the SDK since we released the first version a little over a year ago, and I can’t wait to see what businesses and developers do with the new features and enhancements.

On Monday, developers will be able to download the SDK, developer toolkit, and the new and improved Human Interface Guidelines (HIG) from our website. In the meantime, here’s a sneak peek:

Kinect Interactions give businesses and developers the tools to create intuitive, smooth, and polished applications that are ergonomic and intelligently based on the way people naturally move and gesture. The interactions include push-to-press buttons, grip-to-pan capabilities, and support for smart ways to accommodate multiple users and two-person interactions. These new tools are based on thousands of hours of research, development, and testing with a broad and diverse group of people. We wanted to save businesses and developers hours of development time while making it easier for them to create gesture-based experiences that are highly consistent from application to application and utterly simple for end users. With Kinect Interactions, businesses can more quickly develop customized, differentiated solutions that address important business needs and attract, engage, and delight their customers.

Kinect for Windows Interactions transform how people interact with computers in settings ranging from retail to education, training, and physical therapy.
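To make the push-to-press idea concrete, here is a minimal, hypothetical sketch of the kind of logic such a control needs under the hood — the class name and thresholds are purely illustrative, and the SDK's interaction stream handles all of this (and much more) for you:

```cpp
// Hypothetical illustration of the "push-to-press" idea: a button fires
// when the tracked hand extends toward the sensor past a press threshold,
// and it re-arms only after the hand retracts past a release threshold.
// The hysteresis gap between the two thresholds prevents sensor jitter
// near a single threshold from firing repeated presses.
class PushToPress {
public:
    // Thresholds are normalized "press extent" values
    // (0 = hand at rest, 1 = fully pushed toward the sensor).
    PushToPress(double pressAt = 0.8, double releaseAt = 0.3)
        : pressAt_(pressAt), releaseAt_(releaseAt), armed_(true) {}

    // Feed one frame's press extent; returns true on the frame a press fires.
    bool Update(double pressExtent) {
        if (armed_ && pressExtent >= pressAt_) {
            armed_ = false;   // fire once, then wait for release
            return true;
        }
        if (!armed_ && pressExtent <= releaseAt_) {
            armed_ = true;    // hand retracted: re-arm the button
        }
        return false;
    }

private:
    double pressAt_, releaseAt_;
    bool armed_;
};
```

Holding the hand in the pressed position does not repeat the press; the button must be released and pushed again.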

Kinect Fusion is one of the most affordable tools available today for creating accurate 3-D renderings of people and objects. Kinect Fusion fuses together multiple snapshots from the Kinect for Windows sensor to create accurate, full, 3-D models. Developers can move a Kinect for Windows sensor around a person, object, or environment and “paint” a 3-D image of the person or thing in real time. These 3-D images can then be used to enhance countless real-world scenarios, including augmented reality, 3-D printing, interior and industrial design, and body scanning for things such as improved clothes shopping experiences and better-fitting orthotics. Kinect Fusion is something many of our partners have been asking for and we’re thrilled to be delivering it now.

Kinect Fusion enables developers to create accurate 3-D renderings in real time.
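The core idea behind this kind of depth fusion — folding many noisy snapshots into one volumetric model — can be sketched as a weighted running average of truncated signed distances (TSDF) per voxel. The toy code below illustrates that idea along a single camera ray with made-up sizes; it is not the SDK's actual API:

```cpp
#include <cmath>
#include <vector>

// Toy sketch of volumetric depth fusion: each voxel stores a truncated
// signed distance to the surface (TSDF) plus a weight, and every new
// depth measurement is folded in as a weighted running average.
struct Voxel { float tsdf = 0.0f; float weight = 0.0f; };

// Integrate one depth measurement (distance along a single ray, in meters)
// into a 1-D voxel grid along that ray. 'truncation' clamps the distance.
void IntegrateDepth(std::vector<Voxel>& ray, float voxelSize,
                    float measuredDepth, float truncation) {
    for (size_t i = 0; i < ray.size(); ++i) {
        float voxelDepth = (i + 0.5f) * voxelSize;       // voxel center on ray
        float sdf = measuredDepth - voxelDepth;          // + in front of surface
        if (sdf < -truncation) continue;                 // far behind: unseen
        float tsdf = std::fmin(1.0f, sdf / truncation);  // clamp to [-1, 1]
        Voxel& v = ray[i];
        v.tsdf = (v.tsdf * v.weight + tsdf) / (v.weight + 1.0f);
        v.weight += 1.0f;                                // running average
    }
}
```

The reconstructed surface lies where the averaged TSDF crosses zero; averaging over many frames is what smooths sensor noise into an accurate model.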

The updated SDK also includes an enhanced developer toolkit and additional developer resources, including:

  • OpenCV and MATLAB Samples to allow developers to build advanced Kinect-enabled applications while using industry standards.
  • Kinect for Windows Code Samples on CodePlex, marking the first time that select Kinect for Windows code samples will be available through an open-source resource, enabling existing and potential partners to explore and develop new scenarios.

Seeing is believing
We demonstrated Kinect Interactions and Kinect Fusion live, onstage at Engadget Expand. You can watch the webcast of those demos now—and then come back to download the latest SDK on March 18. It’s fully compatible with all previous commercial releases, so we encourage everyone to upgrade to the new version. There’s no reason not to!

As always, we are constantly evolving the technology and want to know what you think. And we love hearing about the solutions you’re developing with Kinect for Windows, so please join us at Facebook and Twitter.

The Kinect for Windows sensor, together with the SDK, can help you create engaging applications that take natural voice and gesture computing to the next level.

Bob Heddle, Director
Kinect for Windows

Comments (12)
  1. Frank says:

    Can't wait to try it! The OpenCV and MATLAB samples sound wonderful. What about PCL samples?

  2. Hello Frank,

    We can’t wait to hear how you like the new features!

    We don't have a sample in our Toolkit that demonstrates using PCL. However, if you are already familiar with C++, using the Kinect SDK with PCL is straightforward: the depth frames are simple arrays, and you can use the other C++ samples to see how to call the Kinect APIs.

    We welcome you to post questions in the Kinect for Windows forums, where the developer community and engineers are actively monitoring conversations.

    Thank you!

  3. Edgar says:

    I can't wait to try it!

    I have been waiting for this for a long time.

  5. Woohoo! Can't wait to try it as well 🙂

  5. It's March 18 and I am waiting... says:

    Oh, the possibilities with the 1.7 SDK 🙂

  6. Mark says:

    Sounds great!

    Will we developers be able to integrate the SDK with a different 3-D camera, such as the Panasonic D-Imager?

  7. Bernie says:

    Our fourth-year project is using the Kinect to capture changes in body volume and limb circumference. We knew this update was coming but didn't know when; it would have made capturing our human models a lot easier, instead of developing that kind of system from scratch. Unfortunately it's now too late to include it in our project, but we may use it if we can! Thanks for the release!

  8. Marlon says:

    It's sad that I can't find anywhere to buy the Kinect for Windows here in the Philippines.

  9. Bernie,

    We do hope that you'll be able to integrate the new features into future projects! Enjoy, and keep us posted!

  10. Mark,

    Please post your question about cameras to our technical forums.

    Thank you!

  11. yoshiboarder says:

    Hi. I was looking for NUI samples written in C++, but still haven't found any. Do you know when they will be updated?

    Only a C# version is supported.

    I really want to get NUI samples in C++.

    thank you!

  12. 18Signals says:

    Was wondering if there is any way we can beta test?


Comments are closed.
