NextStage turns Kinect sensor into a virtual production camera

We’re used to seeing applications in which Kinect for Windows tracks the movements of users—after all, that’s what the Kinect sensor is designed to do. But what if you picked up the Kinect sensor and moved it around, so that instead of the sensor tracking your movements, you could track the position and rotation of the sensor through three-dimensional space?

That’s exactly what filmmaker Sam Maliszewski has set out to do with NextStage, a Kinect for Windows application that effectively turns the Kinect for Xbox One sensor into a real-time virtual production camera. Maliszewski places retroreflective markers throughout the scene he intends to film, and then he physically moves the Kinect sensor around the set, using its onboard cameras to record the action while the application uses the reflective markers to track the sensor’s 3D position and rotation in real time.
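The source doesn’t describe NextStage’s internal algorithm, but the standard way to recover a camera’s pose from a set of fixed 3D markers is a least-squares rigid alignment (the Kabsch algorithm): given the markers’ known positions in the set’s coordinate frame and their observed positions in the sensor’s depth frame, solve for the rotation and translation that best maps one onto the other. A minimal sketch of that idea, with all names and data purely illustrative:

```python
import numpy as np

def estimate_pose(markers_world, markers_camera):
    """Kabsch algorithm: find rotation R and translation t such that
    R @ p + t ≈ q for each world-space marker p and its camera-space
    observation q. Inverting (R, t) then gives the sensor's own
    position and orientation on the set."""
    cw = markers_world.mean(axis=0)           # centroid in world frame
    cc = markers_camera.mean(axis=0)          # centroid in camera frame
    # Cross-covariance of the centered point sets.
    H = (markers_world - cw).T @ (markers_camera - cc)
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cc - R @ cw
    return R, t

# Toy check: four non-coplanar markers, viewed from a known pose.
markers_world = np.array([[0, 0, 0], [1, 0, 0],
                          [0, 1, 0], [0, 0, 1]], dtype=float)
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0,              0,             1]])
t_true = np.array([0.5, -0.2, 1.0])
markers_camera = markers_world @ R_true.T + t_true
R, t = estimate_pose(markers_world, markers_camera)
```

With clean input the recovered `R` and `t` match the true pose; in practice a tracker would also need to detect the markers in the depth image and handle noise and occlusion, which this sketch omits.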

The resulting video footage can then be combined with virtual objects and sets, with no need for frame-by-frame processing. Moreover, since NextStage provides depth-based keying, filmmakers can separate live-action subjects from the background and place actors or objects on a virtual set without green-screen techniques. Alternatively, depth mattes created in NextStage can serve as a high-quality “garbage” matte for green-screen overlays.
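Depth-based keying itself is conceptually simple: instead of keying on color, you keep only the pixels whose measured depth falls within a chosen range (the actor’s distance from the sensor) and discard the rest. A minimal illustrative sketch, with the frame data and thresholds invented for the example:

```python
import numpy as np

def depth_key(color, depth_mm, near_mm, far_mm):
    """Build a binary matte keeping pixels whose depth (in millimeters)
    lies inside [near_mm, far_mm], then black out everything else."""
    matte = (depth_mm >= near_mm) & (depth_mm <= far_mm)
    # Broadcast the matte across the three color channels.
    keyed = color * matte[..., np.newaxis]
    return keyed, matte

# Toy 2x2 frame: one "actor" pixel at 1.5 m, background at 4 m.
color = np.full((2, 2, 3), 200, dtype=np.uint8)
depth_mm = np.array([[1500, 4000],
                     [4000, 4000]], dtype=np.uint16)
keyed, matte = depth_key(color, depth_mm, near_mm=500, far_mm=2000)
```

A production tool would refine this with edge feathering and temporal smoothing, since raw depth edges are noisy, but the core idea is the same range test per pixel.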

Maliszewski has developed two versions of NextStage, both currently in beta: NextStage Lite, a free download that captures the video by using the Kinect sensor’s color and depth cameras, and NextStage Pro, which enables filmmakers to sync the tracking data to an external camera and to export it to such applications as Blender and Maya.

The Kinect for Windows Team
