Kinect Toolbox 1.1: Template-based posture detector and Voice Commander


In a previous article I introduced the Kinect Toolbox: http://blogs.msdn.com/b/eternalcoding/archive/2011/07/04/gestures-and-tools-for-kinect.aspx.

image

Kinect Toolbox v1.1 is now out and this new version adds support for some cool features:

  • Templated posture detector
  • Voice Commander
  • NuGet package

You can find the toolbox here: http://kinecttoolbox.codeplex.com or you can grab it using NuGet: http://nuget.org/List/Packages/KinectToolbox

Templated posture detector

Using the same algorithm as TemplatedGestureDetector, you can now use a learning machine and a matching system to detect postures. In the sample attached to the toolbox, I detect the "T" posture (i.e. when your body forms the letter T):

image

To do that, I developed a new class, TemplatedPostureDetector, which uses an internal learning machine (like the gesture detector):

public class TemplatedPostureDetector : PostureDetector
{
    const float Epsilon = 0.02f;       // tolerance used by the matching system
    const float MinimalScore = 0.95f;  // minimum matching score to raise a detection
    const float MinimalSize = 0.1f;    // minimum size of the captured path
    readonly LearningMachine learningMachine;
    readonly string postureName;

    public LearningMachine LearningMachine
    {
        get { return learningMachine; }
    }

    public TemplatedPostureDetector(string postureName, Stream kbStream) : base(4)
    {
        this.postureName = postureName;
        learningMachine = new LearningMachine(kbStream);
    }

    public override void TrackPostures(ReplaySkeletonData skeleton)
    {
        if (LearningMachine.Match(skeleton.Joints.ToListOfVector2(), Epsilon, MinimalScore, MinimalSize))
            RaisePostureDetected(postureName);
    }

    public void AddTemplate(ReplaySkeletonData skeleton)
    {
        RecordedPath recordedPath = new RecordedPath(skeleton.Joints.Count);

        recordedPath.Points.AddRange(skeleton.Joints.ToListOfVector2());

        LearningMachine.AddPath(recordedPath);
    }

    public void SaveState(Stream kbStream)
    {
        LearningMachine.Persist(kbStream);
    }
}

To use this class, we only need to instantiate it and give it some templates (using the [Capture T] button or a previously saved file). After that, the class can track postures for each skeleton it receives:

Stream recordStream = File.Open(letterT_KBPath, FileMode.OpenOrCreate);
templatePostureDetector = new TemplatedPostureDetector("T", recordStream);
templatePostureDetector.PostureDetected += templatePostureDetector_PostureDetected;

templatePostureDetector.TrackPostures(skeleton);

void templatePostureDetector_PostureDetected(string posture)
{
    MessageBox.Show("Give me a… " + posture);
}
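
For completeness, here is a minimal sketch of how the [Capture T] button and the save step could feed the detector, using the AddTemplate and SaveState members shown above. The handler name and the currentSkeleton field are hypothetical; only AddTemplate, SaveState, and letterT_KBPath come from the code above.

```csharp
// Hypothetical [Capture T] handler: records the current skeleton
// as a new template for the "T" posture.
void captureT_Click(object sender, RoutedEventArgs e)
{
    if (currentSkeleton == null) // last skeleton received (assumed field)
        return;

    templatePostureDetector.AddTemplate(currentSkeleton);
}

// On shutdown, persist the learning machine so the templates
// can be reloaded through the constructor next time.
void SavePostureKnowledgeBase()
{
    using (Stream kbStream = File.Create(letterT_KBPath))
    {
        templatePostureDetector.SaveState(kbStream);
    }
}
```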

Voice Commander

One thing worth noting when you develop with Kinect is that you spend your time getting up and sitting down :). In the previous article, I introduced the replay system, which is very useful for recording a Kinect session.

But when you are alone, even the recording is painful, because you cannot be in front of the sensor and in front of your keyboard at the same time to start/stop the recording.

So here enters the Voice Commander (ta-da!). This class takes a list of words and raises an event when it detects one of them (using the microphone array of the sensor). So, for example, you can use "record" and "stop" commands to start and stop the recording session while staying in front of the sensor!

The code is really simple (thanks to Kinect for Windows SDK and Microsoft Speech Platform SDK):

public class VoiceCommander
{
    const string RecognizerId = "SR_MS_en-US_Kinect_10.0";
    Thread workingThread;
    readonly Choices choices;
    bool isRunning;

    public event Action<string> OrderDetected;

    public VoiceCommander(params string[] orders)
    {
        choices = new Choices();
        choices.Add(orders);
    }

    public void Start()
    {
        // Speech recognition must run on a background MTA thread
        workingThread = new Thread(Record);
        workingThread.IsBackground = true;
        workingThread.SetApartmentState(ApartmentState.MTA);
        workingThread.Start();
    }

    void Record()
    {
        using (KinectAudioSource source = new KinectAudioSource
        {
            FeatureMode = true,
            AutomaticGainControl = false,
            SystemMode = SystemMode.OptibeamArrayOnly
        })
        {
            RecognizerInfo recognizerInfo = SpeechRecognitionEngine.InstalledRecognizers().Where(r => r.Id == RecognizerId).FirstOrDefault();

            if (recognizerInfo == null)
                return;

            SpeechRecognitionEngine speechRecognitionEngine = new SpeechRecognitionEngine(recognizerInfo.Id);

            var gb = new GrammarBuilder {Culture = recognizerInfo.Culture};
            gb.Append(choices);

            var grammar = new Grammar(gb);

            speechRecognitionEngine.LoadGrammar(grammar);
            using (Stream sourceStream = source.Start())
            {
                speechRecognitionEngine.SetInputToAudioStream(sourceStream, new SpeechAudioFormatInfo(EncodingFormat.Pcm, 16000, 16, 1, 32000, 2, null));

                isRunning = true;
                while (isRunning)
                {
                    RecognitionResult result = speechRecognitionEngine.Recognize();

                    if (result != null && OrderDetected != null && result.Confidence > 0.7)
                        OrderDetected(result.Text);
                }
            }
        }
    }

    public void Stop()
    {
        isRunning = false;
    }
}

Using this class is really simple:

voiceCommander = new VoiceCommander("record", "stop");
voiceCommander.OrderDetected += voiceCommander_OrderDetected;

voiceCommander.Start();

void voiceCommander_OrderDetected(string order)
{
    Dispatcher.Invoke(new Action(() =>
    {
        if (audioControl.IsChecked == false)
            return;

        switch (order)
        {
            case "record":
                DirectRecord(Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Desktop), "kinectRecord" + Guid.NewGuid() + ".replay"));
                break;
            case "stop":
                StopRecord();
                break;
        }
    }));
}

Conclusion

With Kinect Toolbox 1.1, you have a set of tools to help you develop fun and powerful applications with Kinect for Windows SDK!

Comments (46)

  1. Anatoly says:

    Hi David,

    I have issue when I run Kinect Toolkit 1.1 :

    If I run it on a 64-bit machine, I get:

       Could not load file or assembly 'INuiInstanceHelper, Version=1.0.0.10, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its   dependencies. An attempt was made to load a program with an incorrect format.

    If I run it on an x86 machine, I get:

       Speech Recognition is not available on this system. SAPI and Speech Recognition engines cannot be found.

    Could you advise something?

    Thank you

    Regards, Anatoly

  2. @Anatoly: I think these are installation issues. For your first point, you may not have installed the 64-bit Kinect SDK. For your x86 machine, you may not have installed SAPI and the speech recognition engines.

  3. Mike says:

    I have the same problem as Anatoly on my 64-bit machine. I have tried everything: uninstalling and reinstalling the SDK, removing INuiInstanceHelper from the GAC and adding it again. Nothing works. Please help!

  4. Hi Mike, can you double check that:

    - You installed the latest version of the Microsoft Speech Platform SDK

    - You installed the latest 64-bit version of the Kinect SDK

    Can you try on a 32-bit system too?

  5. Mike says:

    Note that I am using the newest version of the 64-bit SDK, version 1.00.12, from July 29th 2011.

  6. Mike says:

    Hi, thanks for the quick response! I actually removed all Speech-Related code from the projects, because according to the programming guide (research.microsoft.com/…/ProgrammingGuide_KinectSDK.docx), the Speech components are x86 only. Unfortunately I don't have a 32-bit system I can try it out on… :/

  7. You can try to compile the project using the x86 target.

  8. Mike says:

    I am using the newest Kinect 64-bit SDK, version 1.00.12 (29 July 2011).

  9. Mike says:

    I already tried that. Same error… :/

  10. Mike says:

    I just noticed one difference in the error, actually: it says "PublicKeyToken=null" instead of "PublicKeyToken=31bf3856ad364e35", and the message "An attempt was made to load a program with an incorrect format." is no longer present.

  11. Mike says:

    I managed to find a 32-bit computer to test it on, and now the sample app doesn't give me that error on startup. However, it still gives me the same problem I had on the 64-bit machine, that it crashes when I click Capture Circle and then Stop Recording…

  12. Can you debug to see the exception you get when you click on Capture Circle?

  13. Mike says:

    OK, I just installed VS Express 2010, as well as all of the Speech components and the 32-bit SDK, on the 32-bit machine. Now when I run the sample app I get the message about INuiInstanceHelper again, and upon clicking on "Capture Circle" I get the following error:

    System.NullReferenceException was unhandled

     Message=Object reference not set to an instance of an object.

     Source=GesturesViewer

     StackTrace:
          at GesturesViewer.MainWindow.recordCircle_Click(Object sender, RoutedEventArgs e) in C:\Users\klein\Downloads\Kinect.Toolkit\Kinect.Toolkit\GesturesViewer\MainWindow.Gestures.cs:line 23
          at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
          at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
          at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
          at System.Windows.UIElement.RaiseEvent(RoutedEventArgs e)
          at System.Windows.Controls.Primitives.ButtonBase.OnClick()
          at System.Windows.Controls.Button.OnClick()
          at System.Windows.Controls.Primitives.ButtonBase.OnMouseLeftButtonUp(MouseButtonEventArgs e)
          at System.Windows.UIElement.OnMouseLeftButtonUpThunk(Object sender, MouseButtonEventArgs e)
          at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
          at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
          at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
          at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
          at System.Windows.UIElement.ReRaiseEventAs(DependencyObject sender, RoutedEventArgs args, RoutedEvent newEvent)
          at System.Windows.UIElement.OnMouseUpThunk(Object sender, MouseButtonEventArgs e)
          at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
          at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
          at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
          at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
          at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
          at System.Windows.UIElement.RaiseTrustedEvent(RoutedEventArgs args)
          at System.Windows.UIElement.RaiseEvent(RoutedEventArgs args, Boolean trusted)
          at System.Windows.Input.InputManager.ProcessStagingArea()
          at System.Windows.Input.InputManager.ProcessInput(InputEventArgs input)
          at System.Windows.Input.InputProviderSite.ReportInput(InputReport inputReport)
          at System.Windows.Interop.HwndMouseInputProvider.ReportInput(IntPtr hwnd, InputMode mode, Int32 timestamp, RawMouseActions actions, Int32 x, Int32 y, Int32 wheel)
          at System.Windows.Interop.HwndMouseInputProvider.FilterMessage(IntPtr hwnd, WindowMessage msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
          at System.Windows.Interop.HwndSource.InputFilterMessage(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
          at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
          at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)
          at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)
          at MS.Internal.Threading.ExceptionFilterHelper.TryCatchWhen(Object source, Delegate method, Object args, Int32 numArgs, Delegate catchHandler)
          at System.Windows.Threading.Dispatcher.InvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)
          at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)
          at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)
          at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)
          at System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame)
          at System.Windows.Threading.Dispatcher.Run()
          at System.Windows.Application.RunDispatcher(Object ignore)
          at System.Windows.Application.RunInternal(Window window)
          at System.Windows.Application.Run(Window window)
          at System.Windows.Application.Run()
          at GesturesViewer.App.Main() in C:\Users\klein\Downloads\Kinect.Toolkit\Kinect.Toolkit\GesturesViewer\obj\x86\Debug\App.g.cs:line 0
          at System.AppDomain._nExecuteAssembly(RuntimeAssembly assembly, String[] args)
          at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args)
          at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
          at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
          at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean ignoreSyncCtx)
          at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
          at System.Threading.ThreadHelper.ThreadStart()

     InnerException:

  14. The error on Circle_Click is expected, as the toolbox failed to initialize completely.

    I just tried on my x64 computer and everything worked well :(

    I only had to set the target to x86

  15. Can you tell me on which line you get the first error?

  16. Mike says:

    It's line 23 in MainWindow.Gestures.cs. I seem to recall someone having issues related to DirectX. I'll try (re)installing that on this machine next…

  17. Is this the first error you get?

  18. Mike says:

    Other than the error about INuiInstanceHelper (which is just a pop-up without a line number), it's the first error.

  19. Mike says:

    I just tried installing the DirectX SDK and runtime, as well as all Windows Updates on the 32-bit computer… No dice. :(

  20. Mike says:

    I believe I've discovered the solution to the INuiInstanceHelper problem… I just deleted the files in the Data directory and everything's working now… I guess it makes sense that it was looking for an old version of the assemblies when it tried to restore the serialized data…

    However, I still get a crash now when I click "Stop Recording", after having successfully started by clicking "Capture Circle":

    System.ArgumentOutOfRangeException was unhandled

     Message=Index was out of range. Must be non-negative and less than the size of the collection.

    Parameter name: index

     Source=mscorlib

     ParamName=index

     StackTrace:
          at System.ThrowHelper.ThrowArgumentOutOfRangeException()
          at System.Collections.Generic.List`1.get_Item(Int32 index)
          at Kinect.Toolbox.Gestures.Learning_Machine.GoldenSection.ProjectListToDefinedCount(List`1 positions, Int32 n) in C:\Users\klein\Downloads\Kinect\kinecttoolbox-91786\Learning Machine\GoldenSection.cs:line 52
          at Kinect.Toolbox.Gestures.Learning_Machine.GoldenSection.Pack(List`1 positions, Int32 samplesCount) in C:\Users\klein\Downloads\Kinect\kinecttoolbox-91786\Learning Machine\GoldenSection.cs:line 109
          at Kinect.Toolbox.RecordedPath.CloseAndPrepare() in C:\Users\klein\Downloads\Kinect\kinecttoolbox-91786\Learning Machine\RecordedPath.cs:line 78
          at Kinect.Toolbox.LearningMachine.AddPath(RecordedPath path) in C:\Users\klein\Downloads\Kinect\kinecttoolbox-91786\Learning Machine\LearningMachine.cs:line 46
          at Kinect.Toolbox.TemplatedGestureDetector.EndRecordTemplate() in C:\Users\klein\Downloads\Kinect\kinecttoolbox-91786\Gestures\TemplatedGestureDetector.cs:line 56
          at GesturesViewer.MainWindow.recordCircle_Click(Object sender, RoutedEventArgs e) in C:\Users\klein\Downloads\Kinect\kinecttoolbox-91786\GesturesViewer\MainWindow.Gestures.cs:line 25
          at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
          at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
          at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
          at System.Windows.UIElement.RaiseEvent(RoutedEventArgs e)
          at System.Windows.Controls.Primitives.ButtonBase.OnClick()
          at System.Windows.Controls.Button.OnClick()
          at System.Windows.Controls.Primitives.ButtonBase.OnMouseLeftButtonUp(MouseButtonEventArgs e)
          at System.Windows.UIElement.OnMouseLeftButtonUpThunk(Object sender, MouseButtonEventArgs e)
          at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
          at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
          at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
          at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
          at System.Windows.UIElement.ReRaiseEventAs(DependencyObject sender, RoutedEventArgs args, RoutedEvent newEvent)
          at System.Windows.UIElement.OnMouseUpThunk(Object sender, MouseButtonEventArgs e)
          at System.Windows.Input.MouseButtonEventArgs.InvokeEventHandler(Delegate genericHandler, Object genericTarget)
          at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
          at System.Windows.RoutedEventHandlerInfo.InvokeHandler(Object target, RoutedEventArgs routedEventArgs)
          at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
          at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
          at System.Windows.UIElement.RaiseTrustedEvent(RoutedEventArgs args)
          at System.Windows.UIElement.RaiseEvent(RoutedEventArgs args, Boolean trusted)
          at System.Windows.Input.InputManager.ProcessStagingArea()
          at System.Windows.Input.InputManager.ProcessInput(InputEventArgs input)
          at System.Windows.Input.InputProviderSite.ReportInput(InputReport inputReport)
          at System.Windows.Interop.HwndMouseInputProvider.ReportInput(IntPtr hwnd, InputMode mode, Int32 timestamp, RawMouseActions actions, Int32 x, Int32 y, Int32 wheel)
          at System.Windows.Interop.HwndMouseInputProvider.FilterMessage(IntPtr hwnd, WindowMessage msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
          at System.Windows.Interop.HwndSource.InputFilterMessage(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
          at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
          at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)
          at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)
          at MS.Internal.Threading.ExceptionFilterHelper.TryCatchWhen(Object source, Delegate method, Object args, Int32 numArgs, Delegate catchHandler)
          at System.Windows.Threading.Dispatcher.InvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)
          at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)
          at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)
          at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)
          at System.Windows.Threading.Dispatcher.PushFrame(DispatcherFrame frame)
          at System.Windows.Application.RunDispatcher(Object ignore)
          at System.Windows.Application.RunInternal(Window window)
          at System.Windows.Application.Run(Window window)
          at System.Windows.Application.Run()
          at GesturesViewer.App.Main() in C:\Users\klein\Downloads\Kinect\kinecttoolbox-91786\GesturesViewer\obj\x86\Debug\App.g.cs:line 0
          at System.AppDomain._nExecuteAssembly(RuntimeAssembly assembly, String[] args)
          at System.AppDomain.ExecuteAssembly(String assemblyFile, Evidence assemblySecurity, String[] args)
          at Microsoft.VisualStudio.HostingProcess.HostProc.RunUsersAssembly()
          at System.Threading.ThreadHelper.ThreadStart_Context(Object state)
          at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state, Boolean ignoreSyncCtx)
          at System.Threading.ExecutionContext.Run(ExecutionContext executionContext, ContextCallback callback, Object state)
          at System.Threading.ThreadHelper.ThreadStart()

     InnerException:

  21. David Barbey says:

    For your info, even though I had installed all the prerequisite software, I was receiving error messages when launching your GesturesViewer app. The error I was getting is the following: "Unable to find assembly 'Kinect.Toolkit, Version=1.0.1.0 …". The application window would still launch, but I would then get NullReferenceExceptions when starting to move in front of the Kinect.

    I checked the AssemblyInfo.cs of the Kinect.Toolbox project and found the following: assembly: AssemblyVersion("1.1.0.1"), which differed from the version in the error message. I figured the error was most likely coming from something created with an earlier version of the assembly, and after reading Mike's post above, I got an idea and decided to remove all Speech-related code from the projects, and that solved my problem.

    Hope this can help others.

    By the way. Thanks David C. for the great work !

  22. David Barbey says:

    sorry.. correction.. I did not remove all the Speech-related code, but rather the recording files (.save) in the various "data" folders.

  23. Yes, I think some old references lie in the .save files :)

  24. David Barbey says:

    Hi David,

    I looked around but couldn't find the answer to the following question: Is it possible to use the Microsoft Speech Platform SDK in French ? Meaning I would like to talk to my kinect in French. Is it possible ? I googled SR_MS_fr-FR_Kinect_10.0 without success.

    Thanks for your time.

    Regards, David

  25. irfan says:

    Hi

    I have downloaded this Kinect Toolbox and will be using it with my Kinect for Windows SDK beta 2.

    I'm trying to familiarize myself with the code. I have downloaded the sources, but where can I get the samples?

    I need to see some samples to learn.

    Please help

    Thanks

    Irfan

  26. On the kinecttoolbox.codeplex.com site, you can download a sample called Gestures Viewer, available under the samples link:

    kinecttoolbox.codeplex.com/…/76297

  27. Xavier says:

    Hi David,

    I am trying to implement new gestures using two hands. You said in the previous article that "you just have to plot the positions of each hand in the template."

    Could you tell me how you would do that more precisely please?

    Thank you

  28. You can instantiate two TemplatedGestureDetectors and synchronize them to detect combined gestures.
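
    A rough sketch of that synchronization, assuming one detector per hand. The 500 ms window, the class, and the event wiring below are illustrative, not part of the toolbox:

    ```csharp
    // Raises a combined gesture only when the left-hand and right-hand
    // detectors fire within a short time window of each other.
    class CombinedGestureWatcher
    {
        static readonly TimeSpan Window = TimeSpan.FromMilliseconds(500); // illustrative tolerance
        DateTime lastLeft = DateTime.MinValue;
        DateTime lastRight = DateTime.MinValue;

        public event Action CombinedGestureDetected;

        public void OnLeftDetected(string gesture)  { lastLeft = DateTime.Now;  Check(); }
        public void OnRightDetected(string gesture) { lastRight = DateTime.Now; Check(); }

        void Check()
        {
            TimeSpan delta = lastLeft > lastRight ? lastLeft - lastRight : lastRight - lastLeft;
            if (delta < Window && CombinedGestureDetected != null)
            {
                // Reset so one pair of detections raises exactly one combined event.
                lastLeft = lastRight = DateTime.MinValue;
                CombinedGestureDetected();
            }
        }
    }
    ```

    Each detector's gesture-detected event would then be wired to OnLeftDetected and OnRightDetected respectively.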

  29. Hi David,

    I actually don't understand what to do.

    Here is what I have done: I have built a piece of software for the Kinect; when the program starts, the mouse cursor of the desktop is in my control.

    But what I want to do is implement your "T" posture, so when the T is recognized the left hand should be able to use the program. Basically that is done. What I don't understand is how to combine your code with my code:

        private void Window_Loaded(object sender, RoutedEventArgs e)
        {
            try
            {
                nui.Initialize(RuntimeOptions.UseSkeletalTracking);
            }
            catch (Exception ex)
            {
                MessageBox.Show("Could not initialize Kinect device: " + ex.Message);
            }

            #region Transformsmooth
            nui.SkeletonEngine.TransformSmooth = true;
            var parameters = new TransformSmoothParameters
            {
                Smoothing = 0.10f,
                Correction = 0.90f,
                Prediction = 0.10f,
                JitterRadius = 1.00f,
                MaxDeviationRadius = 0.5f
            };
            nui.SkeletonEngine.SmoothParameters = parameters;
            #endregion

            // event to receive data
            nui.SkeletonFrameReady += new EventHandler<SkeletonFrameReadyEventArgs>(nui_SkeletonFrameReady);
        }

        void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
        {
            SkeletonFrame allSkeletons = e.SkeletonFrame;
            foreach (SkeletonData s in allSkeletons.Skeletons)
            {
                if (s.TrackingState == SkeletonTrackingState.Tracked)
                {
                    foreach (Joint joint in s.Joints)
                    {
                        Joint scaledHandRight = s.Joints[JointID.HandRight].ScaleTo((int)SystemParameters.PrimaryScreenWidth, (int)SystemParameters.PrimaryScreenHeight, SkeletonMaxX, SkeletonMaxY);
                        Joint scaledHandLeft = s.Joints[JointID.HandLeft].ScaleTo((int)SystemParameters.PrimaryScreenWidth, (int)SystemParameters.PrimaryScreenHeight, SkeletonMaxX, SkeletonMaxY);

                        int xCoordinate = (int)scaledHandRight.Position.X;
                        int yCoordinate = (int)scaledHandRight.Position.Y;
                        bool clicked;

                        MouseHook.MoveMouse(new Point(scaledHandRight.Position.X, scaledHandRight.Position.Y));
                    }
                }
            }
        }

    I hope you can help me out.

    Thanks

  30. I don't understand your problem.

    Have you tried using the Gestures Viewer sample? (on the CodePlex site)

  31. When I start my application, the skeletal tracker tracks my left and right hands, and the left hand can automatically move the mouse cursor.

    I would like the program to start like this:

            The Kinect sees the people in front of the camera, but it selects the right hand of the person who is waving or making the "T" posture.

            Then, after the Kinect sees the person who waved, that person can move the mouse cursor and navigate in the application.

    I have tried the Gestures Viewer, but I don't understand how to build it.

  32. It is a standard project for Visual Studio 2010.

    Do you get an error message during compilation?

  33. No, I don't get an error during compilation!

    But do you understand what I mean?

    If you could guide me through this, it would be great.

    I am a student doing my internship; the project is almost finished,

    and these are the only things left to add.

  34. I'm so sorry but I don't understand what you need :(

  35. I need a way to be the only one being tracked and to move the cursor of the PC.

    This means that if there are 3 people in front of the Kinect, the Kinect should only select the person who makes a gesture, like waving at the Kinect.

    When this person waves, he can control the PC with his hand. The control part I have already built in; I just want

    to add a gesture, so the Kinect can select one person (the person who makes the wave gesture).

  36. You can use the TrackingID of the skeleton to reference only one skeleton. When you browse the active skeletons, you then only use the one with the correct TrackingID.
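
    In code, that might look like this (a sketch based on the beta SDK types used earlier in this thread; activeTrackingID, waveDetector, and MoveCursorWith are illustrative names, and the detector's Add call follows the usage shown in the previous article):

    ```csharp
    int? activeTrackingID; // set once a user performs the activation gesture

    void nui_SkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        foreach (SkeletonData s in e.SkeletonFrame.Skeletons)
        {
            if (s.TrackingState != SkeletonTrackingState.Tracked)
                continue;

            if (activeTrackingID == null)
            {
                // No user selected yet: feed every skeleton to the wave detector.
                // In its gesture-detected handler you would store s.TrackingID
                // into activeTrackingID to lock onto that person.
                waveDetector.Add(s.Joints[JointID.HandRight].Position, nui.SkeletonEngine);
            }
            else if (s.TrackingID == activeTrackingID.Value)
            {
                // Only the selected skeleton drives the mouse cursor.
                MoveCursorWith(s); // your existing mouse-control code
            }
        }
    }
    ```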

  37. But how do I select the right person with the tracking ID? I mean, there can be 4 people in front of the camera, so which one should it track?

    And how do I import a gesture like waving, or like HelloRight or HelloLeft, into your Gestures Viewer program?

    When this gesture is tracked, it should use this person as the skeleton.

  38. You have to track them all at the start and select the first one who performs a gesture.

    Importing a gesture like waving requires you to record it using the record button in the Gestures Viewer program.

  39. After I record that, I have to put it in my program.

    Then I need to import your toolkit DLLs in VS2010.

    What do I do next? Which methods do I need?

  40. You can have a look at the code of Gestures Viewer: everything you need is inside.

  41. David, I looked; I don't understand where to begin.

    If you could help me out with little snippets or something.

    I don't know how to use the TrackingID for the first person:

      foreach (SkeletonData s in allSkeletons.Skeletons)
      {
          if (s.TrackingState == SkeletonTrackingState.Tracked)
          {
              if (s.TrackingID == 0)
              {
                  foreach (Joint joint in s.Joints)
                  {

    In your Gestures Viewer, you use a canvas where you draw the gesture. Is it necessary to draw the gesture on the canvas, or can I do it behind the application, so it is not visible what is happening?

    Is it possible to email you directly?

    Thanks

  42. Hi friend, I need to ask you: how can I capture the gestures? Because I actually don't understand it.

    If I make a gesture, and the red dots are being drawn for me on the canvas, and I press Record, then what is recorded? Nothing, actually?

    I would like to import the whole alphabet; that's why. Could you explain? Thanks :)

  43. After clicking on record, you must click on stop when the gesture is complete; the LearningMachine will then add the new template.
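
    In code, the record/stop cycle looks roughly like this (a sketch: EndRecordTemplate is the method visible in the stack traces earlier in this thread, and StartRecordTemplate is assumed to be its counterpart in the toolbox sources):

    ```csharp
    // Begin recording: subsequent hand positions are captured as a path.
    circleGestureDetector.StartRecordTemplate();

    // ...perform the gesture in front of the sensor...

    // Stop recording: the captured path is packed and added to the
    // LearningMachine as a new template.
    circleGestureDetector.EndRecordTemplate();
    ```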

  44. xingguang says:

    Hi everyone, I want to connect the Kinect to a Nao robot, but I do not know how to do it. If anybody knows, please contact me. My email is chengzongxing@qq.com.

  45. Abhilash says:

    Amazing contribution, David!

    Tons of Thanks :)