Multitouch Part 3: Multitouch in managed code and WPF

After yesterday’s post on developing with gestures, there may have been wailing and gnashing of teeth that all of the code samples were C++.  Well, today we will discuss multitouch in managed code. 

Now, a little trip down memory lane…remember that Windows 7 was released before .NET 4.0.  So at the launch of Windows 7, multitouch resources were made available to developers that work with .NET 3.5.  Then, when .NET 4.0 was released (with WPF 4.0 as a part of it), much of that multitouch support was moved and baked directly into WPF.  As a result, working with multitouch can sometimes be a little confusing, since there are differences between the pre- and post-.NET 4.0 worlds.   

Pre-.NET 4.0

Since not everyone is on .NET 4.0 yet, I do want to cover what you can do with .NET 3.5.  There are some great resources at https://code.msdn.microsoft.com/WindowsTouch.  Specifically, check out the Windows 7 Multitouch .NET Interop Sample Library.  It provides full multitouch functionality for both managed WinForms and WPF 3.5 SP1, and it contains several reference demos, including detailed samples showcasing gesture support, manipulation, and inertia.

In the sample library, there is a class called Windows7.Multitouch.Handler.  This is an abstract base class for the derived classes TouchHandler and GestureHandler.  A form can have one handler, either a touch handler or a gesture handler.  The form needs to create the handler and register for its events. 

You can use the Factory class to create one of these handlers.  How you do this differs slightly depending on the type of touch support that you need: 

For Windows Forms, managed Win32 hWnd, and WPF gesture support, the handler wraps the Window (hWnd).  In the code snippet below (taken from the mtWPFGesture project in the Interop Sample Library), the constructor of the MainWindow first checks for multitouch support, calls Factory.CreateGestureHandler() to create a gesture handler, and then wires up handlers for all of the gesture events. 

private readonly Windows7.Multitouch.GestureHandler _gestureHandler;

public MainWindow()
{
    InitializeComponent();

    // Bail out if the machine has no multitouch-capable digitizer
    if (!Windows7.Multitouch.TouchHandler.DigitizerCapabilities.IsMultiTouchReady)
    {
        MessageBox.Show("Multitouch is not available");
        Environment.Exit(1);
    }

    // Wrap this window (its hWnd) in a gesture handler
    _gestureHandler = Factory.CreateGestureHandler(this);

    // Wire up handlers for each of the gesture events
    _gestureHandler.Pan += ProcessPan;
    _gestureHandler.PanBegin += ProcessPan;
    _gestureHandler.PanEnd += ProcessPan;

    _gestureHandler.Rotate += ProcessRotate;
    _gestureHandler.RotateBegin += ProcessRotate;
    _gestureHandler.RotateEnd += ProcessRotate;

    _gestureHandler.PressAndTap += ProcessRollOver;

    _gestureHandler.TwoFingerTap += ProcessTwoFingerTap;

    _gestureHandler.Zoom += ProcessZoom;
    _gestureHandler.ZoomBegin += ProcessZoom;
    _gestureHandler.ZoomEnd += ProcessZoom;
}

For WPF touch support, use the stylus events with the help of the Factory.EnableStylusEvents() method.  In the code snippet below (taken from the mtWPFScratchPad project in the Interop Sample Library), the constructor of the MainWindow first checks for multitouch support, enables touch events when the window is loaded, and wires up event handlers for the stylus events. 

public MainWindow()
{
    InitializeComponent();

    // Bail out if the machine has no multitouch-capable digitizer
    if (!Windows7.Multitouch.TouchHandler.DigitizerCapabilities.IsMultiTouchReady)
    {
        MessageBox.Show("Multitouch is not available");
        Environment.Exit(1);
    }

    // Once the window is loaded, route touch input through the stylus events
    Loaded += (s, e) => { Factory.EnableStylusEvents(this); };

    StylusDown += OnTouchDownHandler;
    StylusMove += OnTouchMoveHandler;
    StylusUp += OnTouchUpHandler;
}
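
The handlers themselves are ordinary stylus handlers.  Here is a minimal sketch of what they might look like; the handler names match the snippet above, but the stroke-drawing logic and the "canvas" element are illustrative assumptions on my part, not the actual ScratchPad code.  Each contact arrives as its own stylus device, and StylusDevice.Id lets you tell the fingers apart. 

private readonly Dictionary<int, Polyline> _strokes = new Dictionary<int, Polyline>();

private void OnTouchDownHandler(object sender, StylusDownEventArgs e)
{
    // Start a new stroke for this finger; StylusDevice.Id is unique per contact
    var stroke = new Polyline { Stroke = Brushes.Black, StrokeThickness = 2 };
    _strokes[e.StylusDevice.Id] = stroke;
    canvas.Children.Add(stroke);   // "canvas" is an assumed Canvas in the window's XAML
}

private void OnTouchMoveHandler(object sender, StylusEventArgs e)
{
    Polyline stroke;
    if (_strokes.TryGetValue(e.StylusDevice.Id, out stroke))
        stroke.Points.Add(e.GetPosition(canvas));   // extend the stroke that belongs to this finger
}

private void OnTouchUpHandler(object sender, StylusEventArgs e)
{
    _strokes.Remove(e.StylusDevice.Id);   // the finger lifted; stop tracking it
}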

 

.NET 4.0

With the release of .NET 4.0, more multitouch support was added directly into WPF. 

First of all, there were updates to UIElement, UIElement3D, and ContentElement to support multitouch.  These classes now support various touch events, such as TouchDown, TouchUp, TouchMove, TouchEnter, and TouchLeave.  To handle raw touch in WPF 4.0, you simply respond to these events. 
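
For example, here is a minimal sketch of raw touch handling.  The element name "canvas" and the handler names are my assumptions for illustration; the events would typically be wired up in XAML or in the constructor.  Each finger that touches down is captured, and each move paints a small dot at that finger's position. 

private void Canvas_TouchDown(object sender, TouchEventArgs e)
{
    canvas.CaptureTouch(e.TouchDevice);   // capture this particular finger
    e.Handled = true;
}

private void Canvas_TouchMove(object sender, TouchEventArgs e)
{
    if (e.TouchDevice.Captured == canvas)
    {
        TouchPoint point = e.GetTouchPoint(canvas);   // this finger's position, relative to the canvas
        var dot = new Ellipse { Width = 6, Height = 6, Fill = Brushes.SteelBlue };
        Canvas.SetLeft(dot, point.Position.X - 3);
        Canvas.SetTop(dot, point.Position.Y - 3);
        canvas.Children.Add(dot);
        e.Handled = true;
    }
}

private void Canvas_TouchUp(object sender, TouchEventArgs e)
{
    canvas.ReleaseTouchCapture(e.TouchDevice);
    e.Handled = true;
}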

Additionally, the UIElement in .NET 4.0 supports manipulation.  You must opt in to manipulation by setting IsManipulationEnabled to true.  Through an event handler on the ManipulationDelta event, you can appropriately scale, rotate, or translate the UIElement.  That effectively gives you the Zoom, Rotate, and Pan/Translate gestures, respectively. 
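
As a rough sketch, a ManipulationDelta handler might look like the following.  The element here (a hypothetical "box") is assumed to have IsManipulationEnabled="True" and a MatrixTransform as its RenderTransform; those details are my assumptions for this example. 

private void Box_ManipulationDelta(object sender, ManipulationDeltaEventArgs e)
{
    var element = (UIElement)sender;
    var transform = (MatrixTransform)element.RenderTransform;   // assumed to be a MatrixTransform
    Matrix matrix = transform.Matrix;

    ManipulationDelta delta = e.DeltaManipulation;
    Point center = e.ManipulationOrigin;   // relative to the manipulation container

    matrix.ScaleAt(delta.Scale.X, delta.Scale.Y, center.X, center.Y);   // Zoom
    matrix.RotateAt(delta.Rotation, center.X, center.Y);                // Rotate
    matrix.Translate(delta.Translation.X, delta.Translation.Y);         // Pan/Translate

    transform.Matrix = matrix;
    e.Handled = true;
}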

There is support for inertia as well.  Inertia adds basic physics to a manipulation, so the UIElement will slow gradually when flicked and look more natural.  To enhance your application with inertia, wire up an event handler to the ManipulationInertiaStarting event.  In the event handler, specify the inertial behaviors that you want; you can specify ExpansionBehavior, TranslationBehavior, and RotationBehavior.  For each behavior, set some initial values (such as InitialVelocity) and the desired effect (such as DesiredDeceleration). 
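
For example, a ManipulationInertiaStarting handler along these lines lets the element glide to a stop after a flick.  The deceleration values below are just illustrative choices, expressed in device-independent pixels (96 per inch) per millisecond squared. 

private void Box_ManipulationInertiaStarting(object sender, ManipulationInertiaStartingEventArgs e)
{
    // Keep moving at the velocity the fingers had when they lifted,
    // then decelerate at roughly 10 inches/second^2
    e.TranslationBehavior.InitialVelocity = e.InitialVelocities.LinearVelocity;
    e.TranslationBehavior.DesiredDeceleration = 10.0 * 96.0 / (1000.0 * 1000.0);

    // Let any resize (zoom) and rotation coast to a stop as well
    e.ExpansionBehavior.InitialVelocity = e.InitialVelocities.ExpansionVelocity;
    e.ExpansionBehavior.DesiredDeceleration = 0.1 * 96.0 / (1000.0 * 1000.0);

    e.RotationBehavior.InitialVelocity = e.InitialVelocities.AngularVelocity;
    e.RotationBehavior.DesiredDeceleration = 720.0 / (1000.0 * 1000.0);

    e.Handled = true;
}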

Next, multitouch support was added to a number of different controls.  For instance, ScrollViewer was updated to accept pan gestures, so you can “flick” a list with a scrollbar to scroll through it.  The base controls were updated to be multitouch-aware, and there is multi-capture support, so more than one finger can be captured and tracked simultaneously.  Finally, there are new multitouch-specific controls, such as the ScatterView (which was previously part of the Surface SDK). 
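
As a small example of the ScrollViewer change, touch panning can be enabled explicitly through the PanningMode property.  This is shown here in code-behind, assuming a ScrollViewer named "scrollViewer" (my name for illustration); it can just as easily be set in XAML. 

// Allow the user to pan the ScrollViewer's content with a finger, in both directions
scrollViewer.PanningMode = PanningMode.Both;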

Finally, WPF 4 is compatible with the Surface SDK 2.0. 

Stay tuned for tomorrow’s post, when I’ll discuss multitouch in Silverlight. 

Other blog posts in this series:

Multitouch Part 1: Getting Started with Multitouch in Windows 7

Multitouch Part 2: Support for Gestures in Windows 7

Multitouch Part 3: Multitouch in managed code and WPF

Multitouch Part 4: Multitouch in Silverlight

Multitouch Part 5: User Experience with Multitouch