Touch input on Windows in XNA Game Studio 4.0

(this article brought to you with help from Brandon, from whose emails I shamelessly plagiarized much of what follows)

Some folks have been wondering why the Game Studio 4.0 CTP supported touch input on Windows 7 as well as Windows Phone, but our more recent beta and RTM versions only support touch on Windows Phone.

Why on earth would we implement this feature but then take it away at the last minute?

For one thing, our Windows touch implementation had some irritating flaws. Touch was only supported on Windows 7, not Vista or XP, which would have made this our only feature requiring Windows 7. And there were inconsistencies between the behavior of touch on Windows vs. the phone. For example, pressing and holding on Windows caused a right-click mouse promotion that we had no way to disable.

But these were not enough to justify removing the feature altogether. The real problem came when we added touch gesture support. Some history:

  • We originally planned on just exposing raw touch point data, with no gesture support
  • We figured games could implement their own gesture recognition over the top of this raw data
  • Cue the start of our learning experience...
  • It turns out that implementing good gesture recognizers is extremely difficult!
  • It turns out that if everyone does this differently, you get a crappy user experience where apps never feel quite the same as the built-in system UI
  • It turns out that touch panels (like keyboards and gamepads) are complex, quirky pieces of hardware, and their outputs require significant massaging
  • It turns out that the best way to do this data massaging varies from one touch panel to another
  • So for best results, gesture recognition needs to be done at a low level in the stack, exploiting driver level knowledge of the hardware
  • Application level gesture recognition, layered on top of an abstract, hardware independent API, fundamentally cannot produce such good results

When these facts became clear, we decided it was important to add built-in gesture recognition to Windows Phone, in order to provide high quality, hardware optimized, and consistent functionality for all applications.

So that's what we did, in between our CTP and beta releases.

But what about the Windows version of the XNA Framework? We had three options:

  1. Implement gesture recognition for Windows as well as the phone
  2. Do not support gesture recognition on Windows, but keep the existing lower level TouchCollection API
  3. Cut touch from Windows entirely

The first option simply did not fit the schedule. We would have had to cut other work to make time for it, but decided that other work was more important.

The second option seemed too confusing and half-baked to be a good idea. "Touch works on Windows, but only if you have Windows 7, and only if you avoid this part of the API which doesn't work at all".  Yuck.

So we went with the third option. We hope to revisit this sometime in the future, but for now we felt it was better to ship nothing at all than something incomplete, inconsistent, and which we already knew we would have to revisit.

Where does this leave those who want to use touch in Windows XNA games? You can still use XNA for graphics, sound, etc, while reading touch input some other way:

  • Easy: using this .NET interop sample library from the MSDN code gallery, call Factory.CreateHandler<TouchHandler>(GameWindow.Handle), and subscribe to the TouchHandler TouchDown, TouchUp, and TouchMove events

  • Harder: hook Windows messages for GameWindow.Handle (using GetFunctionPointerForDelegate, then p/invoke to the Win32 Get/SetWindowLongPtr API), and listen for WM_TOUCH messages
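The easy route might look something like this. This is a sketch only: it assumes the Windows7.Multitouch interop sample library mentioned above, and the event-args members used here (such as e.Location) are taken from that sample and may differ in your copy.

```csharp
// Sketch: read touch input via the Windows7.Multitouch interop sample
// library, alongside normal XNA rendering. Member names on the event
// args are assumptions from the sample library.
using Microsoft.Xna.Framework;
using Windows7.Multitouch;

public class TouchGame : Game
{
    TouchHandler touchHandler;

    protected override void Initialize()
    {
        base.Initialize();

        // Attach a touch handler to the XNA game window.
        touchHandler = Factory.CreateHandler<TouchHandler>(Window.Handle);

        touchHandler.TouchDown += (sender, e) =>
        {
            // e.Location is in window client coordinates.
            Vector2 position = new Vector2(e.Location.X, e.Location.Y);
            // ...feed position into your input handling...
        };
    }
}
```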
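The harder route, hooking WM_TOUCH yourself, could be sketched roughly as follows. The constants (WM_TOUCH = 0x0240, GWLP_WNDPROC = -4) come from the Windows SDK headers; note that SetWindowLongPtr only exists on 64-bit Windows (a 32-bit build would p/invoke SetWindowLong instead), and the hook delegate must be kept in a field so the garbage collector doesn't reclaim it while native code still holds the function pointer.

```csharp
using System;
using System.Runtime.InteropServices;

// Sketch: subclass the game window's wndproc to intercept WM_TOUCH.
static class TouchHook
{
    const int GWLP_WNDPROC = -4;
    const int WM_TOUCH = 0x0240;

    delegate IntPtr WndProc(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll")]
    static extern bool RegisterTouchWindow(IntPtr hWnd, uint flags);

    [DllImport("user32.dll", EntryPoint = "SetWindowLongPtr")]
    static extern IntPtr SetWindowLongPtr(IntPtr hWnd, int index, IntPtr newProc);

    [DllImport("user32.dll")]
    static extern IntPtr CallWindowProc(IntPtr prevProc, IntPtr hWnd,
                                        int msg, IntPtr wParam, IntPtr lParam);

    static WndProc hookProc;        // keep alive: native code holds this pointer
    static IntPtr previousWndProc;

    public static void Install(IntPtr gameWindowHandle)
    {
        // Ask Windows to deliver WM_TOUCH rather than gesture messages.
        RegisterTouchWindow(gameWindowHandle, 0);

        hookProc = Hook;
        previousWndProc = SetWindowLongPtr(
            gameWindowHandle, GWLP_WNDPROC,
            Marshal.GetFunctionPointerForDelegate(hookProc));
    }

    static IntPtr Hook(IntPtr hWnd, int msg, IntPtr wParam, IntPtr lParam)
    {
        if (msg == WM_TOUCH)
        {
            // Unpack the touch points with GetTouchInputInfo here,
            // then call CloseTouchInputHandle(lParam) when done.
        }

        // Pass everything else to the original window procedure.
        return CallWindowProc(previousWndProc, hWnd, msg, wParam, lParam);
    }
}
```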

Note that we still ship the Microsoft.Xna.Framework.Touch assembly for Windows and Xbox, although it is not referenced by the default project template and is just a stub implementation where TouchPanel.GetState always returns an empty collection. This can still be useful for cross platform games, as it avoids the need for #ifdefs around your touch input code.
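For instance, input code like this (a hypothetical snippet you might put in your Update method) compiles unchanged on Windows, Xbox, and the phone: on Windows and Xbox the stub returns an empty collection, so the loop body simply never runs there.

```csharp
using Microsoft.Xna.Framework.Input.Touch;

// Cross-platform touch polling with no #if guards: on platforms where
// touch is stubbed out, GetState() returns an empty collection.
TouchCollection touches = TouchPanel.GetState();

foreach (TouchLocation touch in touches)
{
    if (touch.State == TouchLocationState.Pressed)
    {
        // React to a new touch at touch.Position here.
    }
}
```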

Comments (14)

  1. SJ says:

    There goes a significant chunk of what I planned on doing for my Masters project. I'll look into the interop library I guess.

  2. Clive Goransson says:

    Good to see a post on this, cheers Shawn

  3. John Stricker says:

Thanks for the information, Shawn. It's disappointing to not have the touch features for free in Windows, but it really helps to understand why the decision was made.

  4. Mark LeMoine says:

    Makes sense as to why it was cut, but I was really looking forward to this feature in XNA 4. I won't be able to complete my game unless I have multitouch support (it's the main input source for the rhythm game I'm making), so I guess I can look into the interop library.

  5. ML says:

When using the interop library, is there any way to programmatically disable taps registering as left clicks and long taps registering as right clicks? I have left and right clicks in my game do different things than what the touch inputs do, so this is a bit of a dilemma. Thanks Shawn.

  6. ShawnHargreaves says:

It is not possible to disable the touch -> mouse input promotion behavior on Windows.  That's one of the problems we had with our Windows touch implementation, and we were unable to find any way to prevent this.

  7. ThatsGobbles says:

    Thanks Shawn, I just re-read the article from the beginning and saw that after you mentioned it. I guess I missed that section. I guess I'll have to figure out another solution then. 🙁

  8. Eric Wessen says:

Hey Shawn, are there any plans to allow programmatically disabling the automatic touch->mouse promotion in Windows, and updating the interop library to do so? The auto-promotion really screws up my game controls, to the point where I'm stuck.

  9. ShawnHargreaves says:

    Eric: I don't know any way to turn off that touch -> mouse promotion behavior on Windows.  Which means that Windows apps really need to use either mouse, or touch, but not try to read both at the same time.

  10. Stefano Baraldi says:

    Thanks for this very useful insight.

I am developing WPF + native DirectX multi-touch apps, and I have to say Windows 7 is really NOT a touch OS, at all.

In order to have a good, modern data-bound interface over a smoking-fast DirectX rendering window, I implemented my own flavour of window layers.

    All of this just to discover what was clearly stated in the Win7 documentation.

    "only one window at a time can receive WM_TOUCH messages" !!!!

    That's like friendly fire imho.

So… still waiting for Windows 8… while the iPad and OS X are way ahead…

    and i love Microsoft!



  11. crazy eltayeb says:

I used the easy solution, and I need to get the point where the touch happened so I can turn it into a Vector2 object and use it later.

  12. Joe says:

    Will this likely ever be fixed?  It would be good to know what the future plans are with XNA.

  13. Andrew says:

    @Joe, the future plan for XNA is MonoGame… which apparently implements touch on a lot of platforms including Windows, Mac, iOS and Android… iOS and Android appear to require a Xamarin framework (and hence payment)

  14. Steven says:

    The url is outdated now. 🙁
