(this article brought to you with help from Brandon, from whose emails I shamelessly plagiarized much of what follows)
Some folks have been wondering why the Game Studio 4.0 CTP supported touch input on Windows 7 as well as Windows Phone, but our more recent beta and RTM versions only support touch on Windows Phone.
Why on earth would we implement this feature but then take it away at the last minute?
For one thing, our Windows touch implementation had some irritating flaws. Touch was only supported on Windows 7, not Vista or XP, which would have made this our only feature to require Windows 7. And there were inconsistencies between the behavior of touch on Windows vs. the phone. For example, pressing and holding on Windows caused a right-click mouse promotion that we had no way to disable.
But these were not enough to justify removing the feature altogether. The real problem came when we added touch gesture support. Some history:
- We originally planned on just exposing raw touch point data, with no gesture support
- We figured games could implement their own gesture recognition over the top of this raw data
- Cue the start of our learning experience...
- It turns out that implementing good gesture recognizers is extremely difficult!
- It turns out that if everyone does this differently, you get a crappy user experience where apps never feel quite the same as the built-in system UI
- It turns out that touch panels (like keyboards and gamepads) are complex, quirky pieces of hardware, and their outputs require significant massaging
- It turns out that the best way to do this data massaging varies from one touch panel to another
- So for best results, gesture recognition needs to be done at a low level in the stack, exploiting driver-level knowledge of the hardware
- Application-level gesture recognition, layered on top of an abstract, hardware-independent API, fundamentally cannot produce such good results
When these facts became clear, we decided it was important to add built-in gesture recognition to Windows Phone, in order to provide high quality, hardware optimized, and consistent functionality for all applications.
So that's what we did, in between our CTP and beta releases.
But what about the Windows version of the XNA Framework? We had three options:
- Implement gesture recognition for Windows as well as the phone
- Do not support gesture recognition on Windows, but keep the existing lower level TouchCollection API
- Cut touch from Windows entirely
The first option simply did not fit the schedule. We would have had to cut other work to make time for it, but decided that other work was more important.
The second option seemed too confusing and half-baked to be a good idea. "Touch works on Windows, but only if you have Windows 7, and only if you avoid this part of the API which doesn't work at all". Yuck.
So we went with the third option. We hope to revisit this sometime in the future, but for now we felt it was better to ship nothing at all than something incomplete, inconsistent, and which we already knew we would have to revisit.
Where does this leave those who want to use touch in Windows XNA games? You can still use XNA for graphics, sound, etc., while reading touch input some other way:
- Easy: using this .NET interop sample library from the MSDN code gallery, call Factory.CreateHandler<TouchHandler>(GameWindow.Handle), and subscribe to the TouchHandler TouchDown, TouchUp, and TouchMove events
- Harder: hook the window procedure of GameWindow.Handle (marshal a delegate via Marshal.GetFunctionPointerForDelegate, then p/invoke the Win32 Get/SetWindowLongPtr APIs) and listen for WM_TOUCH messages
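The harder option could look something like this minimal sketch. The class and member names here are my own illustration, not a shipped API; the Win32 entry points (RegisterTouchWindow, SetWindowLongPtr, CallWindowProc, GetTouchInputInfo) are real, but error handling and the 32-bit SetWindowLong fallback are omitted for brevity:

```csharp
using System;
using System.Runtime.InteropServices;

// Hypothetical sketch: subclass the game window so we see WM_TOUCH messages.
class TouchHook
{
    const int WM_TOUCH = 0x0240;
    const int GWLP_WNDPROC = -4;

    delegate IntPtr WndProc(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll")]
    static extern bool RegisterTouchWindow(IntPtr hWnd, uint flags);

    // Note: on 32-bit processes you would p/invoke SetWindowLong instead.
    [DllImport("user32.dll")]
    static extern IntPtr SetWindowLongPtr(IntPtr hWnd, int index, IntPtr newProc);

    [DllImport("user32.dll")]
    static extern IntPtr CallWindowProc(IntPtr prevProc, IntPtr hWnd, uint msg,
                                        IntPtr wParam, IntPtr lParam);

    [DllImport("user32.dll")]
    static extern bool GetTouchInputInfo(IntPtr hTouchInput, uint count,
                                         [Out] TOUCHINPUT[] inputs, int structSize);

    [DllImport("user32.dll")]
    static extern bool CloseTouchInputHandle(IntPtr hTouchInput);

    [StructLayout(LayoutKind.Sequential)]
    struct TOUCHINPUT
    {
        public int x, y;                            // hundredths of a pixel, screen space
        public IntPtr hSource;
        public uint dwID, dwFlags, dwMask, dwTime;
        public IntPtr dwExtraInfo;
        public uint cxContact, cyContact;
    }

    IntPtr prevWndProc;
    WndProc hookProc;  // field keeps the delegate alive so the GC can't collect it

    public void Attach(IntPtr windowHandle)  // pass your game's Window.Handle
    {
        RegisterTouchWindow(windowHandle, 0);
        hookProc = Hook;
        prevWndProc = SetWindowLongPtr(windowHandle, GWLP_WNDPROC,
            Marshal.GetFunctionPointerForDelegate(hookProc));
    }

    IntPtr Hook(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam)
    {
        if (msg == WM_TOUCH)
        {
            // The low word of wParam is the number of touch points in this message.
            int count = (int)(wParam.ToInt64() & 0xFFFF);
            var inputs = new TOUCHINPUT[count];

            if (GetTouchInputInfo(lParam, (uint)count, inputs,
                                  Marshal.SizeOf(typeof(TOUCHINPUT))))
            {
                foreach (var t in inputs)
                {
                    // t.x / 100, t.y / 100 are screen pixels; convert to client
                    // coordinates and feed into your own game input state here.
                }
                CloseTouchInputHandle(lParam);
            }
        }

        // Always forward everything else to the original window procedure.
        return CallWindowProc(prevWndProc, hWnd, msg, wParam, lParam);
    }
}
```

One design note: keeping the delegate in a field matters, because Marshal.GetFunctionPointerForDelegate does not root the delegate, and a collected delegate means a crash the next time Windows calls your window procedure.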
Note that we still ship the Microsoft.Xna.Framework.Touch assembly for Windows and Xbox, even though it is not referenced by the default project template. On these platforms it is just a stub implementation where TouchPanel.GetState always returns an empty collection, but this can still be useful for cross platform games, as it avoids the need for #ifdefs around your touch input code.
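In other words, input code like the following compiles and runs unchanged across platforms; where the assembly is a stub, the collection is simply always empty, so the loop body never executes:

```csharp
using Microsoft.Xna.Framework.Input.Touch;

// Inside your game's Update method:
TouchCollection touches = TouchPanel.GetState();

foreach (TouchLocation touch in touches)
{
    if (touch.State == TouchLocationState.Pressed)
    {
        // Respond to the new touch at touch.Position
    }
}
```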