DeathByVisualStudio wrote:

The only thing confused here is you, Andy.

  • Stylus: acts like a mouse today, high precision -- show the "x"
  • Kinect: low precision -- don't show the "x" (pretty disingenuous of you to throw this into your list of "concerns")
  • Mouse with touch: both high and low precision -- show the "x" if the switch list is brought up with the mouse; don't show it if the list is brought up by touch.
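
For concreteness, the behavior you're describing boils down to a decision like this minimal sketch (the InputKind enum and function name are mine for illustration, not any actual Windows API):

```cpp
#include <cassert>

// Illustrative input categories -- not a real Windows enum.
enum class InputKind { Mouse, Pen, Touch, Kinect };

// Decide whether the switch list shows the "x", given which
// device was used to bring the list up.
bool ShouldShowCloseButton(InputKind openedWith)
{
    switch (openedWith)
    {
    case InputKind::Mouse:  // high precision: show the "x"
    case InputKind::Pen:    // stylus acts like a mouse: show the "x"
        return true;
    default:                // touch, Kinect: low precision, hide the "x"
        return false;
    }
}

int main()
{
    assert(ShouldShowCloseButton(InputKind::Pen));
    assert(!ShouldShowCloseButton(InputKind::Touch));
}
```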

You see, you're missing the subtleties. Let's say I'm holding a tablet in my left hand and using the stylus in my right. I swipe from the corner to bring up the switch list and want to swap to an app in the list near where my left hand is resting. It's much, much easier to tap with my thumb than to contort my arms to do it with the stylus. Except now you've put these close buttons there, and we're back to accidental closures. And that's assuming a tablet-style, direct-to-display stylus, as opposed to a separate desktop one where there's a visual disconnect: you're now forcing the user to employ a degree of accuracy that is otherwise unnecessary just to switch tasks (the taskbar doesn't demand that, for example).

Similarly with the mouse plus touch: the whole point is to allow quick gestures that bring up system or app UI while your hand is still on the mouse, which you still might want to use to actually point and select. Except now you've made the close function you wanted to be more accessible disappear, so the user either has to give up the touch gestures or go back to the right-click menu that is apparently too difficult. More exotic devices like Kinect only increase the number of UI combinations, none of which are necessarily going to be used in isolation, any more than you'd use the mouse in isolation from the keyboard.

DeathByVisualStudio wrote:

The W8 desktop taskbar already differentiates between mouse, pen, and touch today. When touch is being used, it displays the "x" on all thumbnails. The only change required would be to avoid displaying the "x" when touch is in use -- you know, to avoid the accidental closures the Microsoft apologists are so worried about.
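
To be fair, the plumbing for this does exist: since Windows 8, a desktop app can ask which device generated a pointer event through the WM_POINTER messages. Roughly, as a compilable fragment (ShowCloseButtons is a hypothetical stand-in for the switcher updating its own UI):

```cpp
#define WINVER 0x0602        // the pointer API needs Windows 8 headers
#define _WIN32_WINNT 0x0602
#include <windows.h>

// Hypothetical stand-in for the switcher updating its own UI.
static void ShowCloseButtons(bool show) { (void)show; }

LRESULT CALLBACK SwitcherProc(HWND hwnd, UINT msg, WPARAM wp, LPARAM lp)
{
    if (msg == WM_POINTERDOWN)
    {
        POINTER_INPUT_TYPE type;
        if (GetPointerType(GET_POINTERID_WPARAM(wp), &type))
        {
            // Precise pointers (mouse, pen) get the "x"; touch doesn't,
            // to avoid accidental closures.
            ShowCloseButtons(type == PT_MOUSE || type == PT_PEN);
        }
        return 0;
    }
    return DefWindowProcW(hwnd, msg, wp, lp);
}
```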

But having the detection plumbing is not the same as having a good experience; the taskbar just doesn't handle touch well. Trying to fudge touch support onto a UI designed wholly around being mouse-driven is inevitably going to fail. That's a big part of why Windows 7 touch devices aren't ubiquitous already. It's why Steven Sinofsky said on the Building Windows 8 blog all those months ago: "Going back to even the first public demonstrations of Windows 7, we worked hard on touch, but our approach to implementing touch as just an adjunct to existing Windows desktop software didn't work very well. Adding touch on top of UI paradigms designed for mouse and keyboard held the experience back."

DeathByVisualStudio wrote:

And let's not forget that the W8 desktop taskbar can be oriented vertically on the left or right side of the screen...

I have absolutely no idea what point you're trying to make with that.