Both halves of the Nielsen Norman Group have had a swipe (sic) at gestural user interfaces: Nielsen in his Alertbox and Norman in his column in Interactions and on his web site.
Norman's main concern is that beyond a core set of well-understood gestures (move up/down, move forward/back, shrink/enlarge, and shake to change), gestures are arbitrary and have to be learned. Many of these more arbitrary gestures have no obvious converse, and so can leave users stranded when they make a mistake (although, to be fair to the Android interfaces he criticises, there's always the 'back' key). Norman identifies the need for clear graphical correlates on the display if gestures are to be discoverable and usable. I think he's right. So no display 'real estate' efficiencies from gestures, then.
09 June 2010