The hands-on revolution

The mouse and keyboard are getting old - but new touch-sensitive screens could give us a whole new way to work with computers.

It's more than two decades since the last computer interface revolution - the mouse - and now the next one is just around the corner. Slightly less than a year ago, Jeff Han, a researcher in the computer science department at New York University, stood up in Monterey at the annual TED (Technology, Entertainment, Design) conference, a showcase for interesting new products and ideas, and began demonstrating his new big idea.

Standing in front of a 36" by 27" screen tilted towards him, he showed how he could operate a computer using just his hands. The crowd whooped. "I think this is going to change the way we interact with computers," Han said. (The 10-minute video can be seen online here.)

Future pointers


Then last week Steve Jobs, chief executive of Apple, stood in front of an adoring crowd and showed off the iPhone. "Are we going to use a stylus?" asked Jobs rhetorically. "No, we're not. We're going to use the best pointing device in our world. We're born with 10 of them: our fingers."

He demonstrated how he could, with a finger, flick through album art in its iPod capability, and then view a photo, and by pinching his fingers together on the screen make it smaller, or by drawing them apart enlarge it. (The iPhone demonstration can be seen via apple.com/iphone.)
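The pinch gesture Jobs demonstrated comes down to simple arithmetic: the image is scaled by the ratio of the current distance between the two touch points to their distance when the gesture began. A minimal sketch of that idea (the function name is illustrative, not Apple's API):

```python
import math

def pinch_scale(start_touches, current_touches):
    """Return the zoom factor implied by a two-finger pinch.

    Each argument is a pair of (x, y) touch points; the factor is the
    ratio of the current finger separation to the starting separation.
    """
    (ax, ay), (bx, by) = start_touches
    start_dist = math.hypot(bx - ax, by - ay)
    (ax, ay), (bx, by) = current_touches
    current_dist = math.hypot(bx - ax, by - ay)
    return current_dist / start_dist

# Fingers drawn apart to twice their starting separation -> 2x zoom
factor = pinch_scale([(100, 100), (200, 100)], [(50, 100), (250, 100)])
```

Drawing the fingers apart gives a factor above 1 (enlarge); pinching them together gives a factor below 1 (shrink).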

Both Han and Jobs were showing off "multitouch" screens. But what makes those two demonstrations, 11 months apart, so remarkable is that they use two different technologies, on two screens of very different sizes, to equally dramatic effect. The upshot: this could be the start of a revolution in how we interact with computers.

Jobs claims that Apple's "Multitouch" is more accurate than any standard touch display, that it ignores unintended touches and particularly lets you use more than one finger. "And boy, have we patented it," he added.

Following Jobs's speech, Han said: "The iPhone is absolutely gorgeous, and I've always said, if there ever were a company to bring this kind of technology to the consumer market, it's Apple."

Bruce Tognazzini, one of the original Apple team who is now a user interface expert, says the idea isn't new. "William [Bill] Buxton was pushing multi-hand input back in the 1980s," he notes on his blog (asktog.com).
"Several researchers were experimenting with gestural interfaces in [the] 1990s, myself included. I was reminded of this only minutes after Steve's speech when my partner, Jakob Nielsen, called me to say, 'Jobs just announced your pinch interface!' ..." (Tognazzini enunciated the idea of "pinching" onscreen elements in a book in 1996.)

But, he adds: "What's important [about the iPhone] is that, for the first time, so many great ideas and processes have been assembled in one device and made accessible to normal human beings. That's the genius of Steve Jobs; that's the genius of Apple."

And it's certainly time for new ways to talk to your computer. When I spoke to Tognazzini in 2000, he remarked that the interface hadn't moved on since 1984: "You only have a single way to speak to the computer at present, which is the mouse click," he said. But he'd seen work at the University of Maryland: "You could use a gestural interface, so that to copy you would put two fingers on the object on the screen - a screen that would be on your desk, because it's tiring to hold your hands up to a vertical screen all day - and then pull them apart ... now, that would be a big improvement. And you could mix those with gestures and voice recognition." It sounds just like multi-touch systems.

But revolutions, especially in the computer field, can't happen until all the pieces are in place. The mouse was the first interface revolution, but had to wait for computers to catch up.

Replacing the proxy finger of the mouse with the real thing, or at least a handheld stylus, seemed like the GUI's obvious next development. That was the aim of Apple's Newton - a handheld computer launched in 1993 that, it was claimed, could recognise handwriting. (It could, but not necessarily the writer's.) The Palm interface, with its stylised "Graffiti" input via a stylus, initially did well but did not take over the world; people stuck with buttons on telephones.

Nor have styluses worked for bigger computers; despite Bill Gates's forecast on the launch of the tablet PC in 2001 that "within five years it will be the most popular form of PC sold in America", tablet PCs make up only about 1% of the market.

Multi-touch systems could be the real revolution, though, by letting us do what we're good at - working with both our hands. For example, to open a folder and then sort the photos or documents in it by date or name, you could do it the long way, using the mouse, pointing, clicking, pointing, clicking. But with a two-handed screen-driven system, you'd tap on the folder and sort the items directly by hand. The closest we have at present is Opera's "mouse gestures" (also available in Firefox via an extension), which let the user trigger actions with predefined mouse movements (such as closing a window by inscribing a circle). But it's some way short of the potential of multi-touch screens.
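Mouse gestures of the Opera kind are typically recognised by quantising the pointer's path into a string of compass strokes and looking that string up in a table of actions. A rough sketch of the idea (the gesture table and threshold are illustrative, not Opera's actual implementation):

```python
def encode_gesture(points, min_move=10):
    """Quantise a pointer path into stroke letters: U, D, L, R."""
    strokes = []
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        dx, dy = x1 - x0, y1 - y0
        if abs(dx) < min_move and abs(dy) < min_move:
            continue  # ignore small jitter
        if abs(dx) > abs(dy):
            stroke = "R" if dx > 0 else "L"
        else:
            stroke = "D" if dy > 0 else "U"
        if not strokes or strokes[-1] != stroke:
            strokes.append(stroke)  # collapse repeated strokes
    return "".join(strokes)

# Illustrative mapping of stroke strings to actions
GESTURES = {"DR": "close window", "L": "back", "R": "forward"}

path = [(0, 0), (0, 40), (40, 40)]            # drag down, then right
action = GESTURES.get(encode_gesture(path))   # "close window"
```

The same stroke-matching scheme extends naturally to touch input, where each finger contributes its own path.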

Apple's technology comes from a company called Fingerworks, which had been making multi-touch sensitive pads. Han's group has released engineering drawings of their system, which uses a semi-transparent screen with the computer output projected on to it from below. The position of a touch on the screen - to an accuracy of better than 2.5mm - is detected by its effect on the screen's reflectivity, measured by light bounced off its underside. But will it stand up to intensive use? Yann LeCun, the professor overseeing Han's work at NYU, says: "Jeff's technology is actually pretty robust. The sensing technology is optical, hence the sensor is not subject to wear and tear. The surface material on the screen is cheap and easily replaceable.
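In essence, Han's sensor is a camera watching the underside of the screen: a finger pressing the surface scatters the light travelling inside it, showing up as a bright blob whose centre gives the touch position. A toy version of that blob-centroid step on a greyscale frame (the threshold and the tiny 4x4 "frame" are made up for illustration; a real system processes full camera images):

```python
def find_touches(frame, threshold=200):
    """Return centroids of bright pixel clusters in a 2D greyscale frame.

    A finger on the screen scatters light into the camera below, so each
    touch appears as a connected cluster of pixels above `threshold`.
    """
    bright = {(r, c) for r, row in enumerate(frame)
              for c, v in enumerate(row) if v >= threshold}
    touches = []
    while bright:
        # flood-fill one connected cluster of bright pixels
        stack, cluster = [bright.pop()], []
        while stack:
            r, c = stack.pop()
            cluster.append((r, c))
            for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if nb in bright:
                    bright.remove(nb)
                    stack.append(nb)
        rows = [p[0] for p in cluster]
        cols = [p[1] for p in cluster]
        touches.append((sum(rows) / len(cluster), sum(cols) / len(cluster)))
    return touches

frame = [[0,   0,   0,   0],
         [0, 250, 255,   0],
         [0, 240, 245,   0],
         [0,   0,   0,   0]]
# One touch, centred between rows 1-2 and columns 1-2
```

Averaging over a whole cluster of pixels is what gives sub-pixel accuracy - better than any single camera pixel could resolve on its own.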

"The only thing that the touchscreen is not particularly good for is typing text," says LeCun. "For everything else, multi-touch is more convenient and more natural than a mouse or a stylus (including reading text). Also, using your hand and fingers is nice with a large screen, but not so nice for small screens. Jeff's technology is geared towards large-screen formats."

Embrace the waves

But isn't it tiring standing up waving your hands around? "It can be," LeCun agrees. He adds: "The idea of multi-touch interaction has been around for a long time. It's a pretty natural idea, really. The real question was a technological one: what sensing/display technology could make multi-touch practical, durable and cost-effective? Jeff's technology is the first one that demonstrates the practicality of large multi-user/multi-touch sensors, with essentially no limit in size."

Of course, rather as it took a while for software developers to respond to the eagerness of users to embrace a GUI, the success of the two-handed or two-fingered screen will depend on how quickly applications appear that make the most of it. Those struggling with the ever-multiplying cascade of torn-off menus that programs such as Adobe Photoshop offer might cheer. The only question now is how quickly it will happen. Perhaps all the tablet needs to be a success is to lose its stylus - and grasp the future with both hands.









Charles Arthur
Thursday January 18, 2007
Guardian Unlimited
Guardian News and Media Limited 2007
 