October 22, 2008

Microsoft’s SideSight: multi-touch without touching

SideSight is described in a paper that Microsoft researchers are presenting at the User Interface Software and Technology conference this week. The paper, titled “SideSight: Multi-“touch” Interaction Around Small Devices,” is authored by Alex Butler, Shahram Izadi, and Steve Hodges, all with Microsoft Research UK. Touch was a revolutionary concept when it debuted with the iPhone, in part because it was implemented so well through gestures: pinching, sliding, and tapping the iPhone and iPod touch all directly affect the interface. SideSight removes “touch” from the device itself and makes it a function of the paper, tabletop, or even the air next to the device. What does this mean? According to Microsoft, it opens up the possibility of building “touch” functions into tiny devices that don’t actually need a touchscreen.

“Despite the flexibility of touchscreens, using such an input mode carries a number of tradeoffs,” the paper’s authors wrote. “For many mobile devices, e.g. wristwatches and music players, a touchscreen can be impractical because there simply isn’t enough screen real estate. With a continued trend for ever-smaller devices, this problem is being exacerbated. Even when a touch-screen is practical, interacting fingers will occlude parts of the display, covering up valuable screen pixels and making it harder to see the results of an interface action.”

So what can you actually do with SideSight? Quite a bit, as it turns out. By twisting one’s hands appropriately on either side of the phone, on-screen objects can be rotated in place. Pages can be panned and scrolled by moving a hand up and down, and Microsoft also demonstrated that text could be entered and edited on the main screen with a stylus while the other hand scrolled the page, a movement akin to the motions a user’s hands would make when writing on a sheet of paper.
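A minimal sketch of how those two-sided interactions might be mapped to commands, assuming one tracked finger per side and per-frame vertical displacement as input. The function name, thresholds, and the exact pan/rotate mapping are illustrative assumptions, not details from the paper:

```python
def gesture_from_sides(left_dy, right_dy):
    """Map per-frame finger movement on each side of the device to a
    gesture. Assumed mapping: both fingers moving the same direction
    pans/scrolls; fingers moving in opposite directions (a twisting
    motion) rotates the on-screen object.

    left_dy / right_dy: signed vertical displacement (mm) of the finger
    sensed on each edge since the last frame.
    """
    if left_dy * right_dy > 0:
        # Same direction: pan by the average displacement.
        return ("pan", (left_dy + right_dy) / 2)
    if left_dy * right_dy < 0:
        # Opposite directions: rotate proportionally to the twist.
        return ("rotate", (left_dy - right_dy) / 2)
    return ("idle", 0)
```

With this mapping, `gesture_from_sides(5, 5)` pans and `gesture_from_sides(5, -5)` rotates, matching the hand motions described above.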

A quick motion toward the device could also be interpreted as a “click,” according to Microsoft.
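One way to recognize such a “click” is to watch for a rapid drop in the finger’s sensed distance over a few frames. This sketch is an assumption about how that might work; the threshold and window values are invented for illustration:

```python
def detect_click(distances, drop_mm=40, window=3):
    """Return True if the finger approaches the device quickly enough
    to count as a click: its sensed distance falls by at least drop_mm
    within `window` consecutive frames.

    distances: per-frame finger distances in mm (most recent last).
    """
    for i in range(len(distances) - window):
        if distances[i] - distances[i + window] >= drop_mm:
            return True
    return False
```

A slow drift toward the device (`[80, 78, 76, 74, 72]`) is ignored, while a sudden lunge (`[80, 75, 70, 30, 25]`) registers as a click.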

The key is a row of tiny optical sensors that look “outside” the device. In a prototype built for the paper, the researchers took an HTC Touch mobile phone and augmented it with two linear arrays of discrete infrared (IR) proximity sensors, specifically ten Avago HSDL-9100-021 940 nm IR proximity sensors spaced 10 millimeters apart. Although only the sides of the phone were enhanced, the entire periphery of a device could include these sensors, the researchers said. The sensors can read inputs up to 10 centimeters away using reflected infrared light alone.
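To make the geometry concrete, here is a sketch of turning one frame of raw reflectance readings from such an array into per-position distance estimates. The sensor count, 10 mm pitch, and ~10 cm range come from the article; the linear intensity-to-distance model, the 10-bit reading scale, and all names are assumptions:

```python
NUM_SENSORS = 10      # sensors per side in the prototype
PITCH_MM = 10         # spacing between adjacent sensors
MAX_RANGE_MM = 100    # roughly 10 cm effective sensing range

def intensity_to_distance_mm(intensity, max_intensity=1023):
    """Map a raw reflectance reading to an approximate distance.

    Stronger reflection means a closer finger; a zero reading means
    nothing is in range. A real driver would calibrate each sensor.
    """
    if intensity <= 0:
        return None
    ratio = min(1.0, intensity / max_intensity)
    return (1.0 - ratio) * MAX_RANGE_MM

def scan_array(raw_readings):
    """Convert one frame of raw readings into (position_mm, distance_mm)
    pairs along the device edge."""
    return [(i * PITCH_MM, intensity_to_distance_mm(r))
            for i, r in enumerate(raw_readings)]
```

For example, a frame of `[0, 1023]` yields no target at position 0 mm and a finger touching the edge at position 10 mm.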

Individual fingers are sensed as a “blob” by the sensor array. One problem: users tend to drift one or more fingers into the area covered by the sensor field, the authors noted. Because they were unable to consistently determine which fingers were actively controlling the device and which were simply incidental, Microsoft decided to look for only a single finger and use that to control the phone.
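The blob grouping and single-finger policy described above can be sketched as follows. This is a plausible reconstruction under stated assumptions (threshold value, intensity-weighted centroid, "strongest blob wins"), not the paper's actual algorithm:

```python
def find_blobs(readings, threshold=100):
    """Group adjacent above-threshold sensors into blobs.

    Returns a list of (start_index, end_index) pairs, one per blob.
    """
    blobs = []
    start = None
    for i, r in enumerate(list(readings) + [0]):  # sentinel closes a trailing blob
        if r >= threshold and start is None:
            start = i
        elif r < threshold and start is not None:
            blobs.append((start, i - 1))
            start = None
    return blobs

def primary_finger(readings, threshold=100):
    """Single-finger policy: ignore incidental fingers and track only the
    strongest blob, returning its intensity-weighted centroid (in sensor
    index units), or None if nothing is in range."""
    blobs = find_blobs(readings, threshold)
    if not blobs:
        return None
    s, e = max(blobs, key=lambda b: sum(readings[b[0]:b[1] + 1]))
    total = sum(readings[s:e + 1])
    return sum(i * readings[i] for i in range(s, e + 1)) / total
```

Given `[0, 0, 200, 300, 0, 0, 150, 0, 0, 0]`, two blobs are found, the weaker one at index 6 is discarded as incidental, and the tracked finger lands at centroid 2.6.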

(The authors also noted that the sensors weren’t directly connected to the phone. Instead, they were connected via USB to a PC, which relayed data to the phone over Bluetooth. This roundabout connection reduced the effective sensing rate to 11 frames per second, a limitation of the test rig and not the sensors themselves.)

What does the future of SideSight look like? Improved power consumption, improved sensor range, and an enhanced prototype: “In the future we believe that it may be possible to print organic electronic versions of such sensors, and so we are also interested in exploring a SideSight configuration that has the entire casing covered in this type of proximity sensing material,” the Microsoft Research employees wrote.

via Gearlog

Posted in: Phones

About the Author:

Seasoned tech blogger. Host of the Tech Addicts podcast.
