The way people interact with their smartphones and tablets has always been crucial to their buying decisions. The user interface, or UI, of a device is a key driver and differentiator in the adoption of an ecosystem or a particular brand. Competition in the smartphone and tablet market has led to a flurry of devices with similar or identical specs, which is why the UI provides the much-needed distinction between brands.
Microsoft seems to have realised this and is working towards upping the ante in the user experience space with its new 'Pre-Touch Sensing' technology. The Redmond-based tech giant demonstrated the feature in a recent YouTube video. Honestly, it's one of the coolest things Microsoft has shown off in the UI space in a while.
Pre-Touch Sensing: The What?
As the name suggests, Pre-Touch Sensing is a way for a smartphone or tablet to anticipate a touch as fingers approach the device. It also gives a smartphone the ability to sense how users grip their devices and adapt interfaces on the go. Pre-Touch promises to make using a mobile phone easier and faster by bringing up intuitive menus even before the user starts interacting with the display.

In a research paper titled 'Pre-Touch Sensing for Mobile Interaction', Microsoft researchers say, "Using pre-touch in an anticipatory role affords an ad-lib interface that fades in a different UI—appropriate to the context—as the user approaches one-handed with a thumb, two-handed with an index finger, or even with a pinch or two thumbs. Pre-touch also enables hybrid touch + hover gestures, such as selecting an icon with the thumb while bringing a second finger into range to invoke a context menu at a convenient location."

Along with anticipatory reactions, Pre-Touch Sensing also facilitates retroactive interpretations that can smartly differentiate between tap and drag events "based on whether the user approached the screen in a ballistic motion, or with a finely adjusted trajectory, allowing on-contact discrimination between flick-to-scroll vs. text selection."

As far as grip sensing is concerned, the sensors that make up the Pre-Touch Sensing technology can determine which hand is holding the device and in what position. For example, when a user holds their phone the way they would while taking a picture, the sensors will automatically bring up the viewfinder. In another scenario, holding the phone in a particular orientation for an extended period will be read as the user's preference and trigger an orientation lock, eliminating the need to toggle it on and off repeatedly.
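The anticipatory behaviour described above amounts to a dispatch from a sensed approach posture to a UI variant. The sketch below is purely illustrative; the function and posture names are invented for this article and do not come from Microsoft's paper or any public API:

```python
# Hypothetical sketch of an "ad-lib" interface: pick a UI variant based
# on how the hand approaches the screen. All identifiers are invented
# for illustration; Microsoft's actual sensing pipeline is not public.

def choose_ui(posture: str) -> str:
    """Map a sensed approach posture to the UI variant to fade in."""
    variants = {
        "one_thumb": "thumb-reachable controls near the gripping edge",
        "index_finger": "full-width controls for two-handed use",
        "pinch": "zoom-oriented controls",
        "two_thumbs": "split controls for landscape typing",
    }
    # Fall back to the default UI when the approach is ambiguous.
    return variants.get(posture, "default UI")
```

In this toy model, `choose_ui("one_thumb")` selects controls placed within the thumb's reach, while an unrecognised posture leaves the default interface untouched.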
Grip sensing with contextual menus
Pre-Touch Sensing: The Why?
As Microsoft puts it, "most of what is characterised as 'touch' starts before contact and originates from beyond the confines of the screen." Here, they explain how a user generally interacts with a smartphone. Most of us pick up our smartphones with the left or the right hand, after which we usually use an index finger or a thumb to navigate the interface. Because of this core usage pattern, researchers can now predict the intent of a touch before it happens. This could be a huge step forward for Microsoft in competing with Apple's 3D Touch technology, not just by eliminating the need to open and navigate apps, but also by bringing up contextual information while using them.
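One retroactive interpretation the paper mentions is telling a ballistic approach apart from a finely adjusted one at the moment of contact. A minimal sketch of that idea, assuming the device can sample finger speed just before touchdown, might look like this. The thresholds and logic here are invented for illustration, not Microsoft's actual algorithm:

```python
# Illustrative only: distinguish a ballistic approach (fast, uncorrected)
# from a finely adjusted one (decelerating near the screen) using speed
# samples taken in the moments before contact. Threshold is invented.

def classify_approach(speeds: list[float], tolerance: float = 0.3) -> str:
    """Return 'ballistic' or 'fine' from pre-contact speed samples."""
    if not speeds:
        return "unknown"
    avg = sum(speeds) / len(speeds)
    # A finely adjusted trajectory slows sharply just before touchdown.
    if speeds[-1] < avg * tolerance:
        return "fine"       # e.g. interpret the contact as text selection
    return "ballistic"      # e.g. interpret the contact as flick-to-scroll
```

Under this toy rule, a steady, fast approach reads as a flick-to-scroll, while an approach that decelerates sharply near the glass reads as a deliberate, precise action such as text selection.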
Pre-Touch Sensing: The Where?
While there is no official word from Microsoft on the implementation of this technology, it is expected that future Microsoft devices like the Surface Phone may include Pre-Touch Sensing, which could give the device an added lure for buyers. Alternatively, the company could decide to commercialise the technology, which could see it go beyond Microsoft devices. As the trend goes, any new smartphone technology is only as good as the first device that implements it. Microsoft has definitely taken the first steps, but it would also need to be first to market if it wants to reap the fruits of this brilliant technology.