Are Ultrasound Gestures for Android the next big step?

What would your reaction be if I told you that you can play Temple Run on your device without touching the screen? And no, I am not kidding. Let me stress the latter part of that sentence and share my opinion on ‘ultrasound gesturing’: I believe it is time to bid farewell to infrared gesturing.


Though I can’t deny that infrared (camera-based) gesturing was one heck of an invention, ultrasound gesturing is the latest fashion in this area of technology. The technology could be extended to laptops and tablets too, making life simpler. We witnessed something similar a while ago, when Samsung’s Galaxy S3 was released; the Galaxy S4 took it to another level with Air View and other gimmicks. With ultrasound gesturing, for instance, you could flip to the next page just by swiping your hand in mid-air above the screen.

How This is Cool

Ultrasonic touch-less technology uses up to 95% less power than current camera image-based gesture systems, making it an attractive option for device manufacturers around the world. Elliptic Labs has been working on this for a long time and has developed a Software Development Kit (SDK) for Android; it has already shipped a gesture suite for laptops running Windows 8. It is speculated that Elliptic built the gesture suite for laptops and tablets first because they have large screens. It would be unfair to say that building a gesture suite for smartphones is difficult, so I would put it this way: building the SDK for smartphones and equipping a phone with the ultrasound sensor is complex because of the size of smartphones (phablets perhaps excluded). The developers have stated that this technology cannot be retrofitted to existing smartphones.

[quote_left] Ultrasound Gesturing is already on Laptops [/quote_left]

Ultrasound gesturing detects motion more precisely than conventional optical sensors and gesture systems. Moving deeper into the concept: a chip installed in the device contains an ultrasound generator (so the sound is inaudible) that spreads mechanical waves all around the device. These waves cover an angular area of 180 degrees around the device, so any gesture within this range is picked up by a sensor. When a gesture is made, it disturbs the pattern of the mechanical waves, which is how it gets detected, and the corresponding action is then triggered.
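To make that detection idea concrete, here is a minimal, purely illustrative Python sketch. This is not Elliptic Labs’ actual algorithm – the sample rate, tone frequency, and classification logic are all my own assumptions – but it shows the underlying physics: a hand moving toward or away from the device Doppler-shifts the reflected tone, and comparing energy above and below the carrier frequency reveals the direction of motion.

```python
# Illustrative sketch (not Elliptic Labs' actual algorithm): inferring hand
# motion from the Doppler shift of a reflected ultrasound tone.
# All parameter values below are hypothetical, chosen for the demo.
import math

FS = 192_000    # sample rate in Hz
F_TX = 40_000   # transmitted ultrasound tone in Hz (inaudible to humans)
C = 343.0       # speed of sound in air, m/s

def simulate_echo(hand_velocity, n=4096):
    """Simulated echo from a hand moving at hand_velocity m/s
    (positive = toward the device). A moving reflector shifts the
    tone by roughly 2*v/c * f_tx (two-way Doppler)."""
    f_rx = F_TX * (1 + 2 * hand_velocity / C)
    return [math.sin(2 * math.pi * f_rx * t / FS) for t in range(n)]

def power_at(signal, freq):
    """Signal energy at a single frequency (one DFT bin, evaluated directly)."""
    re = sum(s * math.cos(2 * math.pi * freq * t / FS) for t, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * t / FS) for t, s in enumerate(signal))
    return re * re + im * im

def classify(signal, shift=500):
    """Compare energy above vs. below the carrier to infer motion direction."""
    up = power_at(signal, F_TX + shift)
    down = power_at(signal, F_TX - shift)
    still = power_at(signal, F_TX)
    if still > up and still > down:
        return "still"
    return "approaching" if up > down else "receding"

print(classify(simulate_echo(0.0)))    # prints "still"
print(classify(simulate_echo(2.0)))    # prints "approaching"
print(classify(simulate_echo(-2.0)))   # prints "receding"
```

A real implementation would run on streaming microphone data and track the shift over time to recognise full swipes, but the core trick – reading motion out of frequency shifts in an inaudible tone – is the same.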

[quote_right] 95% less power consumption, and a lot more accuracy [/quote_right]

Elliptic Labs is displaying its prototype – an Android-based smartphone equipped with ultrasound technology – at CEATEC from 1st–5th October this year. Here is a short video that demonstrates how ultrasound gesturing works on smartphones and tablets.


On the other hand, Elliptic Labs has developed a gesture suite for laptops running Windows 8 which lets you scroll up/down, open and close apps/windows, rotate, split screens, and more. If you want the latest updates on installing ultrasound gestures on your (Windows 8) laptop, click here.