
Gesture Recognition Beckons The Next Wave Of UI


Gesture-development platforms
SoftKinetic’s iisu is a platform for natural-gesture development and deployment that enables full-body skeleton tracking as well as precise hand and finger tracking. It supports legacy cameras such as the original Kinect, Asus Xtion, Mesa SR4000, Panasonic D-Imager and, of course, SoftKinetic’s own DepthSense 311 and DepthSense 325 (aka Creative Senz3D), along with Flash, Unity, C# and C++ environments.

SoftKinetic is also making its DS536A module available to select customers for prototyping and development. The module combines a time-of-flight sensor, diffused laser illumination and a 3-axis accelerometer. It has a lower resolution and a narrower field of view, and has been available since September 2015.

OPT8140 from Texas Instruments is a time-of-flight sensor; it can be found in the DepthSense 325 as well as the Creative Senz3D camera.


MLX75023 sensor, featuring SoftKinetic technology and manufactured in Melexis’ automotive-grade complementary metal oxide semiconductor mixed-signal process, is claimed to be the highest-resolution 3D sensor available for the automobile safety and infotainment markets.

OnTheGo Platforms is a firm that built a gesture-recognition system that works on any mobile device with a standard camera.

Atheer is another firm with a similar system, but one that relies on infrared light. Interestingly, Atheer acquired OnTheGo, combining the two technologies to bring out a much more capable product.
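How might gesture sensing work with nothing but a standard camera? One common building block is frame differencing: find the pixels that changed between frames, track the centroid of that motion, and infer a swipe from its drift. The sketch below is a toy illustration of that idea, not OnTheGo’s or Atheer’s actual pipeline; frames are represented as plain lists of grayscale values.

```python
# Toy camera-only gesture sensing: track the centroid of changed pixels
# between frames and infer a swipe from its horizontal drift.
# Generic frame-differencing sketch, not any vendor's actual algorithm.

def motion_centroid(prev, curr, thresh=30):
    """Centroid (x, y) of pixels that changed by more than `thresh`."""
    xs, ys = [], []
    for y, (row_p, row_c) in enumerate(zip(prev, curr)):
        for x, (p, c) in enumerate(zip(row_p, row_c)):
            if abs(c - p) > thresh:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def classify_swipe(centroids):
    """Left/right swipe from the drift of successive motion centroids."""
    dx = centroids[-1][0] - centroids[0][0]
    return "right" if dx > 0 else "left"

# A bright 'hand' blob moving left-to-right across three 8x8 frames:
def frame(x0):
    return [[255 if x0 <= x < x0 + 2 else 0 for x in range(8)]
            for _ in range(8)]

frames = [frame(0), frame(3), frame(6)]
cents = [motion_centroid(a, b) for a, b in zip(frames, frames[1:])]
print(classify_swipe(cents))  # prints "right"
```

Real systems add skin or hand segmentation, smoothing and per-gesture classifiers on top, but the centroid-drift idea is the same.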

Sony’s SmartEyeGlass is available to developers, and with it the company aims to tap the augmented-reality segment.

Eyefluence is a company that recently came out of stealth with ₹950 million in funding. It has created a unique eye-tracking system that can interpret gestures from the eyes in real time.
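A basic primitive behind most eye-driven interfaces is detecting a deliberate dwell, that is, a fixation where the gaze stays put long enough to count as intent rather than a passing glance. The following is a generic sketch of that idea on a stream of gaze samples; Eyefluence’s actual system is proprietary, and the radius and sample thresholds here are arbitrary illustration values.

```python
# Toy eye-gesture primitive: detect a "dwell" (fixation) in a stream of
# (x, y) gaze samples. Generic logic for illustration only; not
# Eyefluence's actual method. Thresholds are arbitrary.

def detect_dwell(gaze, radius=10.0, min_samples=5):
    """Return the index where the gaze stays within `radius` of its
    starting point for `min_samples` consecutive samples, else None."""
    for i in range(len(gaze) - min_samples + 1):
        x0, y0 = gaze[i]
        if all((x - x0) ** 2 + (y - y0) ** 2 <= radius ** 2
               for x, y in gaze[i:i + min_samples]):
            return i
    return None

# A saccade across the screen followed by a fixation near (200, 120):
samples = [(0, 0), (80, 40), (160, 90), (200, 120),
           (202, 121), (199, 118), (201, 122), (200, 119)]
print(detect_dwell(samples))  # prints 3
```

Richer eye gestures, such as glance sequences or smooth pursuit of a moving target, are built by chaining primitives like this one.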

What is up next

An Apple insider claims that the company has a new patent for 3D gesture control. The technology lets a computer identify hand motions made by a user. More interestingly, it can also learn gestures, so that it can spot them even if part of the gesture is blocked from the camera.

This could enable a much more consumer-friendly response where gestures made in non-ideal conditions will still be detected and acted upon. If implemented perfectly, it could be as significant as the touch response that the original iPhone delivered in a time of frustrating touch-enabled devices.
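One simple way to recognise a gesture from an incomplete observation is template matching: slide the visible fragment along each stored gesture template and pick the template whose best-aligned window is closest. The sketch below illustrates that idea on 1D motion traces; it is a generic technique for illustration, and the details of Apple’s patented method are not public.

```python
# Toy partial-gesture recognition: match an occluded fragment against
# stored templates at every alignment and pick the closest one.
# Generic template matching, not Apple's patented method.

def window_distance(template, fragment):
    """Smallest mean absolute distance of `fragment` over all alignments
    within `template`."""
    m = len(fragment)
    best = float("inf")
    for off in range(len(template) - m + 1):
        d = sum(abs(t - f)
                for t, f in zip(template[off:off + m], fragment)) / m
        best = min(best, d)
    return best

# Hypothetical gesture templates as x-coordinate traces over time:
TEMPLATES = {
    "swipe":  [0, 1, 2, 3, 4, 5, 6, 7],    # steady horizontal motion
    "circle": [0, 2, 3, 2, 0, -2, -3, -2],  # oscillating x-coordinate
}

def recognise(fragment):
    return min(TEMPLATES,
               key=lambda name: window_distance(TEMPLATES[name], fragment))

# Only the middle of the gesture was visible to the camera:
print(recognise([3, 2, 0, -2]))  # prints "circle"
```

A learned system would replace the hand-written templates with models trained from examples, but the principle of scoring partial evidence against known gestures is the same.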

Sensing through materials is something that cameras and vision cannot manage by themselves. Google seems to have taken a lead here with its Advanced Technology and Projects (ATAP) group. Named Project Soli, this looks like the first realistic solution here.
Google’s solution is a radar that runs at 60GHz and captures anywhere between 3000 and 10,000 frames per second. A device built around such a solution can measure the motion of fingers and hands in free space with so much accuracy that even fingers rubbing together would be detected with precision.
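Some back-of-the-envelope radar arithmetic shows why a 60GHz carrier suits fine finger motion: the wavelength is only about 5mm, and the round-trip phase of the reflected signal shifts by 4πd/λ when a target moves a distance d, so even sub-millimetre motion produces a measurable phase change. The figures below are generic radar maths, not published Soli specifications.

```python
# Back-of-the-envelope numbers for a 60 GHz radar.
# Generic radar arithmetic, not published Project Soli specifications.
import math

C = 3.0e8   # speed of light, m/s
F = 60e9    # carrier frequency, Hz

wavelength = C / F  # ~5 mm at 60 GHz

def phase_shift_deg(d_metres):
    """Round-trip phase shift, in degrees, when a reflecting target
    moves `d_metres` towards or away from the sensor: 4*pi*d/lambda."""
    return math.degrees(4 * math.pi * d_metres / wavelength)

print(f"wavelength: {wavelength * 1e3:.1f} mm")                    # 5.0 mm
print(f"phase shift for 0.1 mm motion: {phase_shift_deg(1e-4):.0f} deg")
```

A 0.1mm finger movement already shifts the phase by around 14 degrees, which is easily resolvable; sampling thousands of frames per second then turns those phase changes into fine-grained velocity profiles for each finger.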

Dilin Anand is a senior assistant editor at EFY. He holds a B.Tech from the University of Calicut and is currently pursuing an MBA at Christ University, Bengaluru.

