A sonar-based AI system that enables smartwatches to track full hand and finger movements in 3D, potentially transforming wearable interaction beyond tiny touchscreens.

A new generation of AI-powered smartwatches may soon eliminate the need for tapping tiny screens, thanks to a sonar-based hand-tracking system that turns existing wearables into gesture-sensing devices.
Researchers from Cornell University and KAIST have demonstrated continuous 3D hand-pose tracking with their “WatchHand” system, using only the built-in speaker and microphone of a standard smartwatch, with no additional hardware required. The system emits inaudible sound waves that bounce off the user’s hand; the returning echoes are processed by a deep-learning model that reconstructs finger positions in real time.
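The article does not describe WatchHand’s actual signal design, but the basic principle of active acoustic sensing is straightforward: emit a near-ultrasonic chirp and locate the echo’s round-trip delay. The sketch below is purely illustrative; the sample rate, frequency band, and chirp duration are assumptions, not values from the paper.

```python
import numpy as np

FS = 48_000              # assumed smartwatch audio sample rate
F0, F1 = 18_000, 20_000  # near-ultrasonic band, inaudible to most adults
DUR = 0.01               # 10 ms chirp (assumption)
C = 343.0                # speed of sound in air, m/s

def make_chirp(fs=FS, f0=F0, f1=F1, dur=DUR):
    """Linear frequency sweep in the near-ultrasonic band."""
    t = np.arange(int(fs * dur)) / fs
    return np.sin(2 * np.pi * (f0 * t + (f1 - f0) * t ** 2 / (2 * dur)))

def echo_distance(recorded, chirp, fs=FS):
    """Estimate reflector distance from the strongest echo's round-trip
    delay, found by cross-correlating the mic input with the chirp."""
    corr = np.correlate(recorded, chirp, mode="valid")
    delay_samples = int(np.argmax(np.abs(corr)))
    round_trip = delay_samples / fs
    return round_trip * C / 2  # one-way distance in metres

# Simulate a hand ~15 cm from the watch: the echo is a delayed,
# attenuated copy of the emitted chirp.
chirp = make_chirp()
delay = int(2 * 0.15 / C * FS)
mic = np.zeros(delay + len(chirp) + 100)
mic[delay:delay + len(chirp)] += 0.3 * chirp
print(echo_distance(mic, chirp))  # close to 0.15
```

A real system would emit chirps continuously and track how the full echo profile changes over time, feeding those features to the learned pose model rather than extracting a single distance.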
The technology can track up to 20 finger joints with a mean error of under 8 mm, bringing fine-grained gesture control to devices traditionally limited by small touch displays. This marks a significant leap from earlier interaction methods, which relied heavily on touch or required external sensors.
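The “mean error” figure for pose tracking is typically computed as the average Euclidean distance between predicted and ground-truth 3D joint positions (often called mean per-joint position error). A minimal sketch of that metric, with the joint data being synthetic:

```python
import numpy as np

def mean_joint_error_mm(pred, gt):
    """Mean per-joint position error: average Euclidean distance (mm)
    between predicted and ground-truth joints.
    pred, gt: arrays of shape (n_joints, 3), in millimetres."""
    return float(np.linalg.norm(pred - gt, axis=1).mean())

# Synthetic example: 20 hand joints, each prediction offset by the
# vector (3, 4, 0) mm, i.e. exactly 5 mm from the ground truth.
gt = np.random.default_rng(0).uniform(0, 100, size=(20, 3))
pred = gt + np.array([3.0, 4.0, 0.0])
print(mean_joint_error_mm(pred, gt))  # 5.0
```

By this measure, a reported error under 8 mm means each joint lands, on average, within about a fingernail’s width of its true position.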
The approach builds on prior sonar-based innovations such as “FingerIO,” which enabled devices to detect finger motion in mid-air by analysing reflected sound waves. However, WatchHand advances the concept by enabling full hand pose estimation rather than simple motion tracking.
At its core, the system leverages acoustic sensing—an emerging technique in which ultrasonic signals create a “sound field” around the device. When a user moves their hand, the disturbances in this field are captured and interpreted by AI models. Similar acoustic-sensing approaches have reported gesture-recognition accuracy exceeding 90% in controlled environments.
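The simplest way to see how a “disturbance in the sound field” becomes a signal is frame differencing: compare successive echo profiles, and any change indicates motion. The sketch below is a toy illustration of that idea only; a real system like WatchHand feeds much richer features to a trained model rather than thresholding raw energy.

```python
import numpy as np

def motion_energy(prev_echo, curr_echo):
    """Energy of the frame-to-frame difference between successive
    echo profiles. A static scene yields ~0; a moving hand changes
    the reflected sound field and raises the energy."""
    return float(np.sum((curr_echo - prev_echo) ** 2))

def detect_motion(frames, threshold=0.5):
    """Flag frame transitions whose echo profile changed enough to
    suggest hand motion (threshold is an arbitrary toy value)."""
    return [motion_energy(a, b) > threshold
            for a, b in zip(frames, frames[1:])]

static = np.ones(64)          # echo profile of a still scene
moved = np.ones(64)
moved[10:20] += 0.5           # a hand perturbs part of the field
print(detect_motion([static, static, moved]))  # [False, True]
```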
One key advantage is that the system works even without line of sight, unlike camera-based tracking. Sound waves can pass through obstacles like clothing, enabling interaction even when the smartwatch is partially covered.
The implications extend beyond convenience. Researchers suggest such technology could enable touchless control during workouts, accessibility features for users with disabilities, and more immersive interfaces for AR and wearable computing.
While performance may still vary across users and environments, lightweight model tuning helps adapt the system to new conditions.
By removing the need for specialised sensors, WatchHand lowers the barrier for widespread adoption—potentially bringing advanced gesture control to millions of existing smartwatches.