
Human-Like Touch Advances Robotic Hands

New 3D tactile sensing modules bring human-like touch to robotic hands, enabling precise force detection, adaptive grip, and real-time interaction for next-generation intelligent automation systems.


A new generation of tactile sensing technology is pushing robotic hands closer to human-like dexterity, addressing one of robotics’ long-standing limitations—the ability to “feel” and respond to physical interaction in real time.


At the core of this advancement by Melexis is a magnetic-based 3D tactile sensing approach that has now transitioned from prototype to industrial-ready fingertip modules. These compact modules are designed to be embedded directly into robotic fingers, enabling machines to detect force, pressure, and motion across multiple axes with high precision.
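The magnetic sensing principle can be illustrated with a minimal sketch: a magnet embedded in an elastomer shifts under contact force, and a 3-axis magnetometer beneath it reads the resulting field change, which a calibration maps to force. The calibration matrix, offsets, and linear mapping below are illustrative assumptions for this article, not Melexis's actual calibration, which in practice is per-unit and may be nonlinear:

```python
import numpy as np

# Hypothetical linear calibration mapping 3-axis magnetic flux readings
# (microtesla) to contact force components (newtons). A real module would
# ship with per-unit factory calibration data instead of these values.
CALIBRATION = np.array([
    [0.020, 0.000, 0.000],   # Fx (shear) from Bx
    [0.000, 0.020, 0.000],   # Fy (shear) from By
    [0.000, 0.000, 0.035],   # Fz (normal) from Bz
])
B_OFFSET = np.array([1.2, -0.8, 45.0])  # zero-force baseline field (uT)

def field_to_force(b_reading: np.ndarray) -> np.ndarray:
    """Convert a raw 3-axis field sample to (Fx, Fy, Fz) in newtons."""
    return CALIBRATION @ (b_reading - B_OFFSET)

# A 100 uT rise on the Z axis above baseline reads as 3.5 N of normal force.
force = field_to_force(np.array([1.2, -0.8, 145.0]))
```

Because all three field axes are sampled together, one compact sensing element yields a full 3D force vector per fingertip, which is what allows normal and shear loads to be separated downstream.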


Unlike traditional robotic systems that rely heavily on vision and predefined motion control, this technology introduces real-time tactile feedback. It converts physical contact into actionable data, enabling robots to adapt dynamically when handling objects with varying shapes, textures, or levels of fragility. 

The sensing system is based on magnetic principles, enabling accurate measurement of both normal and shear forces—critical for tasks requiring fine motor skills. This capability enhances grasp stability, reduces slippage, and allows robots to interact more safely and effectively in unstructured environments. 
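Measuring normal and shear forces together is what makes slip prevention tractable: by the Coulomb friction model, a grasped object starts to slide once the shear force exceeds the friction coefficient times the normal force. The sketch below shows the idea; the friction coefficient, safety margin, and gain are hypothetical values, not parameters of the announced product:

```python
import math

MU_ESTIMATE = 0.5      # assumed friction coefficient of the grasped object
SAFETY_MARGIN = 0.8    # act before reaching the true friction limit

def slip_imminent(fx: float, fy: float, fz: float) -> bool:
    """True when shear force approaches the friction-cone boundary.

    fx, fy: tangential (shear) forces at the fingertip, in newtons
    fz:     normal (grip) force pressing into the object, in newtons
    """
    if fz <= 0.0:
        return True  # no contact pressure: nothing is holding the object
    shear = math.hypot(fx, fy)
    return shear > SAFETY_MARGIN * MU_ESTIMATE * fz

def adjusted_grip(fx: float, fy: float, fz: float, gain: float = 1.2) -> float:
    """Return a normal-force setpoint that restores the safety margin."""
    if not slip_imminent(fx, fy, fz):
        return fz
    shear = math.hypot(fx, fy)
    return gain * shear / (SAFETY_MARGIN * MU_ESTIMATE)
```

Run inside the grasp control loop, such a rule lets the hand squeeze only as hard as the load demands, which is why shear sensing improves both grip stability and safe handling of fragile objects.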


The key features are:

  • 3D magnetic tactile sensing for multi-axis force detection
  • Real-time conversion of touch into actionable data
  • Detection of both normal and shear forces
  • Integrated fingertip modules for easy system deployment
  • Enhanced grip control, dexterity, and adaptive interaction

A key innovation lies in its packaging: instead of discrete sensors requiring complex integration, the solution is delivered as application-ready fingertip modules. This significantly reduces development complexity for robotics manufacturers and accelerates time-to-market for advanced robotic systems. 

The technology has already been integrated into next-generation robotic hand platforms, demonstrating improved responsiveness and adaptive manipulation. Development has involved full-stack optimisation—from mechanical design to calibration—highlighting a shift toward tightly coupled hardware-software co-design in robotics. 

More broadly, this advancement reflects a growing industry push toward “physical AI,” where robots combine perception, cognition, and interaction. While vision systems have matured rapidly, tactile sensing has remained a bottleneck. By enabling true 3D touch perception, this new approach helps bridge that gap and unlocks more natural, human-like interaction between machines and the physical world. 

As robotics moves into domains such as collaborative manufacturing, healthcare, and service automation, the ability to sense and interpret touch will be critical. Industrialised tactile modules mark a step toward scalable deployment of intelligent, touch-enabled robotic systems.

Click here for the original announcement.

Akanksha Gaur
Akanksha Sondhi Gaur is a journalist at EFY. She holds a German patent and brings seven years of combined industrial and academic experience. Passionate about electronics, she has authored several research papers in the field.
