New 3D tactile sensing modules bring human-like touch to robotic hands, enabling precise force detection, adaptive grip, and real-time interaction for next-generation intelligent automation systems.

A new generation of tactile sensing technology is pushing robotic hands closer to human-like dexterity, addressing one of robotics' long-standing limitations: the ability to "feel" and respond to physical interaction in real time.
At the core of this advancement by Melexis is a magnetic-based 3D tactile sensing approach that has now transitioned from prototype to industrial-ready fingertip modules. These compact modules are designed to be embedded directly into robotic fingers, enabling machines to detect force, pressure, and motion across multiple axes with high precision.

Unlike traditional robotic systems that rely heavily on vision and predefined motion control, this technology introduces real-time tactile feedback. It converts physical contact into actionable data, enabling robots to adapt dynamically when handling objects with varying shapes, textures, or levels of fragility.
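To make the idea of "converting physical contact into actionable data" concrete, the sketch below shows a minimal closed-loop grip controller. The `read_force_3d` and `set_grip_force` callables are hypothetical stand-ins for whatever interface a fingertip module and gripper controller actually expose; the gain and force target are illustrative values, not figures from the announcement.

```python
# Minimal sketch of a tactile feedback loop (hypothetical interface).
# read_force_3d() returns one tactile sample as (fx, fy, fz) in newtons,
# where fx/fy are shear components and fz is the normal force.
# set_grip_force() commands the gripper; both are placeholders.

def adapt_grip(read_force_3d, set_grip_force, target_normal=2.0,
               gain=0.5, steps=100):
    """Proportional controller nudging grip toward a target normal force (N)."""
    grip = 0.0
    for _ in range(steps):
        fx, fy, fz = read_force_3d()   # latest multi-axis tactile sample
        error = target_normal - fz     # positive: too loose, negative: too tight
        grip += gain * error           # simple proportional update
        grip = max(grip, 0.0)          # a gripper cannot apply negative force
        set_grip_force(grip)
    return grip
```

In a real system this loop would run at the sensor's sample rate, and the fragility of the object would set `target_normal`; the point here is only that each tactile reading directly drives the next actuation command.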
The sensing system is based on magnetic principles, enabling accurate measurement of both normal and shear forces, which is critical for tasks requiring fine motor skills. This capability enhances grasp stability, reduces slippage, and allows robots to interact more safely and effectively in unstructured environments.
The key features are:
- 3D magnetic tactile sensing for multi-axis force detection
- Real-time conversion of touch into actionable data
- Detection of both normal and shear forces
- Integrated fingertip modules for easy system deployment
- Enhanced grip control, dexterity, and adaptive interaction
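Why does measuring shear as well as normal force matter for slip prevention? The classic Coulomb friction model makes the connection: a contact holds as long as the tangential (shear) force stays inside the friction cone defined by the normal force. The sketch below applies that textbook model; it is not Melexis's actual algorithm, and the friction coefficient `mu` is an assumed illustrative value.

```python
import math

def slip_risk(fx, fy, fz, mu=0.6):
    """Ratio of shear force to the Coulomb friction limit mu * fz.
    Values >= 1.0 mean the contact is at the edge of the friction cone
    and the grasped object is likely to slip."""
    if fz <= 0:
        return float("inf")        # no normal load: contact lost entirely
    shear = math.hypot(fx, fy)     # tangential force magnitude from x/y axes
    return shear / (mu * fz)

def is_slipping(fx, fy, fz, mu=0.6):
    return slip_risk(fx, fy, fz, mu) >= 1.0
```

A controller monitoring this ratio can raise grip force as the ratio approaches 1.0, which is exactly the kind of decision a normal-force-only sensor cannot make.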
A key innovation lies in its packaging: instead of discrete sensors requiring complex integration, the solution is delivered as application-ready fingertip modules. This significantly reduces development complexity for robotics manufacturers and accelerates time-to-market for advanced robotic systems.
The technology has already been integrated into next-generation robotic hand platforms, demonstrating improved responsiveness and adaptive manipulation. Development has involved full-stack optimisation, from mechanical design to calibration, highlighting a shift toward tightly coupled hardware-software co-design in robotics.
More broadly, this advancement reflects a growing industry push toward “physical AI,” where robots combine perception, cognition, and interaction. While vision systems have matured rapidly, tactile sensing has remained a bottleneck. By enabling true 3D touch perception, this new approach helps bridge that gap and unlocks more natural, human-like interaction between machines and the physical world.
As robotics moves into domains such as collaborative manufacturing, healthcare, and service automation, the ability to sense and interpret touch will be critical. Industrialised tactile modules mark a step toward scalable deployment of intelligent, touch-enabled robotic systems.




