Researchers have developed an open-source touch-sensing “skin” whose sensors can help robots count layers of cloth, a step toward folding laundry.
Humans perform daily tasks easily using sight and touch, such as holding a glass or picking up a piece of cloth and folding it, because they naturally possess tactile sensing. For robots, such tasks are far harder: data collected through touch is difficult to quantify, and the sense of touch has been difficult to simulate. Researchers from Carnegie Mellon University’s Robotics Institute (RI) have therefore turned to “ReSkin,” an open-source touch-sensing “skin” made of a thin, elastic polymer embedded with magnetic particles, to measure three-axis tactile signals. This lets a robot feel layers of cloth rather than depending on vision sensors to see them.
“By reading the changes in the magnetic fields from depressions or movement of the skin, we can achieve tactile sensing,” said Thomas Weng, a Ph.D. student in the R-Pad Lab, who worked on the project with RI postdoc Daniel Seita and grad student Sashank Tirumala. “We can use this tactile sensing to determine how many layers of cloth we’ve picked up by pinching with the sensor.”
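To make the sensing principle Weng describes concrete, here is a minimal, hypothetical sketch: deformation of the elastomer shifts the embedded magnetic particles, changing the three-axis field measured by a magnetometer underneath, and a deviation from the at-rest baseline signals contact. The function names, field values, and threshold are invented for illustration, not part of the ReSkin software.

```python
import math

def field_delta(baseline, reading):
    """Per-axis change in magnetic field relative to an at-rest baseline."""
    return [r - b for r, b in zip(reading, baseline)]

def contact_detected(baseline, reading, threshold=5.0):
    """Flag contact when the field deviates enough from baseline.

    `threshold` is an arbitrary value for illustration, not a ReSkin constant.
    """
    delta = field_delta(baseline, reading)
    magnitude = math.sqrt(sum(d * d for d in delta))
    return magnitude > threshold

baseline = [12.0, -3.0, 48.0]   # at-rest (Bx, By, Bz), arbitrary units
pressed  = [18.5, -7.2, 41.0]   # skin depressed by a pinch
print(contact_detected(baseline, baseline))  # False: no deformation
print(contact_detected(baseline, pressed))   # True: field shifted
```

The real system reads such signals continuously, so both depressions and sliding movements of the skin register as field changes.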
Until now, researchers have explored numerous ways to use tactile sensing to grasp rigid objects, but cloth is deformable: any change in the material’s shape changes both its pose and the sensor readings, making the task harder. The researchers therefore taught a robot to count the layers of fabric it was grasping: it first estimates the number of layers from the ReSkin sensor readings, then adjusts its grip and tries again. In experiments, the robot picked up both one and two layers of cloth, and fabrics of different textures and colors demonstrated generalization beyond the training data. The thinness and flexibility of the ReSkin sensor made it possible to teach the robot to handle something as delicate as layers of cloth.
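The estimate-then-adjust loop above can be sketched with a toy classifier. This is a hypothetical illustration, not the learned model the researchers used: a nearest-centroid rule guesses the layer count from a three-axis tactile feature, and the robot re-pinches until the estimate matches the target. All centroid and reading values are invented.

```python
def estimate_layers(feature, centroids):
    """Nearest-centroid guess of how many cloth layers are pinched."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda n: dist(feature, centroids[n]))

# Toy per-class mean tactile signals (Bx, By, Bz deltas), invented values.
centroids = {
    0: [0.0, 0.0, 0.0],     # nothing grasped
    1: [4.0, -2.0, -3.0],   # one layer
    2: [7.5, -4.5, -6.0],   # two layers
}

def adjust_until(target, readings, centroids):
    """Re-pinch (step through new readings) until the target count is felt."""
    for reading in readings:
        if estimate_layers(reading, centroids) == target:
            return target
    return None

# Simulated sequence of pinch attempts: miss, one layer, then two layers.
attempts = [[0.1, 0.0, -0.2], [4.2, -1.8, -2.9], [7.1, -4.8, -6.2]]
print(adjust_until(2, attempts, centroids))  # 2, reached on the third attempt
```

In practice the feature vector would come from the sensor stream and the classifier would be trained on real grasps, but the control structure, estimate the layer count and regrasp until it matches, is the same.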
“The profile of this sensor is so small, we were able to do this very fine task, inserting it between cloth layers, which we can’t do with other sensors, particularly optical-based sensors,” Weng said. “We were able to put it to use to do tasks that were not achievable before.”
“It is an exploration of what we can do with this new sensor,” Weng said. “We’re exploring how to get robots to feel with this magnetic skin for things that are soft, and exploring simple strategies to manipulate cloth that we’ll need for robots to eventually be able to do our laundry.”