Yale researchers have developed a robotic skin that can turn ordinary objects into robots capable of moving from place to place and grasping other objects.
The researchers, led by professor of mechanical engineering and materials science Rebecca Kramer-Bottiglio, created an elastic sheet with embedded sensors that wraps around soft inanimate objects to create robotic systems with various situationally dependent capabilities. They found that both the type of soft object used and the orientation of the skin on the object affect the range of motion, making the device multifunctional. The study was published in Science Robotics on Sept. 19.
“Most robots are designed for a specific task, but a reconfigurable robot could potentially perform many tasks,” said Joran Booth, a postdoctoral fellow in the Kramer-Bottiglio group and co-author of the study. “The robotic skin concept is that they can be applied to, removed from and transferred between various host objects, so we can use the same hardware over and over again to generate many different configurations with many different functions.”
The team demonstrated multiple applications in which the robotic skins were used to create robotic systems capable of locomotion, such as inchworm-like robots and movable stuffed horses, as well as systems capable of incorporating sensory data, such as a wearable posture-feedback shirt.
While the researchers demonstrated the skins’ ability to perform grasping motions and forms of locomotion such as rowing, they noted that the paper covers only a subset of possible applications and that the range of applicable contexts may be much greater.
The researchers were initially inspired by a solicitation from the National Aeronautics and Space Administration for soft robotic technologies for use in outer space. Because cargo carried on space missions comes at a high cost per unit mass, the team aimed to minimize the weight and size of the robots involved, Booth said.
Robotic skins would not only minimize the weight and size of equipment transported on missions but also reduce the number of robotic systems involved, he said.
While the multifunctional capabilities inherent to the robotic skins offer a significant logistical improvement over other contemporary robotic systems, they also pose unique challenges, according to Michelle Yuen GRD ’18, a co-author of the study.
“The biggest challenge we encountered was in integrating the different components of the robotic skin — sensors, actuators, substrate — such that they work together effectively,” she said.
The group now hopes to fine-tune the capabilities it has already demonstrated.
“We are currently looking into how to improve the accuracy of the robotic skins’ motions, by adapting their control strategies on demand,” said Dylan Shah GRD ’23, a co-author of the study.
Additionally, Shah said, the team hopes to develop robotic skins that more fully integrate sensing into their actuators, the components of robots that control motion.
The team also plans to further develop the skins’ ability to work in combination, as demonstrated in the paper, where multiple sheets were used simultaneously on the same platform.
“Long term, we envision robots that can, for instance, elongate, grow limbs or add joints along their existing structure,” Shah said.
To do this, the Kramer-Bottiglio group will work on developing a system that can also modify the passive soft skeleton. By allowing flexibility in the configuration of both the soft skeleton and the robotic skins, the researchers hope to further expand the range of operations a given robotic system can perform.
“We are currently starting a new project which derives inspiration from sculptors to make robotic skins, which can actively sculpt the shape of the object they are applied to,” Shah said.
Czech playwright, novelist and journalist Karel Čapek coined the word “robot” in his 1920 hit play, Rossum’s Universal Robots.
Josh Purtell | josh.purtell@yale.edu