July 27, 2024
Moving humanoid robots beyond research labs: The advancement of the immersive iCub3 avatar system

The iCub3 system, developed by a research team at the Artificial and Mechanical Intelligence (AMI) lab at the Istituto Italiano di Tecnologia (IIT), has made significant progress in bringing humanoid robots out of the research lab and into real-world scenarios. The system, which has been validated in a series of public demonstrations, allows a human operator to remotely control the robot and interact with its surroundings.

The development of the iCub3 system has been documented in a research paper published in Science Robotics. The paper highlights the challenges encountered during the development process and the solutions implemented to overcome them. It emphasizes the importance of extending research beyond the confines of the laboratory and addressing the complexities of real-world environments in order to create a robust humanoid robotic platform that can be integrated into economic and production systems.

The iCub3 system has since evolved further, leading to the creation of a new robot, the ergoCub. The ergoCub has been designed to be compatible with a variety of work environments and to maximize its acceptability in those settings.

The research group at the AMI lab, led by Italian researcher Daniele Pucci, consists of approximately 50 researchers. The iCub3 avatar system is composed of the iCub3 robot, the latest version of the iCub humanoid robot whose development began at IIT about two decades ago, and a set of wearable technologies called iFeel. The iFeel suit tracks the body motions of the human operator, allowing them to control the robot's movements and interact with its surroundings. The aim of the avatar system is to enable human operators to embody humanoid robots, controlling their locomotion, manipulation, voice, and facial expressions, while receiving comprehensive sensory feedback through visual, auditory, haptic, weight, and touch modalities. The system has been developed in collaboration with the Italian National Institute for Insurance against Accidents at Work (INAIL).
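
To make the motion-retargeting idea concrete, the sketch below shows, in simplified Python, how joint angles measured by a wearable suit might be mapped onto a robot's joint limits while a measured contact force is turned into a haptic cue for the operator. This is a minimal illustration under assumed values, not the actual iCub3/iFeel software; the names (OperatorPose, retarget_pose, FeedbackPacket) and the joint limits are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical joint limits for a humanoid arm, in radians.
# Real iCub3 limits differ; these values are for illustration only.
JOINT_LIMITS = {
    "shoulder_pitch": (-1.6, 1.6),
    "shoulder_roll": (0.0, 2.8),
    "elbow": (0.1, 1.8),
}

@dataclass
class OperatorPose:
    """Joint angles measured by the wearable suit (hypothetical schema)."""
    joints: dict  # joint name -> angle in radians

@dataclass
class RobotCommand:
    """Joint targets to be sent to the robot controller."""
    joints: dict

@dataclass
class FeedbackPacket:
    """Sensory feedback returned to the operator's wearable devices."""
    haptic_intensity: float      # 0.0 .. 1.0, e.g. contact force mapped to vibration
    estimated_payload_kg: float

def retarget_pose(pose: OperatorPose) -> RobotCommand:
    """Clamp each measured joint angle into the robot's joint limits.

    A real retargeting pipeline also handles differing kinematics,
    velocity limits, and whole-body balance; this shows only the
    basic mapping step.
    """
    targets = {}
    for name, angle in pose.joints.items():
        low, high = JOINT_LIMITS.get(name, (-3.14, 3.14))
        targets[name] = max(low, min(high, angle))
    return RobotCommand(joints=targets)

def build_feedback(contact_force_n: float, payload_kg: float) -> FeedbackPacket:
    """Map a measured contact force onto a normalized haptic cue."""
    max_force_n = 50.0  # illustrative saturation value
    return FeedbackPacket(
        haptic_intensity=min(contact_force_n / max_force_n, 1.0),
        estimated_payload_kg=payload_kg,
    )

if __name__ == "__main__":
    measured = OperatorPose(joints={"shoulder_pitch": 2.0, "elbow": 0.9})
    command = retarget_pose(measured)      # operator motion -> robot targets
    feedback = build_feedback(contact_force_n=12.0, payload_kg=0.5)
    print(command, feedback)
```

Whatever the concrete implementation, the flow is the same: measurements from the operator stream toward the robot, and sensory data from the robot streams back to the operator's wearable devices.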

The iCub3 avatar system has been tested and refined in three different real-world scenarios. In the first test, which took place in November 2021, a human operator in Genoa controlled the avatar at the Biennale di Venezia in Venice, approximately 300 kilometers away, and was able to remotely visit the Italian art exhibition through the avatar system. Ensuring stable communication between the two sites within a limited testing timeframe was a significant challenge. In addition, the robot had to move and interact cautiously and safely around the delicate artworks on display. The iFeel suit tracked the operator's body motions, which were then translated into actions performed by the iCub3 robot in Venice.
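
The stability requirement can be pictured with a simple watchdog pattern: commands from the operator are timestamped on arrival, and if they stop arriving within a latency budget the robot holds its pose instead of acting on stale data. The sketch below is a generic illustration under an assumed 200 ms budget, not the mechanism used in the Venice demonstration; the LinkWatchdog class and its thresholds are hypothetical.

```python
import time

class LinkWatchdog:
    """Hold the robot still when remote commands become stale (illustrative).

    max_age_s is a hypothetical latency budget; a safety-critical
    deployment would tune it against measured network jitter.
    """

    def __init__(self, max_age_s: float = 0.2):
        self.max_age_s = max_age_s
        self.last_command_time = None
        self.last_command = None

    def on_command(self, command: dict) -> None:
        """Record a freshly received operator command and its arrival time."""
        self.last_command = command
        self.last_command_time = time.monotonic()

    def safe_command(self) -> dict:
        """Return the latest command if fresh, otherwise a 'hold pose' command."""
        now = time.monotonic()
        if (
            self.last_command is None
            or now - self.last_command_time > self.max_age_s
        ):
            return {"mode": "hold_pose"}  # stop acting on stale data
        return self.last_command

if __name__ == "__main__":
    watchdog = LinkWatchdog(max_age_s=0.2)
    watchdog.on_command({"mode": "track_operator", "step_length_m": 0.1})
    print(watchdog.safe_command())   # fresh -> keep tracking the operator
    time.sleep(0.3)
    print(watchdog.safe_command())   # stale -> hold the current pose
```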

The second test occurred in June 2022 at the We Make Future Show, an Italian digital innovation festival held in Rimini. The operator in Genoa controlled a robot located at the festival venue, again approximately 300 kilometers away. The robot's task was to receive a payload from a person and carry it across the theater stage in front of an audience of around 2,000 spectators.

These real-world tests have demonstrated the capabilities and potential applications of the iCub3 avatar system. By enabling remote control and interaction with robots, the system opens up possibilities for various industries and sectors, from art exhibitions to entertainment events. The system’s flexibility and adaptability make it suitable for a range of tasks and environments. It represents a significant step forward in the development of humanoid robots that can effectively operate outside the controlled environment of the research lab. The ongoing advancements made by the research team at the AMI lab at IIT are instrumental in pushing the boundaries of robotics and paving the way for the integration of humanoid robots into our daily lives.
