The emotional robot, developed at the Sibley School of Mechanical and Aerospace Engineering (MAE), can express emotions through changes in its outer surface.
The emotional robot marks a turning point in the relationship between humans and robots. The new prototype is being presented as the future of emotion in robotics.
Emotions come through the skin
The new emotional robot is the result of work by Guy Hoffman and his team. Hoffman, an assistant professor at the Sibley School of Mechanical and Aerospace Engineering (MAE), has long believed that robots should not simply be modeled on humans or be copies of them.
For this reason, together with students from his Human-Robot Collaboration and Companionship Lab, he set out to develop a robot prototype that could interact in our own language and take advantage of our instincts.
The result is the emotional robot. Capable of expressing emotions through changes in its outer surface, the design has a skin covered with texture units whose shapes are controlled by fluidic actuators.
The animal world: the source of inspiration
Hoffman’s work draws its inspiration from the animal world. The relationship between humans and animals served as the model for this emotional robot, and the study began with an analysis of the non-verbal signals animals emit and how well we understand them.
Accordingly, the design of this emotional robot supports two animal-inspired textures: goosebumps and spikes. The actuation units for these shapes are grouped into texture modules, with fluidic chambers connecting protrusions of the same type.
The team led by Hoffman also tested two different actuation control systems, with noise level and size minimization as driving factors in both designs.
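The mapping described above, with one shared fluidic chamber per texture type driven by the robot's emotional state, can be sketched roughly as follows. This is an illustrative sketch only: the article does not describe the actual control software, so every name and pressure value here is hypothetical.

```python
# Hypothetical sketch: maps an emotional state to target chamber pressures.
# Each texture module groups protrusions of one type (goosebumps or spikes)
# behind a shared fluidic chamber, so a single pressure value drives the
# whole module. All values are invented for illustration.
EMOTION_TO_PRESSURE = {
    #            (goosebumps_kPa, spikes_kPa)
    "calm":      (0.0, 0.0),
    "surprise":  (20.0, 0.0),   # raise the goosebump-like bumps
    "anger":     (0.0, 35.0),   # raise the spike-like protrusions
}

def actuate(emotion: str) -> dict:
    """Return target pressures for both texture modules; default is flat skin."""
    goosebumps, spikes = EMOTION_TO_PRESSURE.get(emotion, (0.0, 0.0))
    return {"goosebumps_kPa": goosebumps, "spikes_kPa": spikes}
```

A real controller would feed these targets to valves or pumps and ramp pressures gradually; the sketch only shows the state-to-texture mapping the article describes.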
A step toward the future
Although there is no specific application for the emotional robot yet, the fact that its skin shifts between textures assigned to its emotional state is a significant first step forward in the relationship between humans and robots.
The facial expressions and gestures that current robots can already perform will be complemented by this new advance, bringing the world of feelings into robotics.
But first a series of challenges must be overcome, including developing technology compact enough to fit inside an autonomous robot, whatever form it takes, and improving the skin's responsiveness to changes in the robot's emotional state.