Humans use a tremendous number of non-verbal cues in their interactions with each other. The direction someone is looking or the lean of a body may indicate whether that person is going to walk to your left or your right. Until now, robots have not been adept at reading and adapting to these signals. That is changing, and the change will make it easier for robots and humans to work together.
Big hulking machines
The first industrial robots were big hulking machines that were dangerous to be around. They were often kept in safety cages to prevent humans from being harmed, mostly from their own inattention. Today many robots are smaller and are being introduced into non-industrial environments. They are being taught to identify and learn from human signals, even the subtle ones.
In an article called “Robot communication: It’s more than just talk,” author Charlie Wood describes a program at MIT’s Interactive Robotics Group, headed by Dr. Julie Shah. In this program, “They aim to build robotic systems that can work alongside, and even integrate with, human teams. And that means robots that learn from observation, predict teammate actions, and adjust their behavior accordingly, much like a person would.” Dr. Shah does not think this is far-fetched at all.
Wood reports that the group has already tested such a system in a hospital: “…the group tested just such a system last year. After an ‘apprenticeship’ spent watching nurses and doctors, a robotic decision support system succeeded in making patient care suggestions that nurse participants in controlled experiments accepted 90 percent of the time.”
Cues for humans and cues for robots
To improve the ability of robots and humans to work together, some robots are being fitted with cues that help the humans around them understand the robots’ behavior. Signals we are already familiar with, such as red and green lights, help humans interpret whether robots are safe to be around. In the movie I, Robot, the torso of a robot turned red when it was in “evil” mode.
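The light-cue idea can be sketched in a few lines. This is an illustrative example, not any real robot's control software: the mode names and color assignments are assumptions chosen to match the familiar red/green convention the article mentions.

```python
# Hypothetical mapping from a robot's operating mode to a light color
# that nearby humans can read at a glance. Modes and colors are
# illustrative assumptions, not taken from any actual product.
SAFETY_LIGHTS = {
    "idle": "green",     # safe to approach
    "moving": "yellow",  # keep clear of the robot's path
    "fault": "red",      # do not approach
}

def light_for(mode):
    """Return the cue color for a mode, defaulting to red when unknown."""
    # Fail safe: an unrecognized state is treated as dangerous.
    return SAFETY_LIGHTS.get(mode, "red")

print(light_for("idle"))     # green
print(light_for("haywire"))  # red
```

The fail-safe default matters: a human should never read an unknown robot state as safe.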
At the same time robots are being taught to study humans. Dr. Shah’s team “…also harnesses machine learning and biophysical modeling to help robots read human body language, and predict where a teammate will move next. For example, tracking a person’s walking speed and head direction reveals which way they’ll turn about two steps early, information we humans only become aware of when a miscalculation ends in the ‘hallway dance.’”
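The cited cue, walking speed plus head direction predicting a turn about two steps early, can be sketched with a simple rule. This is a toy illustration under stated assumptions, not the MIT group's model, which uses machine learning and biophysical modeling; the function name, thresholds, and sign convention here are all invented for the example.

```python
# Toy sketch: guess which way a walker will turn from recent head-yaw
# samples and walking speed. A simple threshold stands in for the real
# learned models; all names and numbers are illustrative assumptions.

def predict_turn(head_yaws_deg, walking_speed_mps,
                 yaw_threshold_deg=15.0, min_speed_mps=0.3):
    """Return "left", "right", or "straight".

    head_yaws_deg: recent head orientations relative to the torso,
                   negative = looking left, positive = looking right.
    """
    if walking_speed_mps < min_speed_mps:
        return "straight"  # barely moving: no turn to predict
    avg_yaw = sum(head_yaws_deg) / len(head_yaws_deg)
    if avg_yaw <= -yaw_threshold_deg:
        return "left"
    if avg_yaw >= yaw_threshold_deg:
        return "right"
    return "straight"

# A hallway robot could yield to the side opposite the predicted turn:
print(predict_turn([-20.0, -25.0, -18.0], walking_speed_mps=1.4))  # left
```

Reading the cue a couple of steps early is what lets a robot sidestep the "hallway dance" the article describes.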
Robot technology is moving more rapidly than most realize. Robots replacing people on a wholesale basis is not some distant prospect; in many industries it has already occurred. The next iteration is robots as co-workers. You will need to be able to read them as well as they are reading you.
What are you going to do when a robot files a claim of harassment against its human co-workers?