AI Helps Robots Read Human Body Language to Be Better Coworkers

The next generation of machines is programmed to understand and react to human movement and expressions, allowing them to work better, more safely and more productively alongside people.

By Chase Guttman

February 16, 2023

For decades, science fiction writers have portrayed robots as sentient beings that relate to humans and live amongst them. In reality, however, robots have existed separate and apart from people — devices in corners or behind walls designed to accomplish tasks for their human beneficiaries instead of with them.

“Robots have been around for a long time … but they were mainly used for automating very repetitive and potentially harmful processes,” said Dr. Bilge Mutlu, professor of computer science, psychology and industrial engineering at the University of Wisconsin-Madison, where he is director of the People and Robots Laboratory.

“Robots were really dangerous for humans in [industrial work environments], so they were usually caged off.”

Not anymore. 

Now, machines are moving out of the shadows and working side-by-side with humans in a technological dance that’s bringing more efficiency to factories and many other kinds of workplaces. Using advanced artificial intelligence (AI), modern robots can predict with increasing accuracy what their human teammates will do next, then intervene to assist them.

Indeed, over the past two decades robots have begun to break free from the cages Mutlu referred to, allowing them to do more dynamic and synergistic work with their human counterparts. What makes that possible is this: the next generation of machines can read body language and facial expressions to anticipate and react to employee behavior, boosting the productivity and safety of human workers without replacing them.

“Since collaborative robotics was introduced to the market, robots have become more flexible and safer,” explained Mutlu, who said the intelligence, size and portability of these robots could allow them to be used for a multitude of tasks in the modern workplace.

Those tasks are only now coming into focus, according to Brad Porter, founder of Collaborative Robotics and former vice president of robotics at Amazon. Putting humans and machines together in close quarters, he said, will have a profound impact on a variety of different industries.

“If we want to see robots taking more of the burden of work in society,” Porter noted, “we need those robots to move out of structured spaces and move into logistics, hospitality, healthcare, municipal services and transportation.” 

From Automation to Collaboration

Collaborative Robotics is one of a handful of companies building autonomous technology that can work side-by-side with humans. Some of its machines can spare workers from handling sharp objects or carrying heavy materials while orchestrating hand-offs on a production line.

“We set out to build a robot that needs to work in and around humans, so you want it to have as rich a model for what a human might be doing as you can give it so it can anticipate what they will do next,” Porter said.

Because some tasks still demand too much dexterity, skill or variation to be fully automated, teamwork between human and robotic workers is essential.

“Any kind of process that needs a lot of human expertise — these are very hard to automate,” Dr. Mutlu said. 

“The key is bringing these robots into processes, getting them to do what they are good at while leaving people to do what they are good at, and creating a fluent workflow between the robot and human workers. We are building more collaborative systems that can capitalize on human expertise while automating the menial, dangerous and repetitive parts of a task.”

Building collaborative systems requires equipping robots with sophisticated cameras and sensors to collect timely and actionable data on the activities of their living, breathing colleagues.

“I think it is really helpful to understand body language with respect to task actions — they are getting ready to do this or that,” Dr. Mutlu said.

“It can observe a collaborator, know what their task is, predict what they’re about to do and plan its actions accordingly. You can see where a person is looking and know where their attention is, so you can time the robot’s behaviors better. This can create much more fluid collaborative tasks that are safer, more cost-effective and efficient.”
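
In engineering terms, what Mutlu describes is a perceive-predict-plan loop. The sketch below is only an illustration of that idea; the observation fields, the rule-based predictor and the action table are assumptions made for clarity, not any lab’s or vendor’s actual system.

```python
# Hypothetical perceive-predict-plan loop for a collaborative robot.
# All names and rules here are illustrative assumptions, not a real robot API.
from dataclasses import dataclass


@dataclass
class WorkerObservation:
    gaze_target: str      # where the person is looking, e.g. "parts_bin" or "fixture"
    reaching_toward: str  # object the person's hand is moving toward
    task_step: int        # current step in a known assembly sequence


def predict_next_action(obs: WorkerObservation) -> str:
    """Tiny stand-in for a learned intent model: map observed cues to a likely next action."""
    if obs.gaze_target == "parts_bin" and obs.reaching_toward == "parts_bin":
        return "fetch_part"
    if obs.gaze_target == "fixture":
        return "fasten_part"
    return "wait"


def plan_robot_behavior(predicted_action: str) -> str:
    """Pick a complementary robot behavior so the handoff stays fluid and safe."""
    return {
        "fetch_part": "pre-position the gripper near the parts bin",
        "fasten_part": "hold the workpiece steady",
        "wait": "stay clear of the shared workspace",
    }[predicted_action]


if __name__ == "__main__":
    obs = WorkerObservation(gaze_target="parts_bin", reaching_toward="parts_bin", task_step=3)
    action = predict_next_action(obs)
    print(f"Predicted worker action: {action}; robot will {plan_robot_behavior(action)}")
```

In a real system the hand-written rules would be replaced by a model trained on sensor data, but the loop structure (observe, predict, plan) stays the same.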

Learning Something New Every Day

AI and machine learning (ML) are crucial to making sense of the massive quantity of images that warehouse robots ingest in real time.

“It’s a combination of sensing and machine learning with that sensor data,” Porter said.

“Machine learning is being used to determine the pose or motion of a human or their facial expressions or gaze tracking.”
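
As a concrete, minimal example of the sensing-plus-machine-learning combination Porter describes, widely available libraries such as OpenCV and MediaPipe can estimate a person’s body pose from an ordinary camera feed. This is a sketch under the assumption of a single webcam; it is not the perception stack used by Collaborative Robotics or Amazon.

```python
# Minimal body-pose sketch using OpenCV for video capture and MediaPipe's Pose model.
# Illustrative only; an industrial perception stack would add depth sensing, tracking, etc.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
cap = cv2.VideoCapture(0)  # default webcam

with mp_pose.Pose(min_detection_confidence=0.5, min_tracking_confidence=0.5) as pose:
    for _ in range(300):  # process roughly ten seconds of frames, then stop
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures frames in BGR.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # Track the right wrist as a rough proxy for a reaching motion.
            wrist = results.pose_landmarks.landmark[mp_pose.PoseLandmark.RIGHT_WRIST]
            print(f"Right wrist (normalized image coords): x={wrist.x:.2f}, y={wrist.y:.2f}")

cap.release()
```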

Thanks to that machine learning, robots are constantly improving their ability to understand their co-workers and complete both autonomous and collaborative tasks.

“There are systems that continuously learn and adapt, and the new observations that they detect can be used to further their training and optimization,” Dr. Mutlu said. 

“You can also take the robot and move its arm, demonstrate the task and let it repeat that task so it will learn from that demonstration. Or you might use a video of a worker doing the task, captured from the robot’s camera. If you provide enough data to the robot, it can recognize different aspects of a person’s actions and know what it needs to do. That is when you use AI and machine learning methods to train the robot.”
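
The kinesthetic teaching Mutlu mentions, physically guiding the arm and letting the robot repeat the motion, can be pictured as recording joint positions during the demonstration and replaying them afterward. The RobotArm interface below is hypothetical, standing in for whatever vendor SDK or ROS driver a real arm exposes.

```python
# Record-and-replay sketch of learning from demonstration.
# RobotArm is a hypothetical interface; real arms expose vendor-specific SDKs or ROS drivers.
import time
from typing import List, Protocol


class RobotArm(Protocol):
    def read_joint_positions(self) -> List[float]: ...
    def move_to_joint_positions(self, positions: List[float]) -> None: ...
    def set_gravity_compensation(self, enabled: bool) -> None: ...


def record_demonstration(arm: RobotArm, duration_s: float, hz: float = 20.0) -> List[List[float]]:
    """Let a worker guide the arm by hand while its joint angles are sampled."""
    arm.set_gravity_compensation(True)  # arm goes limp so a person can move it freely
    trajectory: List[List[float]] = []
    for _ in range(int(duration_s * hz)):
        trajectory.append(arm.read_joint_positions())
        time.sleep(1.0 / hz)
    arm.set_gravity_compensation(False)
    return trajectory


def replay_demonstration(arm: RobotArm, trajectory: List[List[float]], hz: float = 20.0) -> None:
    """Repeat the demonstrated motion waypoint by waypoint."""
    for waypoint in trajectory:
        arm.move_to_joint_positions(waypoint)
        time.sleep(1.0 / hz)
```

A video-based demonstration, as Mutlu also describes, would replace the hand-guided recording with pose estimates like those in the earlier sketch.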

Behind these important, incremental improvements is cloud computing.

“Cloud-based services are unavoidable,” Dr. Mutlu said. “They’re really becoming integral to this kind of work.”

‘A Little Like Star Wars’

This robotics evolution continues to transform industries, not only by supercharging efficiency but also by protecting workers’ jobs and well-being. With their newfound ability to understand and react to cues from their human colleagues, machines are paving the way for a more collaborative future that extends far beyond factories and warehouses.

“Down the road, their capabilities will extend to completely open-ended environments where robots can help people without knowing who the person is or what their task is,” Dr. Mutlu said.

When that happens, robotics enthusiasts insist, everybody’s lives will be easier.

“We are going to increasingly see robots be a part of our everyday lives, working in and around us, picking up trash along the roadside or collecting shopping carts in a grocery store parking lot,” Porter said. 

“Increasingly, these capabilities are becoming more and more available. I sometimes think of the future as being a little like Star Wars, where you see humans and robots walking alongside each other and it seems very natural.”

Editor’s note: Learn more about Nutanix GPT-in-a-Box, a full-stack software-defined AI-ready platform designed to simplify and jump-start your initiatives from edge to core. More details are available in the blog post “The AI-Ready Stack: Nutanix Simplifies Your AI Innovation Learning Curve” and in the Nutanix Bible.

Chase Guttman is a technology writer. He’s also an award-winning travel photographer, Emmy-winning drone cinematographer, author, lecturer and instructor. His book, The Handbook of Drone Photography, was one of the first written on the topic and received critical acclaim. Find him at chaseguttman.com or @chaseguttman

© 2023 Nutanix, Inc. All rights reserved.