CMU researchers are teaching a computer to understand body language
Robots might soon be able to tell if you flash a peace sign or flip the bird.
Researchers at Carnegie Mellon University have developed a computer that can understand and track body movements in real time, including individual fingers.
The computer could allow robots to read body language, helping them better perceive your mood and interact with you in social settings, said Yaser Sheikh, an associate professor of robotics leading the project. Robots could better understand what we mean when we point at something or make other hand gestures, such as putting a finger to our lips to tell someone to be quiet.
"We communicate almost as much with the movement of our bodies as we do with our voice," Sheikh said in a statement. "But computers are more or less blind to it."
The university believes this is the first time a computer has been able to track individual fingers. The system can also track several people at once.
Self-driving cars could detect when a person is about to step off the curb to cross a street by monitoring that person's body language. Coaches could track every player on the field, court or ice and tell what each is doing with their arms, legs, heads and hands.
The researchers released the code publicly so that other scientists can build on the work and find new applications. Sheikh said more than 20 companies and commercial groups, including automotive companies, have expressed interest in licensing the technology.
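For readers curious what working with the released code looks like: the software, distributed by CMU as the OpenPose project, can write its per-frame detections to JSON files that other programs can consume. The short Python sketch below parses one such file. The field names ("people", "pose_keypoints_2d") and the flat x, y, confidence layout follow that project's documented output format; the file name and the 0.5 confidence cutoff are illustrative assumptions, not values from the researchers.

```python
import json

# Minimal sketch: read one frame of keypoint JSON as written by CMU's
# released pose code (the OpenPose project). The 0.5 threshold and the
# file name below are assumptions for illustration.

CONFIDENCE_THRESHOLD = 0.5  # assumed cutoff for "reliable" detections


def load_keypoints(path):
    """Return, for each detected person, a list of (x, y, confidence)
    body keypoints, with low-confidence points filtered out."""
    with open(path) as f:
        frame = json.load(f)

    people = []
    for person in frame.get("people", []):
        # Keypoints are stored as a flat list: x1, y1, c1, x2, y2, c2, ...
        flat = person.get("pose_keypoints_2d", [])
        triples = [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]
        people.append([(x, y, c) for x, y, c in triples
                       if c >= CONFIDENCE_THRESHOLD])
    return people


if __name__ == "__main__":
    for i, kps in enumerate(load_keypoints("frame_000000_keypoints.json")):
        print(f"person {i}: {len(kps)} confident body keypoints")
```

Because each person in the frame gets their own keypoint list, a downstream program can follow several people at once, which is what makes the multi-person tracking the article describes usable in practice.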
The technology was developed using CMU's Panoptic Studio, a two-story dome that has about 500 video cameras. The studio can capture humans interacting and analyze the data.
Aaron Aupperlee is a Tribune-Review staff writer. Reach him at firstname.lastname@example.org, 412-336-8448 or via Twitter @tinynotebook.