"We are very interested in how human-driven vehicles and robots can coexist," says Daniela Rus, director of MIT's Computer Science and Artificial Intelligence Laboratory and a co-author of the paper. "This is a major challenge for the field of autonomy, and the question applies not only to robots on the road but to any kind of human-machine interaction." One day, work like this could help people collaborate more smoothly with robots, whether on a factory floor or in a hospital room.
But first, the game theory. The research builds on an approach that is increasingly common in robotics and machine learning: games let machines be "taught" to make decisions with incomplete knowledge.
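To make "decisions with incomplete knowledge" concrete, here is a toy sketch of one common ingredient of such approaches: updating a belief about another agent's hidden type from observed moves. The driver types, priors, and likelihoods below are invented for illustration and are not the paper's actual model.

```python
# Toy Bayesian update: infer whether an unseen driver is "selfish"
# from whether they yield at merges. All numbers are assumptions
# made up for this illustration, not taken from the research.

def update_belief(prior_selfish: float, yielded: bool) -> float:
    """Posterior probability the other driver is selfish,
    given one observed yield/no-yield decision."""
    # Assumed likelihoods: selfish drivers rarely yield,
    # cooperative drivers usually do.
    p_yield = {"selfish": 0.2, "cooperative": 0.8}
    obs = (lambda p: p if yielded else 1 - p)
    num = obs(p_yield["selfish"]) * prior_selfish
    den = num + obs(p_yield["cooperative"]) * (1 - prior_selfish)
    return num / den

belief = 0.5  # start undecided
for yielded in (False, False, True):  # observed behavior over time
    belief = update_belief(belief, yielded)
print(round(belief, 2))  # -> 0.8: two refusals outweigh one yield
```

The point of the sketch is only that each observation shifts the machine's estimate; the robot never knows the other driver's type for certain, which is exactly the incomplete-information setting game theory handles.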
Yet uncertainty is a challenge. "Ultimately, one of the challenges of self-driving is that you're trying to predict human behavior, and human behavior tends not to fall into the rational-agent models we have for game players," says Matthew Johnson-Roberson, an assistant professor of engineering at the University of Michigan and co-founder of Refraction AI, a startup building autonomous delivery vehicles. Someone might look as if they're about to merge, but then spot an opening out of the corner of their eye and stop instead. It's very hard to teach a robot to predict that kind of behavior.
Of course, driving situations could become less uncertain if the researchers could gather more information about human driving behavior, which they hope to do next. Data on vehicles' speed, heading, and change in position over time could all help robotic vehicles better model the workings of the human mind (and personality). Perhaps, the researchers say, an algorithm built on richer data could improve predictions about human driving by 50 percent rather than 25 percent.
That could be really difficult, says Johnson-Roberson, the Michigan professor. "One of the reasons I think deploying [autonomous vehicles] will be difficult is that you have to make those predictions correctly when you're traveling at high speed in dense urban areas," he says. Being able to tell that a driver is selfish within 2 seconds of observation is useful, but a car moving at 25 mph covers nearly 75 feet in that time. A lot of unfortunate things can happen in 75 feet.
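The "nearly 75 feet" figure follows from simple unit conversion, which a short snippet can check:

```python
# How far a car travels during the 2-second observation window
# the article describes. Only the 25 mph and 2 s figures come
# from the text; the conversion is standard (1 mile = 5280 ft).
MPH_TO_FT_PER_S = 5280 / 3600  # 1 mph ~= 1.467 ft/s

def distance_ft(speed_mph: float, seconds: float) -> float:
    """Feet traveled at a constant speed over the given time."""
    return speed_mph * MPH_TO_FT_PER_S * seconds

print(round(distance_ft(25, 2), 1))  # -> 73.3, i.e. "nearly 75 feet"
```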
And the fact is, even humans don't always understand other humans. "People are the way they are, and sometimes they're not focused on driving and make decisions that we can't fully explain," says Wilko Schwarting, the MIT graduate student who led the research. Good luck out there, robot.