Stanford University is developing a robot that learns the unwritten rules of pedestrian behavior. Humans have an innate ability to "read" one another. When people walk in a crowded public space such as a sidewalk, an airport terminal, or a shopping mall, they obey a large number of unwritten, common-sense rules and comply with social conventions. For instance, as they consider where to move next, they respect personal space and yield right-of-way. The ability to model these "rules" and use them to understand and predict human motion in complex real-world environments is extremely valuable for the next generation of social robots.
The team, part of the Computational Vision and Geometry Lab, has already been working on computer vision algorithms that track pedestrians and aim to predict their movements. But the rules are complex and subject to many variations depending on the crowd, the width of the walkway, the time of day, and whether bikes or strollers are involved. Like any machine learning task, producing a useful result takes a lot of data.
Jackrabbot already moves autonomously and can navigate indoors without human assistance, and the team members are fine-tuning the robot's self-navigation capabilities outdoors. The next step in their research is the implementation of "social aspects" of pedestrian navigation, such as deciding rights of way on the sidewalk. This work, described in their newest conference papers, has been demonstrated in computer simulations. "We have developed a new algorithm that is able to automatically move the robot with social awareness, and we're currently integrating that in Jackrabbot," said Alexandre Alahi, a postdoctoral researcher in the lab. The complexity of the programming and machinery makes Jackrabbot very expensive, but the researchers are confident that within a few years a model may be available for around $500.
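Socially aware navigation of this kind is often illustrated in the robotics literature with social-force-style dynamics, in which an agent is attracted toward its goal and repelled from nearby people. The sketch below is purely illustrative and is not Jackrabbot's actual algorithm; the function name, gains, and the one-meter personal-space radius are invented for the example.

```python
import math

def social_force_step(pos, goal, pedestrians, dt=0.1,
                      goal_gain=1.0, repulse_gain=2.0, personal_space=1.0):
    """One step of a toy social-force update.

    The agent is pulled toward its goal and pushed away from any
    pedestrian closer than `personal_space`, mimicking the
    "respect personal space" convention described above.
    """
    # Attractive force: unit vector pointing at the goal.
    gx, gy = goal[0] - pos[0], goal[1] - pos[1]
    dist_goal = math.hypot(gx, gy) or 1e-9
    fx, fy = goal_gain * gx / dist_goal, goal_gain * gy / dist_goal

    # Repulsive forces: push away from pedestrians inside personal space,
    # growing stronger as the distance shrinks.
    for px, py in pedestrians:
        dx, dy = pos[0] - px, pos[1] - py
        d = math.hypot(dx, dy) or 1e-9
        if d < personal_space:
            strength = repulse_gain * (personal_space - d) / personal_space
            fx += strength * dx / d
            fy += strength * dy / d

    # Euler step along the net force.
    return (pos[0] + fx * dt, pos[1] + fy * dt)
```

With no one nearby the agent heads straight for its goal; a pedestrian slightly off the direct path makes it veer to the other side, a crude stand-in for yielding right-of-way. Real systems such as the one the lab describes learn these interaction terms from data rather than hand-tuning them.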
Those who have seen the Disney-Pixar robot WALL-E (pictured) may notice a resemblance.