Engineering Microsoft

Microsoft teaches robots how to deal with groups and draw from memory

As reported on Engadget.

BY JON FINGAS

One of Microsoft's Situated Interaction robots

We humans are good at predicting how people will behave, particularly in groups, but artificial intelligence routines still have trouble dealing with much more than controlled, one-on-one discussions. They’ll be far more flexible if Microsoft’s Situated Interaction project pays off, though. The research initiative has produced sensor-equipped robots that can not only recognize multiple people but also infer their objectives and roles. Office assistants can tell who’s speaking and respond in kind, while a lobby robot can call the elevator if it sees a crowd headed in that direction.

Some of the robots also have a human-like ability to draw from memory, expanding on what we’ve seen from virtual assistants like Cortana or Siri. In addition to knowing your schedule, they can detect your presence and predict your availability based on your habits; they’ll know if you haven’t been around in a while, or when you’re likely to wrap up a conversation. One robot will even know that you’re coming because you asked one of its fellow machines for directions. It’s doubtful that you’ll see production versions of these context-aware robots any time soon, but they could lead to a generation of smart devices that are better at coordinating with (and relating to) their human counterparts.
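The article doesn’t say how these habit-based predictions are made, but a minimal sketch of the general idea (estimating availability from a log of past presence observations) might look like the toy Python example below. The data, function names, and threshold are illustrative assumptions, not part of Microsoft’s actual Situated Interaction system.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical presence log: (timestamp, was_present) observations.
presence_log = [
    (datetime(2014, 4, 7, 9, 15), True),
    (datetime(2014, 4, 7, 12, 30), False),
    (datetime(2014, 4, 8, 9, 5), True),
    (datetime(2014, 4, 8, 13, 0), False),
    (datetime(2014, 4, 9, 10, 45), True),
]

def availability_by_hour(log):
    """Fraction of logged observations in each hour of the day
    during which the person was present."""
    seen = defaultdict(int)
    present = defaultdict(int)
    for timestamp, was_present in log:
        seen[timestamp.hour] += 1
        if was_present:
            present[timestamp.hour] += 1
    return {hour: present[hour] / seen[hour] for hour in seen}

def likely_available(log, when, threshold=0.5):
    """Predict availability at a given time from hourly habit statistics;
    return None when there is no history for that hour."""
    stats = availability_by_hour(log)
    if when.hour not in stats:
        return None
    return stats[when.hour] >= threshold

print(likely_available(presence_log, datetime(2014, 4, 10, 9, 30)))   # True
print(likely_available(presence_log, datetime(2014, 4, 10, 12, 15)))  # False
```

A real system would of course fuse far richer signals (calendar data, speech, sensor input), but the sketch shows the basic habit-from-history idea the article describes.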