as reported by tech columnist Dan Lyons in Newsweek. What they found was that people need more than the rote performance of tasks from robots; it helps to have small extra signals that communicate, in human terms, what's going on. My favorite example: "To figure out how to open a door, the robot will simply stand in front of the door, not moving, just scanning the surface with its cameras. To a human, the machine seems to be stuck in one place. But if engineers make the robot’s head move up, down, left, and right while it is scanning, humans understand that the robot is trying to figure out how it works. The movement is unnecessary, but it helps humans recognize what the robot is doing, a trick that animators call 'readability.'"

In fact, more than being merely unnecessary, that movement likely makes the engineering task even harder. But because it helps the robot function in its context around humans, it's beneficial to the overall design: this is precisely where human-centered design and task-focused engineering meet, and hopefully find a way to get along!