Supplementary material for IROS 2014 paper "Interactive Augmented Reality for Understanding and Analyzing Multi-Robot Systems"


Once a multi-robot system is implemented on real hardware and tested in the real world, analyzing its evolution and debugging unexpected behaviors is often very difficult. We present a tool to aid this activity by visualizing an Augmented Reality overlay on a live video feed acquired by a fixed camera overlooking the robot environment. This overlay displays live information exposed by each robot, which may be textual (state messages), symbolic (graphs, charts), or, most importantly, spatially-situated. Spatially-situated information relates to the environment surrounding the robot itself, such as the perceived positions of neighboring robots, the perceived extent of obstacles, or the path the robot plans to follow. We show that, by representing such information directly on the environment it refers to, our proposal removes a layer of indirection and significantly eases the process of understanding complex multi-robot systems. We describe how the system is implemented, discuss application examples in different scenarios, and provide supplementary material including demonstration videos and a functional implementation.
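Rendering spatially-situated information on the video feed requires mapping robot-reported coordinates (e.g. a planned path on the ground plane) into camera image pixels. The sketch below illustrates this with a planar homography; the homography values, function name, and path data are hypothetical and only stand in for a calibrated camera setup.

```python
import numpy as np

def project_ground_points(H, points_world):
    """Project 2-D ground-plane points (metres) into image pixels
    using a 3x3 ground-to-image homography H."""
    pts = np.asarray(points_world, dtype=float)       # (N, 2) world points
    homo = np.hstack([pts, np.ones((len(pts), 1))])   # homogeneous coords (N, 3)
    img = homo @ H.T                                  # apply the homography
    return img[:, :2] / img[:, 2:3]                   # dehomogenise -> pixel coords

# Toy homography (hypothetical calibration): 100 px per metre,
# world origin at pixel (320, 240), image y axis pointing down.
H = np.array([[100.0,    0.0, 320.0],
              [  0.0, -100.0, 240.0],
              [  0.0,    0.0,   1.0]])

# A robot's planned path in world coordinates (metres),
# as a robot might expose it to the visualization tool.
path = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
pixels = project_ground_points(H, path)  # polyline ready to draw on the frame
```

The resulting pixel polyline can then be drawn on the corresponding video frame, so the path appears anchored to the floor the robot will actually traverse.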

Example videos