CLEAR: A Consistent Lifting, Embedding, and Alignment Rectification Algorithm for Multi-Agent Data Association

A fundamental challenge in many robotics applications is to correctly synchronize and fuse observations across a team of sensors and agents to form a collective representation of the environment. A recent paper by members of the DCIST alliance develops a framework that enables the simultaneous fusion of sub-representations across multiple agents. This development was motivated by two shortcomings of existing solutions: 1) the high computational complexity of optimization-based methods; 2) violation of the “cycle consistency” principle in the presence of noise, i.e., the requirement that composing pairwise associations along any cycle of agents must return the identity mapping. A violation of this principle indicates an error in the alignment of representations; moreover, in the context of fusion, solutions that violate it inherently cannot prescribe a valid universal representation. Any naive post-hoc attempt to make such solutions cycle consistent can drastically reduce the fusion accuracy.
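
To make the cycle consistency condition concrete, the toy sketch below checks it for three agents, with pairwise associations written as 0/1 permutation matrices; the variable names are illustrative and not taken from the paper.

    import numpy as np

    # Toy check of cycle consistency. P_ab[i, j] = 1 means item i in
    # agent a's local map is associated with item j in agent b's map.
    P_ab = np.array([[0, 1], [1, 0]])
    P_bc = np.array([[1, 0], [0, 1]])
    P_ac = np.array([[0, 1], [1, 0]])

    # Composing associations around the cycle a -> b -> c must agree
    # with the direct association a -> c; otherwise the alignment
    # contains an error somewhere along the cycle.
    print("cycle consistent:", np.array_equal(P_ab @ P_bc, P_ac))  # True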

The proposed approach reformulates the alignment problem in a graph-theoretic framework, where a novel spectral graph clustering technique reconstructs the universal domain from the spectrum of the graph Laplacian matrix. Numerical experiments verified that the proposed algorithm provides consistent solutions even in challenging high-noise regimes, outperforms existing cycle-consistent algorithms in terms of precision and recall, and has a lower computational complexity than state-of-the-art optimization-based methods. The resulting general framework is expected to provide significant improvements in the accuracy and efficiency of DCIST tasks such as multi-agent learning, inference, and planning.
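
As a rough illustration of the spectral recipe described above (not the paper's exact CLEAR procedure), the sketch below forms a normalized graph Laplacian from a noisy association graph, estimates the size of the universal domain from the number of small eigenvalues, and clusters observations in the corresponding eigenvector embedding. The 0.5 eigenvalue threshold and the k-means step are assumptions of this sketch.

    import numpy as np
    from scipy.cluster.vq import kmeans2

    def spectral_associate(A, tol=0.5):
        """A: symmetric 0/1 adjacency of the association graph, where
        observations are nodes and an edge marks a (noisy) association."""
        d = np.maximum(A.sum(axis=1), 1e-12)       # node degrees (guard zeros)
        D = np.diag(1.0 / np.sqrt(d))
        L = np.eye(len(A)) - D @ A @ D             # normalized graph Laplacian
        evals, evecs = np.linalg.eigh(L)           # eigenvalues in ascending order
        k = int(np.sum(evals < tol))               # universe size from the spectrum
        U = evecs[:, :k]                           # embed nodes in the low eigenspace
        U /= np.linalg.norm(U, axis=1, keepdims=True) + 1e-12
        _, labels = kmeans2(U, k, minit="++", seed=0)
        return k, labels                           # same label = same universal item

A notable design choice in this style of method is that the number of clusters is read off the Laplacian spectrum rather than fixed in advance, so the size of the universal domain can be recovered directly from the data.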

Source: K. Fathian, P. Lusk, Y. Tian, J. P. How, “CLEAR: A Consistent Lifting, Embedding, and Alignment Rectification Algorithm for Multi-Agent Data Association,” Technical Report (arXiv), submitted to Robotics: Science and Systems (RSS) 2019.

Task: RA1.B1: Distributed Learning, Inference & Planning

Points of Contact: Kaveh Fathian, Jonathan How