Bayesian perspective for the registration of multiple 3D views

X. Mateo, X. Orriols, X. Binefa

    Research output: Contribution to journal › Article › Research › peer-review

    12 Citations (Scopus)


    The registration of multiple 3D structures in order to obtain a complete representation of a scene is a long-studied problem. Even when the individual pairwise registrations are nearly correct, concatenating them along a cycle typically yields an unsatisfactory final result because small errors accumulate. The situation is worse still when, in addition, some pairwise correspondences between the views are incorrect. In this paper, we embed the problem of global multi-view registration into a Bayesian framework by means of an Expectation-Maximization (EM) algorithm, in which pairwise correspondences are treated as missing data and are therefore inferred through a maximum a posteriori (MAP) process. The proposed formulation simultaneously accounts for uncertainty in the pairwise correspondences and for measurement noise, yielding a final result that outperforms other state-of-the-art algorithms in accuracy and robustness. Experimental results include a reliability analysis of the proposed algorithm with respect to the percentage of a priori incorrect correspondences and their effect on the global registration estimate. This analysis compares current state-of-the-art global registration methods with our formulation, showing that the Bayesian formulation reaches configurations with a lower minimum of the global cost function. © 2013 Elsevier Inc. All rights reserved.
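The core idea in the abstract — treating point correspondences as missing data and alternating between inferring them (E-step) and re-estimating the rigid transform (M-step) — can be illustrated for a single pair of views. The sketch below is a minimal, generic EM alignment with soft Gaussian correspondences and a weighted Procrustes (SVD) update; it is not the paper's multi-view algorithm, and all function and variable names are our own illustrative choices.

```python
import numpy as np

def em_rigid_align(X, Y, iters=50, sigma2=1.0):
    """Rigidly align source points Y (M x d) to target points X (N x d),
    treating the unknown correspondences as missing data (EM).

    E-step: soft correspondence probabilities under an isotropic
            Gaussian noise model with variance sigma2.
    M-step: closed-form rotation/translation from weighted Procrustes.
    A toy sketch of the EM idea, not the paper's method.
    """
    d = X.shape[1]
    R, t = np.eye(d), np.zeros(d)
    for _ in range(iters):
        Yt = Y @ R.T + t                       # current transform applied to Y
        # E-step: responsibilities P[i, j] ~ p(x_i corresponds to y_j)
        d2 = ((X[:, None, :] - Yt[None, :, :]) ** 2).sum(-1)   # N x M squared dists
        P = np.exp(-d2 / (2.0 * sigma2))
        P /= P.sum(axis=1, keepdims=True) + 1e-12
        # M-step: weighted centroids and cross-covariance (Kabsch/Procrustes)
        w = P.sum()
        mx = (P.sum(1) @ X) / w                # weighted centroid of targets
        my = (P.sum(0) @ Y) / w                # weighted centroid of sources
        S = (X - mx).T @ P @ (Y - my)          # d x d cross-covariance
        U, _, Vt = np.linalg.svd(S)
        D = np.eye(d)
        D[-1, -1] = np.linalg.det(U @ Vt)      # enforce a proper rotation
        R = U @ D @ Vt
        t = mx - R @ my
        # update the noise variance from the soft assignments
        sigma2 = max((P * d2).sum() / (w * d), 1e-4)
    return R, t
```

The variance update makes the responsibilities sharpen as the fit improves, so the soft assignments converge toward hard correspondences; this robustness to initially wrong matches is, at a high level, what the abstract's treatment of correspondences as missing data buys.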
    Original language: English
    Pages (from-to): 84-96
    Journal: Computer Vision and Image Understanding
    Publication status: Published - 1 Jan 2014


    • 3D registration
    • Expectation Maximization algorithm
    • Pairwise correspondences


