A simple method for multiple-camera calibration based on a novel geometric derivation is presented. The main advantage of this method is that it requires only three points in the world coordinate system to achieve calibration. The rotation matrix and translation vector of each camera coordinate system are obtained from the given distances between the vertices of the marker triangle formed by the three points. The different views from the different cameras can therefore be converted into a single top view in the world coordinate system. Finally, the trajectories traced on the floor plane by tracked agents, observed from different viewpoints, can be matched in a joint scene plane. © 2007 Springer-Verlag Berlin Heidelberg.
Journal: Advances in Soft Computing
Publication status: Published - 1 Dec 2007
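The abstract describes recovering a rotation matrix and translation vector for each camera from three known marker points. The paper's own geometric derivation is not reproduced here, but a standard way to solve this three-point rigid-alignment problem is the SVD-based Kabsch/Procrustes method; the sketch below is illustrative (the function name and the example triangle are assumptions, not taken from the paper):

```python
import numpy as np

def rigid_transform(cam_pts, world_pts):
    """Estimate R, t such that world ~= R @ cam + t, given corresponding
    3D points (N >= 3, not collinear) via the Kabsch/Procrustes method."""
    cc = cam_pts.mean(axis=0)                 # centroid in camera frame
    wc = world_pts.mean(axis=0)               # centroid in world frame
    H = (cam_pts - cc).T @ (world_pts - wc)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = wc - R @ cc
    return R, t

# Synthetic check: a marker triangle in the world frame, viewed by a
# camera rotated about z and translated (both chosen arbitrarily here).
theta = 0.7
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
t_true = np.array([1.0, -2.0, 0.5])
triangle_world = np.array([[0.0, 0.0, 0.0],
                           [1.0, 0.0, 0.0],
                           [0.0, 2.0, 0.0]])
# Points as seen in the camera frame: c = R^T (w - t)
triangle_cam = (triangle_world - t_true) @ R_true
R_est, t_est = rigid_transform(triangle_cam, triangle_world)
```

With the estimated pose per camera, each view can be re-projected into the common world frame, which is the joint top view the abstract refers to.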