Laser-camera calibration


Hokuyo URG-04-LX & Firefly-MV
In my project I use a simple Firefly-MV camera and a laser range finder from Hokuyo. The laser I use now is the UTM-30-LX-EW instead of the previously used URG-04-LX. More details here. These two sensors are placed on a moving robot and are used in conjunction to detect pedestrians (and, in the future, other moving objects). Since we use data from both sensors to extract information about the presence of a pedestrian, we need an accurate way of correlating those two signals, both in space and in time. Solving the time issue is trivial with the use of timestamps, which give adequate accuracy for objects moving at relatively low speeds.
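In practice the time alignment is just nearest-timestamp matching: for each camera frame, pick the laser scan whose timestamp is closest. A minimal sketch (the function and variable names are my own, not from any particular driver):

```python
import bisect

def nearest_scan(frame_ts, scan_ts):
    """Index of the laser scan whose timestamp is closest to frame_ts.
    scan_ts must be sorted ascending (e.g. seconds since epoch)."""
    i = bisect.bisect_left(scan_ts, frame_ts)
    if i == 0:
        return 0                      # frame is before the first scan
    if i == len(scan_ts):
        return len(scan_ts) - 1       # frame is after the last scan
    # pick whichever neighbour is closer in time
    return i if scan_ts[i] - frame_ts < frame_ts - scan_ts[i - 1] else i - 1
```

At 30 fps and 10 Hz scans the worst-case mismatch is around 50 ms, which is acceptable for slow-moving pedestrians.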
The space correlation, however, is more complex. The camera signal is a projection of the 3D world onto a 2D plane, and the laser signal is a planar scan through that same world. So if you want to know which points correspond, you have to know the exact 'difference' between the points of view of the two sensors; that is, we need to know the exact transformation matrix between the laser and the camera. This process is called laser-camera calibration, and to do it we are going to use a neat Matlab toolbox developed by A. Kassir [3].
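Once that transformation is known, mapping a laser point into the image is just a rigid transform followed by a pinhole projection. A minimal sketch with made-up values for the rotation R, translation t and intrinsic matrix K (the real values are what the calibration procedure described here produces):

```python
import numpy as np

# Hypothetical calibration results, example values only:
# R, t map laser-frame points into the camera frame; K is the intrinsic matrix.
R = np.eye(3)                          # laser and camera axes aligned (assumption)
t = np.array([0.0, -0.10, 0.0])        # laser mounted 10 cm above the camera
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])

def laser_to_pixel(p_laser):
    """Project a 3D point given in the laser frame onto the image plane."""
    p_cam = R @ p_laser + t            # extrinsic (rigid) transform
    uvw = K @ p_cam                    # pinhole projection
    return uvw[:2] / uvw[2]            # perspective divide -> pixel (u, v)
```

A point 2 m straight ahead of the laser, for example, lands slightly above the principal point because of the 10 cm vertical offset.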
Figure1.
Detected corners on the checkerboard
First we begin by intrinsically calibrating the camera [1]. The optics and electronics of the camera are far from perfect, so some distortion is apparent. What we need to find are the distortion coefficients, so that we can later "repair" each camera image. To do this we use a checkerboard pattern with known dimensions. We use a checkerboard because it is easy to detect its corners and thus compute the plane on which they lie (see Fig. 1). For more accurate results we have to move the checkerboard into various positions and orientations (Fig. 2).
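To give an idea of what those coefficients do, here is a sketch of a two-coefficient radial distortion model and its inversion by fixed-point iteration. The Bouguet toolbox [1] uses a richer model (radial plus tangential terms), so treat this as an illustration only:

```python
def distort(xn, yn, k1, k2):
    """Apply two-coefficient radial distortion to normalized image coords."""
    r2 = xn * xn + yn * yn
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return xn * scale, yn * scale

def undistort(xd, yd, k1, k2, iters=20):
    """Invert the model by fixed-point iteration (there is no closed form)."""
    xn, yn = xd, yd
    for _ in range(iters):
        r2 = xn * xn + yn * yn
        scale = 1 + k1 * r2 + k2 * r2 * r2
        xn, yn = xd / scale, yd / scale
    return xn, yn
```

"Repairing" an image amounts to applying `undistort` to every pixel's normalized coordinates before reprojecting through the intrinsic matrix.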
Figure2.
The algorithm extracted the various positions of the checkerboard
After the camera is intrinsically calibrated, next comes the extrinsic laser-camera calibration, which will give us the translation and rotation between the two sensors. For this we have already recorded a laser scan for each camera photo grabbed. The laser segment that corresponds to the checkerboard is automatically extracted [3], and the problem left to solve is how to minimize the algebraic distance between the two measurements. This is a linear problem and can easily be solved with least squares. A non-linear method based on the plane orientation is also used, along with a global optimization step over all the poses of the checkerboard [2]. After we get our results we can superimpose laser scans on images.
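The linear step can be sketched on synthetic data. Each checkerboard pose gives a plane in the camera frame (unit normal n, distance d), and every laser point p = (x, y, 0) lying on the board must satisfy n·(Rp + t) = d. Stacking these equations is linear in the first two columns of R and in t; the third column is recovered as their cross product [2]. The code below is my own illustration of that idea, not the toolbox's implementation:

```python
import numpy as np

def random_rotation(rng):
    # QR decomposition of a random matrix yields a rotation matrix
    q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
    if np.linalg.det(q) < 0:
        q[:, 0] *= -1
    return q

rng = np.random.default_rng(0)
R_true = random_rotation(rng)
t_true = np.array([0.05, -0.10, 0.20])   # made-up ground-truth extrinsics

rows, rhs = [], []
n_planes = 0
while n_planes < 15:                     # simulate 15 checkerboard poses
    n_unit = rng.normal(size=3)
    n_unit /= np.linalg.norm(n_unit)     # plane normal in the camera frame
    d = rng.uniform(0.5, 3.0)            # plane distance from the camera
    a = n_unit @ R_true[:, 0]
    b = n_unit @ R_true[:, 1]
    if abs(a) + abs(b) < 0.2:            # board nearly parallel to scan plane
        continue
    # laser points (x, y, 0) hitting the board lie on the line a*x + b*y = c
    c = d - n_unit @ t_true
    p0 = c * np.array([a, b]) / (a * a + b * b)
    direc = np.array([-b, a]) / np.hypot(a, b)
    for s in np.linspace(-1.0, 1.0, 5):  # 5 laser returns per pose
        x, y = p0 + s * direc
        n = d * n_unit
        # constraint n·(x*r1 + y*r2 + t) = d^2, linear in (r1, r2, t)
        rows.append(np.concatenate([x * n, y * n, n]))
        rhs.append(d * d)
    n_planes += 1

sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
r1, r2, t_est = sol[:3], sol[3:6], sol[6:9]
R_est = np.column_stack([r1, r2, np.cross(r1, r2)])
```

With noiseless synthetic data the least-squares solution recovers the true transform exactly; on real data this linear estimate only seeds the non-linear refinement.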
Laser segments superimposed on the camera image
The calibration is not so easy to do, and the results sometimes fall short of what is really necessary. One detail I can't stress enough is the need for a really firm base that keeps the camera and the laser from moving in relation to each other. The good news is that if you have a 3D printer you can easily design and print your own base. We (Mario, that is) designed two bases, one for the URG-04-LX and one for the UTM-30-LX-EW; the camera is the Firefly-MV for both. You can download them here in IPT (Autodesk Inventor) and STL format, in case anyone finds them useful (yes, I'm referring to you, my future self).


[1] J.-Y. Bouguet. Camera Calibration Toolbox for Matlab, 2009.
[2] Q. Zhang and R. Pless. Extrinsic Calibration of a Camera and Laser Range Finder (improves camera calibration). In Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, 2004.
[3] A. Kassir and T. Peynot. Reliable Automatic Camera-Laser Calibration (Matlab toolbox), 2010.

Stathis Fotiadis 26 July 2012
