%0 Conference Proceedings
%T Camera Egomotion Estimation in the ADAS Context
%A Diego Alejandro Cheda
%A Daniel Ponsa
%A Antonio Lopez
%B 13th International IEEE Annual Conference on Intelligent Transportation Systems
%D 2010
%@ 2153-0009
%@ 978-1-4244-7657-2
%F Diego Alejandro Cheda2010
%O ADAS
%O exported from refbase (http://refbase.cvc.uab.es/show.php?record=1425), last updated on Wed, 19 Feb 2014 10:57:45 +0100
%X Camera-based Advanced Driver Assistance Systems (ADAS) have attracted considerable research effort over the last decades. Proposals based on monocular cameras require knowledge of the camera pose with respect to the environment in order to achieve efficient and robust performance. A common assumption in such systems is that the road is planar and that the camera pose with respect to it is approximately known. However, in real situations, the camera pose varies over time due to vehicle movement, road slope, and irregularities in the road surface. Thus, the changes in camera position and orientation (i.e., the egomotion) are critical information that must be estimated at every frame to avoid poor performance. This work focuses on egomotion estimation from a monocular camera in the ADAS context. We review and compare egomotion estimation methods on simulated and real ADAS-like sequences. Based on the results of our experiments, we show which of the considered nonlinear and linear algorithms perform best in this domain.
%U http://refbase.cvc.uab.es/files/CPL2010b.pdf
%U http://dx.doi.org/10.1109/ITSC.2010.5625303
%P 1415–1420