Many vision-based advanced driver assistance system (ADAS) algorithms require information about the orientation, in particular the pitch and roll angles, of the vehicle-mounted camera relative to the road surface. For instance, forward collision warning (FCW) systems determine the distance between the ego-vehicle and a preceding vehicle. In order to calculate this distance, the orientation of the ego-vehicle relative to the road surface has to be known. If the camera orientation is not known with sufficient accuracy, FCW algorithms generate incorrect output signals, which can distract the driver or provoke risky steering or braking reactions.
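To illustrate how strongly a distance estimate depends on the camera orientation, consider the common flat-road model: a pinhole camera at height h above a planar road maps a ground point imaged y pixels below the horizon to the distance d = f * h / y. The following sketch uses hypothetical values (f, h, and the horizon row are assumptions, not values from this work) to show how a one-degree pitch error, which shifts the horizon line, distorts the estimated distance.

```python
import numpy as np

# Illustrative flat-road distance model (all numbers are assumptions):
f = 1000.0         # focal length in pixels
h = 1.3            # camera height above the road in metres
y_horizon = 500.0  # image row of the horizon for the calibrated pitch

def ground_distance(y_row, horizon_row):
    """Distance to a road point imaged at y_row, assuming a planar road."""
    return f * h / (y_row - horizon_row)

# Distance to a road point imaged at row 560 with correct calibration:
d_calibrated = ground_distance(560.0, y_horizon)        # about 21.7 m

# A 1-degree pitch error shifts the horizon by roughly f * tan(1 deg),
# i.e. about 17 pixels here, and inflates the estimated distance:
horizon_shift = f * np.tan(np.radians(1.0))
d_pitch_error = ground_distance(560.0, y_horizon + horizon_shift)
```

With these numbers the one-degree pitch error changes the estimated distance from roughly 22 m to over 30 m, which is why FCW-style algorithms need the instantaneous camera orientation and not just a static calibration.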
It is common practice to determine a static camera orientation, the extrinsic calibration, for a stationary vehicle with the help of a calibration pattern. While the car is moving, however, the orientation of the camera relative to the road surface is no longer static but time-dependent. Braking and acceleration maneuvers as well as road surface irregularities cause fast and significant changes in camera orientation. Furthermore, additional load can cause a long-term orientation offset that differs from the static calibration.
The orientation of the camera relative to the road surface can be described by a homography, a projective transformation between two images of the same plane (here the road surface) captured from two different camera positions. Extracting angle information directly from the homography matrix is possible, but not recommended due to parameter ambiguities.
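The planar homography relation mentioned above can be sketched as follows. For a plane with unit normal n at distance d in the first camera's frame, and a second camera pose given by rotation R and translation t, the two views of the plane are related by H = K (R + t n^T / d) K^{-1}. The numbers below (intrinsics, pitch change, camera height) are illustrative assumptions, not values from this work; the sketch verifies that H maps the image of a road point in the first view onto its image in the second view.

```python
import numpy as np

# Assumed camera intrinsics (illustrative only).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

# Assumed motion between the two poses: a small pitch change plus a
# forward translation of the camera.
pitch = np.radians(2.0)
R = np.array([[1.0, 0.0,            0.0],
              [0.0, np.cos(pitch), -np.sin(pitch)],
              [0.0, np.sin(pitch),  np.cos(pitch)]])
t = np.array([[0.0], [0.0], [0.5]])

# Road plane in the first camera's frame: n^T X = d, with the image
# y-axis pointing downwards so road points satisfy y = +d.
n = np.array([[0.0], [1.0], [0.0]])
d = 1.3  # assumed camera height above the road in metres

# Homography induced by the road plane between the two views.
H = K @ (R + (t @ n.T) / d) @ np.linalg.inv(K)

# A road point on the plane, its image in view 1, and in view 2.
X1 = np.array([1.0, 1.3, 10.0])          # satisfies n^T X1 = d
x1 = K @ X1; x1 = x1 / x1[2]
X2 = R @ X1 + t.flatten()
x2 = K @ X2; x2 = x2 / x2[2]

# H maps the first image point onto the second (up to scale).
x1_mapped = H @ x1
x1_mapped = x1_mapped / x1_mapped[2]
```

Decomposing such an H back into R, t, n, and d generally yields multiple mathematically valid solutions, which is one manifestation of the parameter ambiguities noted above.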