1. Field of the Invention
The present invention relates to a monitoring system, and more particularly to a monitoring system which can be mounted on a vehicle and which can adjust an exposure amount of a camera that captures an image of the surroundings of a subject vehicle.
2. Description of the Related Art
In recent years, technologies have been under development in which a three-dimensional object existing on the periphery of a vehicle such as a passenger vehicle is detected by analyzing an image captured by a CCD (Charge Coupled Device) camera (refer, for example, to JP-A-07-225892). These technologies are applied, for example, to safe-driving technologies for vehicles in which the possibility of a collision between the vehicle and the detected three-dimensional object is judged, so as to activate an audible warning system to alert the driver, or to activate automatic steering or automatic brake control to avoid the possible collision.
In the detection of three-dimensional objects, it is important to accurately detect a preceding vehicle, its tail lamps, an oncoming vehicle, a pedestrian walking on the road, and other obstacles such as vehicles parked along the side of the road. In order to detect these three-dimensional objects on the road from an image captured by the camera, the exposure of the camera has to be implemented appropriately so as to capture an image which allows accurate detection of the three-dimensional objects.
Cameras such as the CCD camera normally have a function to automatically implement appropriate exposure adjustment. In addition, systems for adjusting the exposure of a camera have been proposed to cope with cases where the image capturing range of the camera is darkened or the brightness in the image capturing range changes largely, such as when the vehicle is driven at night or through a tunnel, or when the vehicle enters or exits a tunnel (refer, for example, to JP-A-07-081459 and JP-A-2005-148308).
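As a rough illustration of this kind of automatic exposure adjustment, one step of a simple metering loop can be sketched as follows. This is a minimal sketch only; the metering region, target luminance, and gain constant are illustrative assumptions and do not represent the methods of the cited publications.

```python
import numpy as np

def adjust_exposure(image, region, current_exposure, target_luma=118.0, gain=0.5):
    """One step of a simple automatic-exposure loop: measure the mean
    luminance inside a metering region and nudge the exposure toward a
    target value. All names and constants are illustrative."""
    y0, y1, x0, x1 = region
    mean_luma = float(image[y0:y1, x0:x1].mean())
    # Proportional control: scale exposure by the ratio of target to
    # measured luminance, damped by `gain` to avoid oscillation.
    ratio = target_luma / max(mean_luma, 1.0)
    new_exposure = current_exposure * (1.0 + gain * (ratio - 1.0))
    return new_exposure, mean_luma
```

A dark metering region (e.g. night-time or a tunnel) drives the exposure up, while a bright region (tunnel exit) drives it down, which corresponds to the behavior such systems aim for.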
Incidentally, according to the functions that the CCD camera normally possesses and the functions of the systems disclosed in JP-A-07-081459 and JP-A-2005-148308, appropriate exposure adjustment is implemented with respect to the brightness in the image capturing range of the camera or in an area set within an image captured by the camera. Consider, for example, a case where the subject vehicle is automatically controlled so as to follow a preceding vehicle based on information obtained by detecting the preceding vehicle with a camera. When driving at night or through a tunnel, light from a headlamp of an oncoming vehicle which comes to appear on the right-hand side (or the left-hand side in the United States of America and the like) of the preceding vehicle may then enter the camera of the subject vehicle.
As this occurs, even though the exposure of the camera is adjusted as described above, the light of the headlamp of the oncoming vehicle expands brightly and widely on an image captured by the camera and thereby interrupts the capture of the right-hand side of the preceding vehicle, whereby there may be occasions where the preceding vehicle is not detected, or where a large detection error is caused by the oncoming vehicle. In addition, in a case where the preceding vehicle is detected based on positional information on its tail lamps, as is shown in FIG. 12, which will be described later, the light of the tail lamp of the preceding vehicle and the light of the headlamp of the oncoming vehicle are captured as a single, integrated highly bright area. As a result, there is a risk that the tail lamp of the preceding vehicle cannot be captured accurately.
Furthermore, according to the system disclosed in JP-A-07-225892, two cameras are used to capture an image of the surroundings in front of a subject vehicle in a stereoscopic fashion, and a distance to a three-dimensional object existing in front of the subject vehicle is calculated by stereo matching to thereby detect the three-dimensional object. As is indicated by a shaded portion in FIG. 14, in this system there is produced a range which the right-hand side camera of the subject vehicle MC is able to capture but which lies invisible to the left-hand side camera, the view of that range from the left-hand side camera being interrupted by the preceding vehicle.
Then, when an oncoming vehicle Vonc exists in that range, as is shown in FIG. 15, light Lhead of a headlamp of the oncoming vehicle Vonc is captured by the right-hand side camera but is not captured by the left-hand side camera. Due to this, the luminance of the image captured by the right-hand side camera becomes higher as a whole around the light Lhead of the headlamp than that of the image captured by the left-hand side camera, resulting in cases where stereo matching, which is implemented based on the correlation between the luminances of the two captured images, may not be implemented accurately.
In addition, when attempting to implement stereo matching based on the two captured images, there may occur occasions where stereo matching cannot be implemented accurately, due to the influence of the light Lhead of the headlamp of the oncoming vehicle Vonc, in a right-hand side portion of the preceding vehicle Vah or in the position of the right-hand side tail lamp of the preceding vehicle Vah, whereby a large detection error may be generated in the distance from the subject vehicle MC to the preceding vehicle Vah which is detected based on the stereo matching.
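The luminance-correlation stereo matching referred to above, and the way a bright headlamp region appearing in only one of the two images can disturb it, can be sketched as follows. This is a minimal illustration using a sum-of-absolute-differences (SAD) block match under simplified assumptions; the function name and parameters are illustrative and do not represent the method of JP-A-07-225892.

```python
import numpy as np

def sad_disparity(left, right, x, y, block=5, max_disp=16):
    """Estimate the disparity of the block centered at (x, y) in the
    left image by minimizing the sum of absolute differences (SAD)
    against horizontally shifted candidate blocks in the right image
    of a rectified stereo pair."""
    h = block // 2
    ref = left[y - h:y + h + 1, x - h:x + h + 1].astype(np.int32)
    best_d, best_sad = 0, None
    for d in range(max_disp):
        if x - h - d < 0:  # candidate block would leave the image
            break
        cand = right[y - h:y + h + 1, x - h - d:x + h + 1 - d].astype(np.int32)
        sad = int(np.abs(ref - cand).sum())
        if best_sad is None or sad < best_sad:
            best_sad, best_d = sad, d
    return best_d
```

On a synthetic pair in which a feature appears shifted by a known disparity, the match recovers that disparity; if a wide saturated region (standing in for headlamp glare) is added to one image only, the luminance correlation around the feature is destroyed and the estimated disparity, and hence the computed distance, becomes erroneous, mirroring the failure described in this section.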