1. Field of the Invention
The present invention relates to an autofocusing apparatus of a camera and an autofocusing method thereof, and more particularly, to an autofocusing apparatus which divides the central region of a picture or a window setting region into N regions and adjusts the autofocus for each region so that the time required for autofocusing can be shortened, and an autofocusing method thereof.
2. Description of the Related Art
Recently, as information technology has developed rapidly, there has been a demand for composite mobile communication terminals in which various functions are added, beyond mobile communication terminals that transmit only a voice.
Accordingly, such a composite mobile communication terminal, in which a function of receiving and transmitting an image and a function of receiving and transmitting a voice are provided simultaneously, is being implemented. As one such composite mobile communication terminal, there is provided a camera phone in which a digital camera function is implemented.
When a user wants to record a certain scene, the user can take a picture by using a camera phone and store it therein.
The taken picture can be transmitted wirelessly to another mobile communication terminal. Further, the picture can be output on the screen of a PC or can be stored in the PC.
Recently, mobile communication terminals (so-called TV phones) have been developed which can receive and output a TV broadcast program, can download information through a connection to the Internet, and can display a moving image. Further, a next-generation mobile communication terminal is being developed which can perform all of the above-described functions.
A general camera phone is composed of a camera module which takes a picture, a transmission module which transmits a voice and/or an image, and a reception module which receives a voice and/or an image.
The camera module includes a lens sub system and an image processing sub system.
The lens sub system includes a lens section composed of a zoom lens and a focus lens, an actuator for driving the zoom or focus lens of the lens section, and an actuator driver.
The image processing sub system includes an image sensor, an image signal processor (hereinafter referred to as an ISP), an autofocus digital signal processor (hereinafter referred to as a DSP), and the like.
The lens sub system adjusts the focus on an external scene to be photographed and causes light from a specific region of the scene, the range of which is determined, to be incident on the image sensor.
The image sensor of the image processing sub system is composed of photo-cells in which electric charges are stored as light is incident during a specific absorption period, and converts the stored electric charges into digital values (pixel values) to output.
The ISP of the image processing sub system compresses the digital values with respect to the secured pixels, performs image processing such as scaling and image enhancement, and transmits the processed digital values to a mobile phone main body.
In this case, the lens sub system adjusts the focus of a lens in order to take a clear image. At this time, an autofocusing apparatus provided in a general camera or digital camera is used as it is, which is briefly described as follows.
In the auto-focusing apparatus of a general camera or digital camera, when a user sets a composition with respect to an object to be photographed and presses a release button, the focus is automatically adjusted so that photographing is performed.
Such an auto-focusing apparatus is roughly divided into an active type and a passive type.
The active-type auto-focusing apparatus emits infrared rays or ultrasonic waves toward an object and detects the light or waves reflected from the object so as to measure the distance to the object.
The passive-type auto-focusing apparatus, which does not have a light emitting section, receives light coming from an object through a lens section and determines the distance to the object by using the brightness/darkness difference of the object.
In other words, the passive-type auto-focusing apparatus detects, for each frame of the image signals coming from an image sensor, a high frequency signal which is proportional to the contrast obtained when a brightness signal passes through a high pass filter. The obtained contrast is compared with the contrast of the previous frame, so that the focus lens is moved in the direction in which the contrast becomes larger, and the movement of the focus lens is stopped at the spot where the contrast is the largest. Then, the focus is automatically adjusted.
In a general autofocus camera module, an image received through a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor is processed by an ISP, and a focus value, calculated from the edges obtained when the processed image is passed through a high pass filter (HPF), is extracted for each picture and transmitted to a central processing unit (CPU). At this time, the CPU determines the moving direction and distance of the focus lens on the basis of the calculated focus value and gives an instruction to an actuator driver. Then, an actuator is driven to move the lens, so that the focus is automatically adjusted.
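The per-picture focus value extraction described above can be sketched in code. The following is a minimal illustration only, assuming a simple horizontal first difference as the high pass filter and squared edge strength as the quantity accumulated over the window; the function name and window format are hypothetical, not part of any actual DSP interface.

```python
import numpy as np

def focus_value(frame, window):
    """Compute a contrast-based focus value for one picture.

    frame:  2-D array of luminance (brightness) pixel values
    window: (row_start, row_end, col_start, col_end) of the AF window
    """
    r0, r1, c0, c1 = window
    region = frame[r0:r1, c0:c1].astype(np.float64)
    # Simple high pass filter: horizontal first difference extracts edges
    edges = np.diff(region, axis=1)
    # Integrator: accumulate squared edge strength over the window,
    # which rewards sharp transitions over gradual ones
    return float((edges ** 2).sum())
```

An in-focus picture concentrates its brightness transitions into sharp edges, so it yields a larger accumulated value than a defocused picture of the same scene.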
FIG. 1 is a diagram showing a window 101 within a picture 100. As shown in FIG. 1, the central portion of a screen is generally set to the window 101. This is because most users taking a picture pay attention to the central portion of a screen.
Further, the start and end positions of the window 101 are transmitted from the autofocus DSP to set the window 101 within the picture 100, and the high pass filter outputs of the window 101 set in such a manner are accumulated by an integrator.
The accumulated focus value becomes a reference for adjusting the focus in a camera module. In the case of a still image, a lens is moved to adjust the focus. Even for the same image, a high focus value is obtained when the camera is in complete focus, while a low focus value is obtained when the camera is out of focus. In general, the focus of a camera is adjusted on the basis of the center, to which most users pay attention.
The algorithm for finding a focus value is performed by a CPU within an autofocus DSP. The CPU determines which direction to move a lens so as to drive an actuator through an actuator driver.
FIG. 2 is a graph showing a focus value in accordance with a lens moving distance.
Although the same image is input to a camera, a low focus value is obtained, as at a spot A, when the camera is not in focus. At this time, the moving direction of the lens is determined at a spot B so as to move the lens in a direction C. When the lens moving in the direction C passes a spot E having the maximum focus value, the lens is moved in a direction D, which is the reverse of the direction C, and is fixed at the spot E, thereby finding the maximum value.
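The search described with reference to FIG. 2 amounts to a hill climb over lens positions: probe one step to pick a direction, advance while the focus value rises, and stop one step after passing the peak. The sketch below illustrates this under the assumption of a single-peaked focus curve; the `measure` callable stands in for a real focus value readout and is purely hypothetical.

```python
def autofocus_hill_climb(measure, pos, step, max_steps=100):
    """Move the lens toward increasing focus value and settle at the
    position of the maximum (spot E in FIG. 2).

    measure: function mapping a lens position to a focus value
    pos:     starting lens position (spot A)
    step:    lens movement per picture
    """
    value = measure(pos)
    # Probe one step to decide the initial direction (spot B)
    if measure(pos + step) < value:
        step = -step  # focus value fell, so climb the other way
    for _ in range(max_steps):
        nxt = measure(pos + step)
        if nxt < value:
            # Passed the maximum: stay at the best position found
            return pos
        pos, value = pos + step, nxt
    return pos
```

Each call to `measure` costs one picture, which is why the frame rate directly bounds how fast this search can run.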
In the related art, the focus value is calculated for each picture. This is because the total value of the edge components of the window in which a user shows an interest is output per picture.
Therefore, in the conventional process in which the maximum value is found, the following process is repeated: the focus values of pictures are respectively calculated, the direction is determined on the basis of the calculated focus values, and the lens is moved in that direction.
Accordingly, the faster the frame rate, the shorter the autofocusing time, the frame rate meaning the number of pictures shown per unit time.
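Because one focus value is obtained per picture, the autofocusing time is roughly the number of pictures the search consumes divided by the frame rate. The illustration below uses a hypothetical search consuming 20 pictures; the figure of 20 is an assumption for arithmetic only.

```python
def autofocus_time(num_pictures, frame_rate):
    """Autofocusing time in seconds: one focus value per picture."""
    return num_pictures / frame_rate

# Hypothetical search consuming 20 pictures: a 60 fps CCD finishes in
# half the time a 30 fps CMOS sensor needs for the same search.
ccd_time = autofocus_time(20, 60)   # about 0.33 s
cmos_time = autofocus_time(20, 30)  # about 0.67 s
```

The same lens search therefore takes twice as long on a 30 fps sensor as on a 60 fps sensor, which is the slowdown the following paragraphs describe for CMOS sensors.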
Meanwhile, as an image sensor which can be used in a conventional camcorder or digital camera, there are provided a CCD (Charge Coupled Device) or CMOS (Complementary Metal Oxide Semiconductor) sensor.
Here, the CCD image sensor, in which a plurality of microminiature metal electrodes are arranged on a silicon wafer, is composed of multiple photodiodes. When light is applied to the CCD image sensor, optical energy is converted into electric charge. The electric charge generated by the photodiode provided in each pixel is transmitted to an amplifier through a vertical transmission CCD and a horizontal transmission CCD by using a high potential difference. Therefore, although the power consumption of the CCD image sensor is large, the CCD image sensor is so resistant to noise that amplification is performed uniformly.
On the other hand, the CMOS image sensor, in which a photodiode and an amplifier are installed in each pixel, has lower power consumption and is smaller in size than the CCD image sensor. However, the CMOS image sensor has lower image quality.
Therefore, in a conventional camcorder or digital camera, the CCD image sensor, which is resistant to noise and has a high image quality, has been frequently used. Since the CCD image sensor has a fast frame rate, in which 50 to 60 VGA (640×480) or SD (720×480) pictures are shown per second, it is possible to find the maximum value within a considerably short time.
However, as the image quality of the CMOS image sensor is improved, the CMOS image sensor, which has low power consumption and is favorable to miniaturization, is increasingly used in mobile phones, smart phones, PDA and the like. Accordingly, a time for finding the maximum focus value, that is, the autofocusing time is lengthened.
In other words, the frame rate of the CMOS image sensor is as slow as 30 pictures per second, and users demand an image quality with a higher resolution. Therefore, the frame rate of the CMOS image sensor becomes even slower, so that the autofocusing time is considerably lengthened.