1. Technical Field
The present invention generally relates to touch technologies and particularly to an apparatus and a method for acquiring an object image of a pointer.
2. Description of the Related Art
FIG. 1 is a perspective view of an optical touch system in the prior art. Referring to FIG. 1, the optical touch system 100 includes an apparatus 101 for acquiring an object image of a pointer, a panel 104, and reflectors 112 through 116. The apparatus 101 includes a processing circuit 110 and image sensors 106 and 108. Both image sensors 106 and 108 are used for acquiring an image of a touch surface 118 of the panel 104. The processing circuit 110 is electrically coupled to the image sensors 106 and 108 so as to receive the images sensed by them. In this system, the touch surface 118 is a quadrilateral area, preferably rectangular. The reflectors 112 through 116 are all used for reflecting light toward the touch surface 118, and none of them forms a mirror image of the touch surface 118. When a pointer 102 approaches the touch surface 118, the processing circuit 110 acquires a position of the pointer 102 according to the images sensed by the two image sensors 106 and 108.
FIG. 2 is a schematic view of the optical touch system 100 with a single touch. In FIGS. 1 and 2, identical reference numerals denote the same elements. As shown in FIG. 2, the image sensor 106 senses the pointer 102 along the sensing path 202, and the image sensor 108 senses the pointer 102 along the sensing path 204. Thus, as long as the processing circuit 110 can acquire a straight-line equation of the sensing path 202 according to the image sensed by the image sensor 106 and a straight-line equation of the sensing path 204 according to the image sensed by the image sensor 108, the processing circuit 110 can calculate the crossing point of the sensing paths 202 and 204 and thereby the coordinates of the pointer 102.
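The triangulation step described above can be sketched as follows. This is an illustrative sketch, not the patent's implementation; the sensor positions and path angles used are hypothetical, and each sensing path is modeled as a ray from a sensor's position at a known angle.

```python
import math

def crossing_point(p1, theta1, p2, theta2):
    """Intersect two sensing paths: rays from points p1 and p2 at angles
    theta1 and theta2 (radians). Returns (x, y), or None if parallel."""
    # Represent each sensing path as a line a*x + b*y = c through its sensor.
    a1, b1 = math.sin(theta1), -math.cos(theta1)
    a2, b2 = math.sin(theta2), -math.cos(theta2)
    c1 = a1 * p1[0] + b1 * p1[1]
    c2 = a2 * p2[0] + b2 * p2[1]
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        return None  # parallel sensing paths have no unique crossing point
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return (x, y)

# Two hypothetical sensors at corners of the touch surface; the pointer
# lies where the two sensing paths cross.
print(crossing_point((0.0, 0.0), math.pi / 4, (10.0, 0.0), 3 * math.pi / 4))
```

Solving the two line equations simultaneously, as above, is one standard way to obtain the crossing point once both straight-line equations are known.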
Before calculating the coordinates of the pointer 102, the processing circuit 110 needs to acquire, from the image sensed by the image sensor 106, an imaging range of the pointer 102 in an image sensing window of the image sensor 106 (detailed as follows). That is, the processing circuit 110 needs to acquire an object image of the pointer 102 from the image sensed by the image sensor 106 so as to acquire the straight-line equation of the sensing path 202. Likewise, the processing circuit 110 needs to acquire an imaging range of the pointer 102 in an image sensing window of the image sensor 108 from the image sensed by the image sensor 108; that is, it needs to acquire an object image of the pointer 102 from that image so as to acquire the straight-line equation of the sensing path 204. Further explanation is given as follows.
Take the operation of the processing circuit 110 and the image sensor 106 as an example. Before the pointer 102 approaches the touch surface 118, the processing circuit 110 senses the touch surface 118 through the image sensor 106 so as to obtain an image without any image of the pointer 102, and regards the obtained image as a background image. Afterwards, the processing circuit 110 acquires the brightness values of the N brightest pixels in each pixel column of the background image, where N is a natural number, and calculates an average brightness value or a total brightness value of those N brightest pixels in each pixel column to obtain a brightness distribution profile. Since the brightness of the background is usually non-homogeneous, the brightness distribution profile is presented as a curve segment. FIG. 3 is an exemplary brightness distribution profile acquired from the background image; each dot of the curve segment shown in FIG. 3 represents a column pixel brightness value of the background image.
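The per-column profile computation described above can be sketched as follows. This is a minimal sketch, not the patent's implementation; the image is assumed to be a list of rows of grayscale values, and the averaging variant (rather than the total-brightness variant) is shown.

```python
def brightness_profile(image, n):
    """Return, per pixel column, the mean of that column's n brightest pixels."""
    profile = []
    for col in range(len(image[0])):
        # Sort this column's pixels from brightest to darkest.
        column = sorted((row[col] for row in image), reverse=True)
        brightest = column[:n]  # the n brightest pixels of this column
        profile.append(sum(brightest) / len(brightest))
    return profile

# Tiny 3x2 "image": column 0 is dim, column 1 is bright.
print(brightness_profile([[10, 200], [20, 100], [30, 50]], 2))  # → [25.0, 150.0]
```

Using only the N brightest pixels per column makes the profile track the bright zone formed by the reflectors while ignoring darker pixels outside it.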
When the pointer 102 approaches the touch surface 118, the processing circuit 110 acquires an image containing an object image of the pointer 102 through the image sensor 106. FIG. 4 is a schematic view of an image sensed by the image sensor. In FIG. 4, label 400 represents an image sensing window of the image sensor 106. The white zone labeled 402 is a bright zone with higher brightness in the image, formed by the light reflected by the reflectors 114 and 116; the bright zone 402 is the main sensing area. Label 404 represents a dark stripe formed by the pointer 102. This dark stripe is the so-called object image of the pointer 102.
When the image containing an object image of the pointer 102 is acquired, the processing circuit 110 regards the acquired image as a sensed image, and acquires a brightness distribution profile of the sensed image in the same way as the brightness distribution profile described above. FIG. 5 shows another exemplary brightness distribution profile. In FIG. 5, the curve segment labeled 502 represents the brightness distribution profile acquired from the sensed image, and each dot of the curve segment represents a column pixel brightness value of the sensed image. The range labeled W1 is a range of low brightness formed by the light shading of the pointer 102. The curve labeled 504 represents a threshold value, which is acquired according to a predetermined percentage of the brightness distribution profile (as shown in FIG. 3) acquired from the background image.
Referring to FIG. 5, after the brightness distribution profile 502 is acquired, the processing circuit 110 compares the brightness distribution profile 502 with the threshold value 504 so as to determine the column pixel distribution range (labeled W1) over which the brightness values of the brightness distribution profile 502 are less than the threshold value 504, and regards this column pixel distribution range as an imaging range of the pointer 102 in the image sensing window 400 of the image sensor 106. In other words, the processing circuit 110 acquires the image information in the imaging range W1 and regards the acquired image information as an object image of the pointer 102. The processing circuit 110 can then acquire the straight-line equation of the sensing path 202 according to the imaging range W1; for example, it can calculate a center of gravity of the imaging range W1 and derive the straight-line equation of the sensing path 202 therefrom. Similarly, the operation of the processing circuit 110 with the image sensor 108 is the same as its operation with the image sensor 106, so the straight-line equation of the sensing path 204 can also be acquired.
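The thresholding and center-of-gravity steps described above can be sketched as follows. This is an illustrative sketch under the assumption of a one-dimensional column profile and a darkness-weighted center, not the patent's implementation.

```python
def imaging_ranges(profile, threshold):
    """Group consecutive below-threshold columns into (start, end) ranges."""
    ranges, start = [], None
    for i, value in enumerate(profile):
        if value < threshold:
            if start is None:
                start = i  # a below-threshold run begins here
        elif start is not None:
            ranges.append((start, i - 1))
            start = None
    if start is not None:
        ranges.append((start, len(profile) - 1))
    return ranges

def center_of_gravity(profile, threshold, rng):
    """Darkness-weighted center of one imaging range: columns further
    below the threshold pull the center toward themselves."""
    start, end = rng
    weights = [threshold - profile[i] for i in range(start, end + 1)]
    total = sum(weights)
    if total == 0:
        return (start + end) / 2.0  # degenerate case: unweighted midpoint
    return sum(i * w for i, w in zip(range(start, end + 1), weights)) / total

# Hypothetical profile: one dark dip (the pointer's shadow) around column 3.
profile = [100, 100, 40, 20, 40, 100]
rng = imaging_ranges(profile, 50)[0]             # → (2, 4)
print(rng, center_of_gravity(profile, 50, rng))  # center lands on column 3
```

The resulting center column, together with the sensor's known position and field of view, is what allows the straight-line equation of the sensing path to be written down.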
However, several problems arise when the optical touch system 100 is used in a multi-touch mode. Take the operation of the processing circuit 110 and the image sensor 106 as an example. When two pointers 102 approach the touch surface 118 and are adjacent to each other, the processing circuit 110 acquires a brightness distribution profile from the sensed image. FIG. 6 shows the acquired brightness distribution profile. In FIG. 6, the curve segment labeled 602 represents the brightness distribution profile acquired from the sensed image, and each dot of the curve segment represents a column pixel brightness value of the sensed image. The range labeled W2 is a range of low brightness formed by the light shading of the two pointers 102. The curve labeled 504 represents a threshold value, which is acquired according to a predetermined percentage of the brightness distribution profile acquired from the background image.
It can be learned from FIG. 6 that the processing circuit 110 may regard the two pointers 102 as a single pointer if the threshold value 504 is too high, because the single below-threshold range W2 then covers both pointers. Thus, the processing circuit 110 cannot calculate the coordinates of the two pointers 102 individually.
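The failure mode described above can be demonstrated with a small sketch (hypothetical numbers, not from the patent): two adjacent pointers produce two nearby dips in the brightness profile, and a threshold set too high spans both dips as a single below-threshold range, whereas a lower threshold separates them.

```python
def below_threshold_ranges(profile, threshold):
    """Group consecutive below-threshold columns into (start, end) ranges."""
    ranges, start = [], None
    for i, value in enumerate(profile):
        if value < threshold:
            if start is None:
                start = i
        elif start is not None:
            ranges.append((start, i - 1))
            start = None
    if start is not None:
        ranges.append((start, len(profile) - 1))
    return ranges

# Two adjacent pointers: two dips, with only a partial recovery at column 3.
profile = [100, 30, 20, 60, 20, 30, 100]
print(below_threshold_ranges(profile, 80))  # too high: one merged range (1, 5)
print(below_threshold_ranges(profile, 50))  # lower: two ranges (1, 2) and (4, 5)
```

A fixed percentage of the background profile therefore cannot distinguish adjacent pointers whenever the brightness between them does not recover above the threshold.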