1. Field of the Invention
The present invention relates to an optical touch device and a related locating method, and more particularly, to an optical touch device with improved coordinate locating accuracy, a related image detecting component and a related locating method.
2. Description of the Prior Art
Nowadays, touch operation has become one of the essential functions of consumer electronic devices, and a touch device is an important component for achieving touch operation. Generally, familiar types of touch devices include, for example, a resistive touch device, a capacitive touch device and an optical touch device. Electronic devices can be equipped with various touch devices in accordance with various demands.
Please refer to FIG. 1. FIG. 1 is a structural diagram of an optical touch device in the prior art. The conventional optical touch device 100 includes a light guide module 110, a light source module 120 and an image detecting module 130. The light guide module 110 includes three light reflecting bars 112a, 112b and 112c arranged along three sides of a rectangular track. The light reflecting bar 112a faces toward the light reflecting bar 112c, and the light reflecting bar 112b is connected between the light reflecting bar 112a and the light reflecting bar 112c. The area inside the rectangular track defines a sensory area 114. The light source module 120 includes two light emitting components 122a and 122b. The light emitting component 122a is disposed on an end of the light reflecting bar 112a opposite to the light reflecting bar 112b, and the light emitting component 122b is disposed on an end of the light reflecting bar 112c opposite to the light reflecting bar 112b. The light source module 120 is configured to emit light toward the three light reflecting bars 112a, 112b and 112c, and the three light reflecting bars 112a, 112b and 112c reflect the light from the light source module 120 to irradiate the sensory area 114. The image detecting module 130 includes two image detecting components 132a and 132b. The image detecting component 132a is disposed on the end of the light reflecting bar 112a opposite to the light reflecting bar 112b, and the image detecting component 132b is disposed on the end of the light reflecting bar 112c opposite to the light reflecting bar 112b. Each of the two image detecting components 132a and 132b includes a plurality of pixels 135 arranged along a straight direction. The pixels 135 detect an object (such as a touch point) located inside the sensory area 114, and a position (coordinates) of the object can be calculated according to the detected information.
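The position calculation mentioned above is typically a triangulation from the viewing angles reported by the two image detecting components. The following is a minimal illustrative sketch, assuming a simplified geometry in which the two detectors sit at the two corners of one side of the rectangular sensory area and each reports the angle of the sight line to the object; the function name and coordinate convention are assumptions for illustration, not the patent's actual method.

```python
import math

def locate(theta_a, theta_b, width):
    """Intersect the two sight lines to locate the touch point.

    theta_a: angle (radians) at the left corner (0, 0), measured from the
             line joining the two detectors.
    theta_b: angle (radians) at the right corner (width, 0), measured the
             same way.
    Returns the (x, y) coordinates of the intersection point.
    """
    # Sight line from the left corner:  y = x * tan(theta_a)
    # Sight line from the right corner: y = (width - x) * tan(theta_b)
    ta, tb = math.tan(theta_a), math.tan(theta_b)
    x = width * tb / (ta + tb)
    y = x * ta
    return x, y
```

For example, if both detectors report a 45-degree angle across a side of width 10, the two sight lines intersect at the point (5, 5).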
A field of view of the image detecting component 132a covers the light reflecting bars 112b and 112c. That is to say, the pixels 135 of the image detecting component 132a detect the light reflecting bars 112b and 112c. When the object is located inside the sensory area 114 and a darkness point formed by the object is located at the light reflecting bar 112b, the light reflecting bar 112c, or a connecting portion of the light reflecting bar 112b and the light reflecting bar 112c, the darkness point can be detected by a part of the pixels 135 of the image detecting component 132a. Similarly, a field of view of the image detecting component 132b covers the light reflecting bars 112a and 112b. That is, the pixels 135 of the image detecting component 132b detect the light reflecting bars 112a and 112b. When the object is located inside the sensory area 114 and the darkness point formed by the object is located at the light reflecting bar 112a, the light reflecting bar 112b, or a connecting portion of the light reflecting bar 112a and the light reflecting bar 112b, the darkness point can be detected by a part of the pixels 135 of the image detecting component 132b.
Generally, the conventional optical touch device 100 utilizes a medium center calculating method or a gravity center calculating method to calculate the imaging position of the darkness point formed by the object, so as to determine the position of the object. However, the positions of the darkness points formed by objects inside the sensory area 114 cannot always be calculated accurately by the medium center calculating method or the gravity center calculating method. For example, in the case of utilizing the medium center calculating method to calculate the imaging position of the darkness point, the sensory area 114 of the conventional optical touch device 100 has an insensitive area 114a; when the object is located inside the insensitive area 114a, the position of the darkness point calculated by the medium center calculating method contains an error due to a large offset angle of the light. Furthermore, when the imaging position of the darkness point is calculated by the gravity center calculating method, the calculated position of the darkness point contains an error when the darkness point formed by the object is located at the connecting portion of two adjacent light reflecting bars.
Please refer to FIG. 2. FIG. 2 is a diagram of utilizing the medium center calculating method to calculate the darkness point formed by the object inside the insensitive area shown in FIG. 1. The image detecting component 132b is described as an example. When the medium center calculating method is utilized to calculate a position of the darkness point A1 formed by the object A located inside the insensitive area 114a, the n-th pixel 135n to the r-th pixel 135r of the image detecting component 132b detect the darkness point A1 imaged on the light reflecting bar 112a by the object A. The center position of the darkness point A1 calculated by the medium center calculating method is equal to (n+r)/2, which means the center of the darkness point A1 corresponds to the (n+r)/2-th pixel 135m. However, a straight line L passing through the center of the object A and the center of the darkness point A1 actually intersects the image detecting component 132b at the pixel 135m′, so the correct center of the darkness point A1 should correspond to the pixel 135m′ rather than the pixel 135m. Similarly, the image detecting component 132a has the same drawback. Therefore, when the position of the darkness point formed by the object inside the insensitive area 114a is calculated by the medium center calculating method, the calculated position of the darkness point includes an error.
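The medium center calculating method described above can be sketched as follows. This is a minimal illustration, assuming the darkness point is identified as the run of pixels whose detected brightness drops below the background by more than some threshold; the function names and the threshold parameter are assumptions for illustration.

```python
def darkness_span(bg, img, threshold):
    """Return the first and last pixel indices of the darkness point, i.e.
    the pixels whose detected brightness img[w] falls below the background
    brightness bg[w] by more than threshold; None if no pixel qualifies."""
    dark = [w for w in range(len(bg)) if bg[w] - img[w] > threshold]
    return (dark[0], dark[-1]) if dark else None

def medium_center(n, r):
    """Medium center of a darkness point spanning the n-th to the r-th
    pixel: simply the midpoint (n + r) / 2, as described above."""
    return (n + r) / 2
```

As the text explains, this midpoint corresponds to the pixel 135m; for an object inside the insensitive area 114a the true sight line lands on a different pixel 135m′, which is precisely the source of the error.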
Please refer to FIG. 3. FIG. 3 is a diagram of utilizing the gravity center calculating method to calculate the darkness point formed by the object inside the sensory area shown in FIG. 1. The image detecting component 132a is described as an example of utilizing the gravity center calculating method to calculate the darkness point B1 formed by the object B inside the sensory area 114. The x-th pixel 135x to the y-th pixel 135y of the image detecting component 132a can detect the darkness point B1 formed by the object B. A calculating formula of the gravity center calculating method is as follows:
  Cg = [ Σ_{w=x}^{y} (bg[w] − img[w]) × w ] / [ Σ_{w=x}^{y} (bg[w] − img[w]) ]
In this formula, w represents the w-th pixel, bg[w] represents a background brightness of the w-th pixel, img[w] represents an image brightness of the image detected by the w-th pixel, and Cg represents the calculated gravity center position of the darkness point B1 formed by the object B. As shown in FIG. 3, when the darkness point B1 formed by the object B is located at the connecting portion of the light reflecting bar 112b and the light reflecting bar 112c, the background brightness and the image brightness detected by the pixels 135 of the image detecting component 132a are inaccurate because the conventional optical touch device 100 has weak irradiation at the connecting portion of the light reflecting bar 112b and the light reflecting bar 112c. Therefore, the calculated position of the darkness point B1 is different from the actual position of the darkness point B1.
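The gravity center formula above can be sketched directly in code. This is a minimal illustration of the formula as stated, with the brightness drop bg[w] − img[w] serving as the weight of each pixel; the function name is an assumption for illustration.

```python
def gravity_center(bg, img, x, y):
    """Gravity center Cg of a darkness point detected by pixels x..y,
    per the formula above: each pixel index w is weighted by its
    brightness drop bg[w] - img[w]."""
    num = sum((bg[w] - img[w]) * w for w in range(x, y + 1))
    den = sum(bg[w] - img[w] for w in range(x, y + 1))
    return num / den
```

For a symmetric darkness point the result coincides with the medium center; the inaccuracy described above arises when weak irradiation near the connecting portion of two light reflecting bars distorts bg[w] and img[w], skewing the weights and hence Cg.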
Besides, the sensory area 114 can be divided into a plurality of sensory subareas, and a boundary of the sensory subareas preferably is not disposed on the connecting portion of the adjacent light reflecting bars 112b and 112c (for example, the boundary is located at the light reflecting bar 112b or the light reflecting bar 112c). When the darkness point B1 formed by the object B is imaged on the boundary of the subareas, the calculated imaging position of the darkness point B1 has an apparent error no matter which algorithm (the gravity center calculating method or the medium center calculating method) is utilized by the image detecting component 132a to calculate the touch position of the object B. Therefore, the conventional locating method cannot accurately calculate the position of the darkness point formed by an object located inside every part of the sensory area 114. Thus, the conventional optical touch device cannot determine the position of the object accurately.