The present invention generally relates to parallel image processing apparatuses, and more particularly to a parallel image processing apparatus which detects edge information and is suited for use in an artificial visual device of a robot and the like.
When recognizing a pattern of graphic information, it is necessary to recognize the features of the pattern. When making such a feature recognition, it is useful to elucidate the feature extracting mechanism of a living body, so as to design a parallel image processing apparatus which is modeled on the nerve cells of a visual nervous system of the living body. In other words, this parallel image processing apparatus uses the visual system of the living body as a model, and is provided with a light receiving layer and a threshold element layer. The light receiving layer includes a large number of light receiving elements for detecting an input image, and the light receiving elements are arranged two-dimensionally in a matrix arrangement. The threshold element layer includes a large number of threshold elements for receiving outputs of the light receiving elements, and the threshold elements are arranged two-dimensionally in a matrix arrangement. Each threshold element generates an output equal to a function of a weighted sum of the received inputs. This parallel image processing apparatus outputs edge information which corresponds to the input image. A parallel image processing apparatus of this type is proposed in Japanese Published Patent Application No. 50-34901.
Next, a description will be given of the conventional parallel image processing apparatus, and of the structure of the threshold element which has the edge detection function in particular. The threshold element receives the outputs of a plurality of light receiving elements. For the sake of convenience, it is assumed that a photoelectric conversion cell such as a photodiode is used for the light receiving element and an edge detection cell is used for the threshold element.
FIG. 1 shows a photoelectric conversion cell layer 1 as one example of the light receiving layer. The photoelectric conversion cell layer 1 includes a large number of photoelectric conversion cells (light receiving elements) 2 which are arranged two-dimensionally in a matrix arrangement on an image formation plane of an image pickup lens 3 which is provided to pick up an input image. Hence, an image corresponding to the input image is formed on the photoelectric conversion cell layer 1, and each photoelectric conversion cell 2 outputs an electrical signal which is dependent on a light intensity at a two-dimensional coordinate on the photoelectric conversion cell layer 1. The output signals of the photoelectric conversion cells 2 can be treated independently.
FIG. 2 shows an edge detection cell layer (threshold element layer) 4 which is used in combination with the photoelectric conversion cell layer 1. The edge detection cell layer 4 includes a large number of edge detection cells (threshold elements) 5 which are arranged two-dimensionally in a matrix arrangement. Each edge detection cell 5 receives the output signals of a plurality of photoelectric conversion cells 2 and outputs a signal which is equal to a function of a weighted sum of the received signals.
For example, as shown in FIG. 3 on an enlarged scale, each edge detection cell 5 is arranged to receive the output signals of nine photoelectric conversion cells 2 which are arranged in a 3.times.3 matrix and constitute a unit receptive region 6. The nine photoelectric conversion cells 2 constituting the unit receptive region 6 are labeled as cells PD.sub.11, PD.sub.12, PD.sub.13, PD.sub.21, PD.sub.22, PD.sub.23, PD.sub.31, PD.sub.32 and PD.sub.33. As indicated by hatching in FIGS. 2, 3, 4 and 5A, the cell PD.sub.22 is a center cell which is located at the center of the unit receptive region 6, and the remaining cells PD.sub.11 through PD.sub.21 and PD.sub.23 through PD.sub.33 are peripheral cells which are located at the periphery of the unit receptive region 6. The unit receptive region 6 overlaps an adjacent unit receptive region 6.
The center cell supplies a positive potential to a corresponding edge detection cell when the center cell receives light. On the other hand, the peripheral cell supplies a negative potential to a corresponding edge detection cell when the peripheral cell receives light. For this reason, when a signal processing system for one line is considered, the output signals of the peripheral cells PD.sub.21 and PD.sub.23 are added in an adder 7 as shown in FIG. 4 and an output signal of the adder 7 is inverted by an inverter 8. An output signal of the inverter 8 is supplied to a corresponding edge detection cell 5 together with a direct signal from the center cell PD.sub.22. An edge detection cell (threshold cell) having such an input characteristic is referred to as a threshold element having an ON centered receptive field. FIG. 5A shows the nine cells PD.sub.11 through PD.sub.33 which are arranged in the 3.times.3 matrix, and FIG. 5B shows the processing according to the ON centered receptive field system when the unit receptive region 6 is constituted by the nine cells PD.sub.11 through PD.sub.33.
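The adder-and-inverter combination described above can be sketched in software. The following is a minimal illustrative sketch, not part of the apparatus itself (the function name `on_center_input` is assumed): the peripheral cell outputs are summed and inverted, and the result is combined with the direct signal of the center cell, as with the adder 7 and the inverter 8 of FIG. 4.

```python
def on_center_input(center, peripherals):
    """Combine the center cell output with its eight peripheral outputs.

    The peripheral signals are summed (adder 7) and inverted (inverter 8);
    the center signal is supplied directly to the edge detection cell.
    """
    inverted = -sum(peripherals)   # adder followed by inverter
    return center + inverted       # direct center signal plus inverted sum

# Uniform illumination of the 3x3 unit receptive region: without weighting,
# the eight peripheral cells dominate the single center cell.
print(on_center_input(1.0, [1.0] * 8))   # -7.0
```

This imbalance is why the weighting coefficients, described next, are needed to balance the single center signal against the eight peripheral signals.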
A description will be given of the processing for the case where the unit receptive region 6 is constituted by the 3.times.3 matrix arrangement of the cells, by referring to formulas. First, when the output signals of the cells PD.sub.11 through PD.sub.33 are respectively denoted by U.sub.11.sup.O through U.sub.33.sup.O and the weighting coefficients with respect to the output signals U.sub.11.sup.O through U.sub.33.sup.O at the time of the input are respectively denoted by C.sub.11 through C.sub.33, an input IN.sub.ij.sup.1 to the edge detection cell 5 can be described as follows. EQU IN.sub.ij.sup.1 =U.sub.11.sup.O C.sub.11 +U.sub.12.sup.O C.sub.12 +. . . +U.sub.22.sup.O C.sub.22 +. . . +U.sub.32.sup.O C.sub.32 +U.sub.33.sup.O C.sub.33
With regard to the weighting coefficients C.sub.11 through C.sub.33, the following relationships stand because the unit receptive region 6 is ON centered, where Ch and Ce satisfy a relationship .vertline.8Ch.vertline.=.vertline.Ce.vertline.. EQU C.sub.22 =Ce EQU C.sub.11 =C.sub.12 =C.sub.13 =C.sub.21 =C.sub.23 =C.sub.31 =C.sub.32 =C.sub.33 =-Ch As a result, the output signal U.sub.ij.sup.1 can be described by the following function. EQU U.sub.ij.sup.1 =.vertline.(1+e)/(1+h)-1.vertline.=.vertline.(e-h)/(1+h).vertline.
In the above function, e denotes the weighted output of the center cell and h denotes the weighted sum of the outputs of the eight peripheral cells, that is, e and h are defined as follows. EQU e=Ce.multidot.U.sub.22.sup.O EQU h=Ch.multidot.(U.sub.11.sup.O +U.sub.12.sup.O +U.sub.13.sup.O +U.sub.21.sup.O +U.sub.23.sup.O +U.sub.31.sup.O +U.sub.32.sup.O +U.sub.33.sup.O)
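As a numerical illustration, the output function can be evaluated for a 3.times.3 patch of cell outputs. This is a sketch under stated assumptions, namely that e equals Ce times the center output, h equals Ch times the sum of the eight peripheral outputs, and Ce=8Ch so that .vertline.8Ch.vertline.=.vertline.Ce.vertline. holds; the function name `edge_cell_output` and the value of Ch are illustrative only.

```python
def edge_cell_output(patch, ch=0.1):
    """Return U_ij^1 = |(e - h) / (1 + h)| for a 3x3 patch of outputs U^O.

    Assumed: e = Ce * U_22 (weighted center output) and h = Ch * (sum of
    the eight peripheral outputs), with Ce = 8 * Ch so |8Ch| = |Ce|.
    """
    ce = 8 * ch
    center = patch[1][1]
    peripheral = sum(sum(row) for row in patch) - center
    e = ce * center
    h = ch * peripheral
    return abs((e - h) / (1 + h))

uniform = [[1.0] * 3 for _ in range(3)]     # no edge: the 1:8 ratio holds
print(edge_cell_output(uniform))            # 0.0
shaded = [[0.0, 1.0, 1.0]] * 3              # leftmost column dark: an edge
print(round(edge_cell_output(shaded), 6))   # 0.2
```

The first case confirms that a uniformly illuminated receptive region yields e=h and hence a zero output, while an edge within the region yields a nonzero output.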
When an edge of an image exists within the unit receptive region 6 which is constituted by the 3.times.3 matrix arrangement of the cells and a ratio of the light quantity U.sub.22.sup.O which is received by the center cell PD.sub.22 to the light quantities U.sub.11.sup.O through U.sub.21.sup.O and U.sub.23.sup.O through U.sub.33.sup.O received by the respective peripheral cells PD.sub.11 through PD.sub.21 and PD.sub.23 through PD.sub.33 is not 1:8, the edge detection cell 5 which corresponds to the unit receptive region 6 outputs the signal U.sub.ij.sup.1. Hence, the edge detection cell 5 has the edge detection capability.
Next, a more detailed description will be given with reference to FIGS. 6A through 6C. One edge detection cell 5 is coupled to the nine photoelectric conversion cells PD.sub.11 through PD.sub.33 which are arranged in the 3.times.3 matrix and correspond to the unit receptive region 6. The edge detection capability of the edge detection cell 5 can be evaluated from the output thereof when a shield plate 9 which corresponds to the image moves in a direction x from left to right in FIG. 6A and the shielded area of the unit receptive region 6 is successively increased. When the movement of the shield plate 9 is described by the coordinate of the shielding boundary and the changes in the value e-h (relative value) and the value .vertline.e-h.vertline. (relative value) which are involved in the output signal U.sub.ij.sup.1 are plotted, FIG. 6B is obtained for the value e-h and FIG. 6C is obtained for the value .vertline.e-h.vertline.. First, the shield plate 9 begins to shield the unit receptive region 6, and the value .vertline.e-h.vertline. increases proportionally to the shielded area until the cells PD.sub.11, PD.sub.21 and PD.sub.31 of the leftmost column are completely shielded. Furthermore, as the shielding progresses and the cells PD.sub.12, PD.sub.22 and PD.sub.32 of the central column are shielded, the value of e, that is, the decrease in the output signal of the center cell PD.sub.22, becomes dominant. The value of .vertline.e-h.vertline. becomes 0 (zero) when exactly 1/2 of the central column made up of the cells PD.sub.12, PD.sub.22 and PD.sub.32 is shielded. When the central column made up of the cells PD.sub.12, PD.sub.22 and PD.sub.32 is completely shielded, the value of .vertline.e-h.vertline. is determined by the output values of the remaining cells PD.sub.13, PD.sub.23 and PD.sub.33.
As the shielding progresses further and the cells PD.sub.13, PD.sub.23 and PD.sub.33 of the rightmost column are shielded, the value of .vertline.e-h.vertline. decreases proportionally to the shielding area.
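The sweep of the shield plate 9 can be simulated to reproduce the behavior of FIGS. 6B and 6C. The following is a sketch under assumed conventions: each cell is modeled as a unit-width square, the shield boundary lies at coordinate s measured from the left of the region, and e and h are the weighted center and peripheral signals with Ce=8Ch; the function names are illustrative.

```python
def cell_light(col, s):
    """Unshielded fraction of a unit-width cell in column col (0, 1 or 2)
    when the shield boundary has advanced to coordinate s from the left."""
    return min(max(col + 1.0 - s, 0.0), 1.0)

def e_minus_h(s, ch=0.1):
    """Value e - h of one unit receptive region for shield position s."""
    ce = 8 * ch
    cols = [cell_light(c, s) for c in range(3)]
    center = cols[1]                                 # cell PD22
    peripheral = 3 * (cols[0] + cols[2]) + 2 * cols[1]
    return ce * center - ch * peripheral

# |e - h| rises while the leftmost column is shielded, returns to zero when
# exactly half of the central column is shielded, and decays again while the
# rightmost column is shielded, as in FIGS. 6B and 6C.
print(round(e_minus_h(0.0), 6))        # 0.0 (nothing shielded)
print(round(abs(e_minus_h(1.0)), 6))   # 0.3 (leftmost column fully shielded)
print(round(e_minus_h(1.5), 6))        # 0.0 (half of central column shielded)
```

The zero at s=1.5 shows the blind spot numerically: while the shielding boundary lies at the center of the receptive region, the output vanishes even though an edge is present.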
Therefore, the conventional edge detection method uses the ON centered receptive field in which the center element out of the plurality of light receiving elements supplies a positive signal to a corresponding threshold element upon receipt of light and the peripheral elements out of the plurality of light receiving elements supply negative signals to the corresponding threshold element upon receipt of light. The signal from the center element and the signals from the peripheral elements are weighted, and an absolute value of a difference between the weighted signals is taken as an edge output. However, when the edge passes the center of the receptive field (receptive region), the ratios of the light receiving areas to the shielded areas become the same for the central portion and the peripheral portion of the receptive field and it becomes impossible to detect the edge, as may be seen from FIGS. 6A through 6C.
FIG. 7 shows the conventional parallel image processing apparatus proposed in the Japanese Published Patent Application No. 50-34901 in more detail. In FIG. 7, those parts which are the same as those corresponding parts in FIGS. 1 through 3 are designated by the same reference numerals.
A parallel image processing apparatus 10 shown in FIG. 7 has an imaging lens 12 which corresponds to a crystalline lens of an eyeball. The photoelectric conversion cell layer 1 is arranged on an optical axis of the imaging lens 12 and corresponds to the visual cells of a retina. The photoelectric conversion cell layer 1 includes the photoelectric conversion cells 2 which are arranged two-dimensionally in the matrix arrangement. The edge detection cell layer 4 includes the edge detection cells 5 which are arranged two-dimensionally in the matrix arrangement. Each edge detection cell 5 has a non-linear characteristic and is coupled to a predetermined number of photoelectric conversion cells 2. A line segment direction detection cell layer 17 is coupled to the edge detection cell layer 4. The line segment direction detection cell layer 17 includes a plurality of line segment direction detection cells 18 which are arranged two-dimensionally for each detection direction. For example, the line segment direction detection cell 18 is coupled to a predetermined number of edge detection cells 5 and detects a certain inclination. Such two-dimensional layers are successively coupled to make up a three-dimensional structure.
The edge detection cells 5 of the edge detection cell layer 4 are coupled to the photoelectric conversion cells 2 of the photoelectric conversion cell layer 1 while the line segment direction detection cells 18 of the line segment direction detection cell layer 17 are coupled to the edge detection cells 5 of the edge detection cell layer 4, and the receptive regions of the layers 1, 4 and 17 overlap.
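The cascade of two-dimensional layers with overlapping receptive regions can be sketched as a stride-1 scan of 3.times.3 regions. The helper names (`edge_cell`, `layer_response`) and the weighting value are illustrative assumptions, not part of the proposed apparatus; a line segment direction detection layer would be stacked on the edge detection layer in the same overlapping manner.

```python
def edge_cell(patch, ch=0.1):
    """ON centered response |e - h| of one 3x3 receptive region
    (assumed weights: 8*ch for the center cell, ch for each peripheral)."""
    center = patch[1][1]
    peripheral = sum(sum(row) for row in patch) - center
    return abs(8 * ch * center - ch * peripheral)

def layer_response(image, cell):
    """Apply one cell function to every overlapping 3x3 region (stride 1),
    as each edge detection cell 5 is coupled to the cells 2 below it."""
    h, w = len(image), len(image[0])
    return [[cell([row[x:x + 3] for row in image[y:y + 3]])
             for x in range(w - 2)] for y in range(h - 2)]

# A vertical step edge: only the cells whose receptive regions straddle the
# edge respond; cells in uniformly lit or uniformly dark areas output zero.
image = [[0.0, 0.0, 1.0, 1.0, 1.0] for _ in range(5)]
for row in layer_response(image, edge_cell):
    print([round(v, 6) for v in row])   # each row: [0.3, 0.3, 0.0]
```

Because neighbouring receptive regions overlap (stride 1), a 5.times.5 input layer yields a 3.times.3 edge detection layer here, matching the overlap of unit receptive regions described for FIG. 3.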
According to the parallel image processing apparatus 10, the imaging lens 12 images a projection image (not shown) of a reading image 19 on the photoelectric conversion cell layer 1. The edge detection cell layer 4 detects the contrast of the projection image based on the output values of the photoelectric conversion cells 2.
As described above, the output signal U.sub.ij.sup.1 of the edge detection cell 5 is large when the projection image covers the center photoelectric conversion cell 2 and is small when the projection image covers the peripheral photoelectric conversion cells 2. The output signal U.sub.ij.sup.1 of the edge detection cell 5 in the receptive region which is not covered by the projection image or is completely covered by the projection image is zero. FIG. 8 is a diagram showing a relationship between the photoelectric conversion cell layer 1 and the edge detection cell layer 4 together with an output characteristic of the edge detection cells 5 of the edge detection cell layer 4.
The output signals U.sub.ij.sup.1 of the edge detection cells 5 are supplied to the line segment direction detection cell layer 17, and each line segment direction detection cell 18 corresponding to a line segment of the projection image detects the inclination of the line segment which is located at a position (x, y) on the photoelectric conversion cell layer 1.
However, according to the parallel image processing apparatus 10, the weighting coefficients are set for the plurality of photoelectric conversion cells which make up one receptive region, so as to obtain the ON centered detection characteristic. For this reason, a weighting circuit comprising a resistor, an inverter and the like is required for each photoelectric conversion cell, and there is a problem in that a large number of circuit elements must be provided with respect to one receptive region. As a result, the circuit structure of the photoelectric conversion cell layer 1 and the edge detection cell layer 4 becomes extremely complex, and the productivity of the parallel image processing apparatus 10 is poor.