In order to improve the efficiency with which an image sensing element takes in light rays incident on it (hereinafter "light-gathering efficiency"), an image sensing element adopting a structure like that described in Japanese Patent Application Laid-Open No. 10-229180 has been proposed. In that structure, as shown in FIGS. 14A–14C here, the surface of an image sensing element 25 is provided with a first microlens 21 and a second microlens 22, with FIG. 14B showing a pixel near the optical axis of an image sensing lens (in a central area 25c in FIG. 14A) and FIG. 14C showing a pixel at a position removed from the optical axis (in a periphery area 25a in FIG. 14A). The first microlens 21 is an in-layer lens consisting of a first lenticular film 12 provided atop a color filter film 11, with the top portion of the first lenticular film 12 formed into a convex shape. The second microlens 22 is formed by shaping a lenticular film 13 into a lens. An incident light ray L concentrated by the second microlens 22 strikes the first microlens 21 (the in-layer lens), where it undergoes a further convergence effect and is directed to a photoelectric converter. At this point, as the formation position of a given pixel approaches the periphery area 25a of the chip on which the image sensing element is set, the first and second microlenses 21, 22 are offset toward the center by distances t1, t2, respectively, so as to avoid degrading light-gathering efficiency at the periphery.
Moreover, a structure has been proposed that efficiently concentrates light striking a photo-sensing portion at an angle by offsetting, at the peripheral portion of a solid-state image sensing element, the optical axes of the microlenses (on-chip lenses) formed atop each photo-sensing portion toward the center of the element relative to the optical axes of the photo-sensing portions (for example, Japanese Patent No. 2600250).
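The decentering described above can be understood geometrically: a chief ray for a pixel at radial distance r from the optical axis arrives at an angle whose tangent is r divided by the exit pupil distance, and over the height of the layer stack between microlens and photo-sensing portion it walks sideways by that tangent times the stack height. The sketch below computes the corresponding lateral shift; the pixel position, pupil distance, and stack height values are illustrative assumptions, not values from the cited references.

```python
import math

def microlens_offset(pixel_radius_um, exit_pupil_mm, stack_height_um):
    """Lateral shift (toward the sensor center) that re-centers the chief
    ray on the photo-sensing portion.  For a pixel at radial distance r
    from the optical axis, tan(theta) = r / d_pupil; over the stack height
    h between microlens and photo-sensing portion, the ray walks sideways
    by h * tan(theta).  Result is in micrometres."""
    tan_theta = (pixel_radius_um * 1e-6) / (exit_pupil_mm * 1e-3)
    return stack_height_um * tan_theta

# Pixel 3 mm off-axis, exit pupil 50 mm away, 4 um lens-to-photodiode stack
# (all assumed values):
shift = microlens_offset(3000.0, 50.0, 4.0)  # ~0.24 um toward the center
```

Because the required shift grows with distance from the optical axis, the offsets t1, t2 in FIG. 14C increase toward the periphery of the chip.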
Moreover, in recent years, with the solid-state image sensing element used in such image sensing apparatuses as digital still cameras and the like, attempts have been made to improve image quality by increasing the number of pixels while at the same time reducing costs by reducing the size of the chip. Consequently, the size of a single pixel constituting a solid-state image sensing element has continued to shrink year by year, as has the surface area of its photo-sensing portion.
However, because photosensitivity declines as the surface area of the photo-sensing portion decreases, a variety of structures have been proposed in order to direct light rays striking the image sensing element to the photoelectric converter efficiently. As one example, there is the structure based on an embodiment described in Japanese Patent Application Laid-Open No. 06-224398, shown in FIG. 15. Reference numeral 65 designates a resin cap layer, formed of a material with a refractive index of approximately 1.6, and 64 designates a low refractive index layer, formed in a space filled with a resin of a refractive index lower than that of the cap layer 65, with air, or with an inert gas such as nitrogen. After a flattening layer 66 is provided atop these two layers and the surface is flattened, a microlens 62 is formed. The foregoing arrangement takes advantage of the fact that a light ray traveling from the cap layer 65 toward the low refractive index layer 64 is totally reflected at the interface between the two layers when its angle of incidence exceeds the critical angle, thus totally reflecting slanting incident light 67 to a photoelectric converter 63.
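The total-reflection condition exploited here follows from Snell's law: rays meeting the interface at angles beyond the critical angle, arcsin(n_low/n_high), are totally reflected. A minimal sketch follows; the cap-layer index of 1.6 comes from the reference above, while the low-index value of 1.0 (an air-filled layer) is an assumption for illustration.

```python
import math

def critical_angle_deg(n_high, n_low):
    """Critical angle for total internal reflection when light travels
    from a medium of index n_high toward one of index n_low (n_low < n_high).
    Rays hitting the interface beyond this angle are totally reflected."""
    return math.degrees(math.asin(n_low / n_high))

# Cap layer (n ~ 1.6, per the reference) against an air-filled
# low refractive index layer (n ~ 1.0, assumed):
theta_c = critical_angle_deg(1.6, 1.0)  # ~38.7 degrees
```

Slanting light such as ray 67 in FIG. 15 therefore only reaches the photoelectric converter 63 by total reflection when it meets the interface at more than roughly 39 degrees from the normal, under these assumed indices.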
Moreover, an image sensing element described in Japanese Patent Application Laid-Open No. 5-235313 is provided with a light wave path between a light incidence layer and the photo-sensing portion. This image sensing element improves light-gathering efficiency by providing a symmetrically shaped light wave path composed of a high refractive index material on the light incidence side of the photo-sensing portion and a low refractive index material around its periphery, and totally reflecting the incident light at the interface between the two.
FIG. 16A is a cross-sectional view of a pixel arranged on the periphery of a solid-state image sensing element 30 such as would be obtained by combining the image sensing element described in Japanese Patent Application Laid-Open No. 5-235313 with Japanese Patent No. 2600250, showing light rays striking the image sensing element from an image sensing lens (not shown). In FIG. 16A, reference numeral 31 designates a microlens for concentrating the incident light onto a photo-sensing portion 33, disposed at a position decentered toward the optical axis of the image sensing lens. A light wave path 36 composed of material of a high refractive index is formed on the light incidence side of the photo-sensing portion 33, and the incident light refracted by the microlens 31 is totally reflected at the interface between the light wave path 36 and an interlayer insulation portion 35 made of low refractive index material, and directed to the photo-sensing portion 33.
However, the conventional examples described above have the following problem, illustrated in FIGS. 17 and 18. FIGS. 17A and 17B show a state in which a variable focal length image sensing lens array 202 is installed on an image sensing element 201, with FIG. 17A showing a wide-angle state and FIG. 17B showing a telephoto state. The image sensing element 201 here is of a type in wide conventional use, in which the microlens is provided on the image sensing element surface, and it is assumed that no in-layer lens is present.
When the image sensing lens array changes from a wide-angle position to a telephoto position, the position of the pupil of the image sensing lens array 202 changes from T1 to T2. Viewed from the image sensing element 201, the rays of light passing through the lens and striking the image sensing element 201 can all be regarded as emanating from the pupil position. The relation by which light rays originating at the pupil of the lens reach the image sensing element is shown schematically in FIG. 18. Consider a pixel positioned at the lower left of an image sensing area that takes in an image of a subject: light rays passing through the image sensing lens array 202 from the wide-angle pupil position and striking the photoelectric converter 3 form, on arrival, an image 77 like that shown in FIG. 19. FIG. 19 is a plan view of a single pixel, with the optical axis center of the image sensing lens positioned diagonally to the upper right of the diagram. By disposing the microlens provided on the image sensing element at a slight offset toward the optical axis of the image sensing lens, a pupil image appears atop the photoelectric converter 73 provided on a silicon wafer 71. Essentially, the pupil image 77 is formed at a location other than a Poly-Si wiring layer 78, which is present in order to switch the electrical charges generated at the photoelectric converter 73, because the Poly-Si wiring layer 78 passes some light rays but absorbs many others.
By contrast, when zooming to the telephoto side, the image 77 moves diagonally across the pixel as shown in FIG. 20. For a pixel positioned at the bottom of an image sensing area 203 shown in FIG. 18, if the angle formed between the lens optical axis and the light rays emanating from the center of the pupil and passing near the central axis of the microlens (the main light rays) is θ1 at the wide-angle side and θ2 at the telephoto side, then the incident angle difference θ=θ1−θ2 generally tends to increase as the image sensing lens zoom ratio increases. In addition, the image sensing lens pupil diameter is determined by the brightness (F number) of the lens, and increases as the lens becomes brighter. Accordingly, when the zoom ratio is increased, the position of the image of the pupil formed at the photoelectric converter 73 becomes markedly separated between the wide-angle side and the telephoto side (that is, the incident angle difference increases), either overshooting the photoelectric converter or being eclipsed by an AL wiring layer 79. Moreover, as the lens becomes brighter a larger image is formed at the photoelectric converter, further decreasing the number of light rays striking the photoelectric converter 73 and leading to a drastic decline in the light-gathering efficiency with which light rays entering the pixel are gathered by the photoelectric converter.
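The dependence of the incident angle difference on the pupil movement from T1 to T2 can be sketched numerically: for a pixel at image height r, the main light ray arrives at an angle whose tangent is r divided by the pupil distance. The pupil distances and image height below are illustrative assumptions, not values from the figures.

```python
import math

def chief_ray_angle_deg(image_height_mm, pupil_distance_mm):
    """Angle between the lens optical axis and the main light ray for a
    pixel at the given image height, with the pupil at the given distance
    from the image sensing element."""
    return math.degrees(math.atan2(image_height_mm, pupil_distance_mm))

# Pixel 3 mm from the optical axis; assumed pupil positions:
theta1 = chief_ray_angle_deg(3.0, 20.0)  # wide-angle pupil T1 (close)
theta2 = chief_ray_angle_deg(3.0, 60.0)  # telephoto pupil T2 (farther)
delta = theta1 - theta2                  # incident angle difference ~5.7 deg
```

Under these assumed values the main light ray swings through several degrees between the two zoom states, which is why a single fixed microlens offset cannot keep the pupil image centered on the photoelectric converter at both ends of the zoom range.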
In the case of an optical instrument in which the pupil position does not change, that is, in which the focal distance is fixed (a single focus lens) and the lens cannot be changed, the above-described problem can be solved by offsetting the microlens by an amount unique to that lens, so that the center of the pupil image lies at the center of the photoelectric converter. However, in the case of an optical instrument in which the pupil position changes, such as a zoom lens or an interchangeable lens system, for the reasons described above simply offsetting the microlens is not enough to solve the problem. The same problem arises with a structure having an in-layer lens (that is, the first microlens 21) like that shown in FIG. 14B, for example. In this case, the problem is that some of the light rays concentrated by the microlens do not strike the in-layer lens, or, if they do strike the in-layer lens, they are not directed to the photoelectric converter, and consequently that portion of the light is lost.
Moreover, in the case of a structure like that shown in FIG. 15, because the interface between the high refractive index cap layer 65 and the low refractive index layer 64 forms an arc of small radius R, as the angle of incidence increases, light rays striking the arc pass through without being totally reflected and instead enter another pixel.
In addition, even with a camera in which a single focus lens is integrated into the camera body, if the image sensing lens is made smaller in order to reduce the size of the camera, the distance between the solid-state image sensing element and the image sensing lens exit pupil decreases, further increasing the angle at which light strikes the solid-state image sensing element. Consequently, with a solid-state image sensing element having the conventional symmetrical light wave path of the structure shown in FIG. 16A, as shown in the ray trace diagram of FIG. 16B, beams of light that do not satisfy the total reflection condition arise at the refractive index interface, and the incident light cannot be concentrated efficiently.
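Whether a given ray still satisfies the total reflection condition at the side wall of a symmetrical light wave path can be checked with the same Snell's-law relations. The simplified sketch below ignores the microlens, treats the side wall as vertical, and uses assumed refractive indices (a high-index path of 1.6 in an insulator of 1.45); none of these values come from the cited references.

```python
import math

def totally_reflected_at_wall(incidence_deg, n_path, n_insulator, n_outside=1.0):
    """Check total internal reflection at the vertical side wall of a
    straight (symmetrical) light wave path.  A ray entering the path from
    outside at incidence_deg refracts to angle_in by Snell's law; it then
    meets the vertical wall at (90 - angle_in) degrees from the wall
    normal, which must exceed the critical angle asin(n_insulator/n_path)."""
    angle_in = math.degrees(
        math.asin(n_outside * math.sin(math.radians(incidence_deg)) / n_path))
    wall_angle = 90.0 - angle_in
    critical = math.degrees(math.asin(n_insulator / n_path))
    return wall_angle > critical

# Assumed indices: path n = 1.6, interlayer insulation n = 1.45.
ok_steep = totally_reflected_at_wall(10.0, 1.6, 1.45)    # near-normal ray: reflected
ok_oblique = totally_reflected_at_wall(50.0, 1.6, 1.45)  # strongly oblique ray: leaks
```

Under these assumed indices, the near-normal ray is totally reflected while the strongly oblique ray falls below the critical angle at the wall and escapes into the insulation, which is the failure mode FIG. 16B depicts as the exit pupil moves closer.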
Thus, as described above, the defect of the conventional art is that, because the angle of the light striking the solid-state image sensing element differs depending on the position of each pixel constituting the solid-state image sensing element, the sensitivity varies according to pixel position when light wave paths of the same shape are formed for all pixels.