Image sensors typically include an array of a plurality of pixels that sense a red color, a green color, and a blue color, in order to generate a color image. Alternatively, color image sensors may include an array of pixels that sense a cyan color, a yellow color, a green color, and a magenta color, instead of pixels that sense a red color, a green color, and a blue color. These pixels are all configured to have the same structure for sensing visible light, and separate and sense their corresponding colors by using only color filters. Accordingly, the plurality of pixels may include identical light sensing regions and identical driving circuits, having the same materials and the same structures, and red, green, and blue pixels may be distinguished from one another only by their color filters.
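The point that the pixels themselves are identical and differ only in their overlying filters can be sketched as follows. This is a hypothetical illustration assuming a standard RGGB Bayer-style mosaic, which is one common filter arrangement but is not specified in the text above:

```python
# Illustrative sketch (assumption: a standard RGGB Bayer mosaic).
# Every pixel is an identical light sensing element; the color a pixel
# reports is determined solely by the filter tile placed over it.

def bayer_color(row: int, col: int) -> str:
    """Return the filter color over pixel (row, col) in an RGGB mosaic."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# The underlying sensing region and driving circuit are the same at every
# position; only the filter assignment varies across the array.
mosaic = [[bayer_color(r, c) for c in range(4)] for r in range(2)]
for line in mosaic:
    print(" ".join(line))
```

Running the sketch prints the repeating filter pattern (`R G R G` over `G B G B`), underscoring that color selectivity comes from the filter layout rather than from any difference in the pixels themselves.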
Recently, image sensors additionally having various functions, such as night vision, thermal photography, and three-dimensional (3D) photography, have been in demand, and there have been attempts to integrate a pixel for sensing infrared light or ultraviolet light into an image sensor. However, because an existing pixel structure that senses visible light cannot readily sense infrared light or ultraviolet light merely by replacing the color filters, pixels having a structure different from the existing pixel structure are additionally arranged to sense infrared light or ultraviolet light.