Devices for imaging body cavities or passages in vivo are known in the art and include endoscopes and autonomous encapsulated cameras. Endoscopes are flexible or rigid tubes that pass into the body through an orifice or surgical opening, typically into the esophagus via the mouth or into the colon via the rectum. An image is formed at the distal end using a lens and transmitted to the proximal end, outside the body, either by a lens-relay system or by a coherent fiber-optic bundle. A conceptually similar instrument might record an image electronically at the distal end, for example using a CCD or CMOS array, and transfer the image data as an electrical signal to the proximal end through a cable. Endoscopes allow a physician control over the field of view and are well-accepted diagnostic tools. However, they have a number of limitations: they present risks to the patient, are invasive and uncomfortable, and their cost restricts their application as routine health-screening tools.
An alternative in vivo image sensor that addresses many of these problems is the capsule endoscope. A camera is housed in a swallowable capsule, along with a radio transmitter for transmitting data, primarily comprising images recorded by the digital camera, to a base-station receiver or transceiver and data recorder outside the body. The capsule may also include a radio receiver for receiving instructions or other data from a base-station transmitter. Instead of radio-frequency transmission, lower-frequency electromagnetic signals may be used. Power may be supplied inductively from an external inductor to an internal inductor within the capsule or from a battery within the capsule.
U.S. Pat. Nos. 6,709,387 and 6,428,469 describe details of such a system. An autonomous capsule camera system with on-board data storage was disclosed in U.S. Pat. No. 7,983,458, issued on Jul. 19, 2011.
An optical imaging system and method for producing panoramic images exhibiting a substantial field of view for capsule camera applications are disclosed in U.S. Pat. No. 7,817,354, issued on Oct. 19, 2010. In one embodiment, this optical imaging system comprises a four-sided reflective pyramid and four folded centers of perspective at which the entrance pupils of individual image sensor arrays may be positioned. Each of the image sensor arrays so positioned has an optical axis associated therewith that is folded by the reflective facets of the pyramid. Each individual image sensor array positioned at a respective folded center of perspective has a horizontal field-of-view (HFOV) of at least 90°. Therefore, a composite HFOV constructed from the combined individual fields-of-view is 360°.
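The arithmetic behind the composite field of view can be sketched as follows. This is an illustrative calculation only; the function name and the treatment of coverage beyond a full circle are assumptions for illustration, not taken from the cited patent.

```python
# Hypothetical sketch: composite horizontal field of view (HFOV) assembled
# from four folded imagers, each covering at least 90 degrees.  Overlap
# beyond a full circle is simply capped at 360 degrees here.

def composite_hfov(segment_hfovs):
    """Return the total horizontal coverage, capped at a full 360-degree circle."""
    return min(sum(segment_hfovs), 360.0)

# Four imagers, each with a 90-degree HFOV, tile a full panorama.
print(composite_hfov([90.0, 90.0, 90.0, 90.0]))  # -> 360.0
```

With segments wider than 90°, adjacent fields overlap, which in practice aids image stitching at the seams.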
FIG. 1 illustrates an example of a swallowable capsule system 100 according to an embodiment as disclosed in U.S. Pat. No. 7,817,354. Capsule system 100 is entirely autonomous while inside the body, with all of its elements encapsulated in a capsule housing 110 that provides a moisture barrier, protecting the internal components from bodily fluids. Capsule housing 110 is transparent, so as to allow light from the light-emitting diodes (LEDs) of illuminating system 120 to pass through the wall of capsule housing 110 to the lumen walls, and to allow the scattered light from the lumen walls to be collected and imaged within the capsule. Capsule housing 110 also protects the lumen from direct contact with the foreign material inside capsule housing 110. Capsule housing 110 is provided with a shape that enables it to be swallowed easily and later to pass through the GI tract. Generally, capsule housing 110 is sterile, made of non-toxic material, and is sufficiently smooth to minimize the chance of lodging within the lumen.
As shown in FIG. 1, capsule system 100 includes illuminating system 120 and a camera that includes optical system (140A and 140B) and image sensor 160. The optical system comprises multiple lens sets. While only two lens sets (140A and 140B) are shown in FIG. 1, more lens sets (e.g., 4) may be used. The image sensor 160 comprises multiple sensor arrays, matching the number of lens sets, in order to capture the images projected by the multiple lens sets. An image captured by image sensor 160 may be processed by processing sub-system and battery 180, which provides all the needed processing and controls (e.g., image processing, compression, storage, etc.).
Illuminating system 120 may be implemented by LEDs. In FIG. 1, the LEDs are located adjacent the camera's aperture, although other configurations are possible. The light source may also be provided, for example, behind the aperture. Other light sources, such as laser diodes, may also be used. Alternatively, white light sources or a combination of two or more narrow-wavelength-band sources may be used. White LEDs are available that may include a blue LED or a violet LED, along with phosphorescent materials that are excited by the LED light to emit light at longer wavelengths. The portion of capsule housing 110 that allows light to pass through may be made from bio-compatible glass or polymer.
The optical imaging system disclosed in U.S. Pat. No. 7,817,354 is capable of providing a combined field of view of 360°. U.S. Pat. Nos. 9,001,187 and 8,717,413 disclose another optical system that is also capable of providing a combined field of view of 360°, with lenses of negative refractive power positioned before the first surface of each prism. FIG. 2 illustrates a perspective view of a partially assembled panoramic camera system employing four folded imagers according to U.S. Pat. No. 8,717,413. In particular, shown positioned within lens module housing 210 are spindle 220, four prisms 250, and four lens elements 240 of negative refractive power. Shown in this figure are notches 255 formed on a front face of each prism 250, and mating tabs 845 formed on the back (prism side) of each lens 240. Such notches and tabs provide a secure, positive alignment between the lenses 240 and the prisms 250.
In U.S. Pat. No. 9,118,850, a camera system with multiple pixel arrays on a chip is disclosed. FIG. 3 shows an exemplary layout 300 for an integrated sensing component with multiple pixel arrays according to that patent, supporting the four images corresponding to the four optical paths. The multiple pixel arrays and associated timing/control circuits and common readout chain are implemented on a common substrate 350, such as a semiconductor material. The integrated multiple pixel-array image sensing component comprises separate pixel arrays 311, 312, 313 and 314. The pixel arrays are configured so that each pixel array is located and oriented properly to capture a corresponding image formed by a lens sub-system. Besides pixel arrays 311, 312, 313 and 314, other components may also be formed on substrate 350. For example, timing and control block 320, one or more readout chains (e.g., readout chain 330) for reading out electrical output signals from the pixel arrays, and I/O ring structures 340 can also be formed on the same substrate 350. The readout chain 330 processes the output signals from the pixel arrays before sending out the electrical signal through I/O ring structure 340. In FIG. 3, the center of the four sensor arrays is indicated by a black dot 360.
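The layout described above can be modeled in software as a set of arrays, each with a position and orientation relative to the substrate center, served by one shared readout chain. This is a minimal sketch under stated assumptions: the offsets, rotations, and readout ordering below are illustrative, not taken from the patent.

```python
# Illustrative model of four pixel arrays sharing one substrate: each array
# carries an offset from the substrate center and a rotation (in 90-degree
# steps) so that it faces its lens sub-system.  Values are assumptions.

from dataclasses import dataclass

@dataclass
class PixelArray:
    label: int             # reference numeral, e.g., 311-314
    offset: tuple          # (x, y) offset of the array center from the substrate center
    rotation_deg: int      # orientation of the array relative to the substrate

# Four arrays placed symmetrically around the substrate center,
# one per optical path of the panoramic lens system.
arrays = [
    PixelArray(311, (0.0, +1.0), 0),
    PixelArray(312, (+1.0, 0.0), 90),
    PixelArray(313, (0.0, -1.0), 180),
    PixelArray(314, (-1.0, 0.0), 270),
]

# A single shared readout chain visits each array in turn.
def readout_order(arrays):
    """Return the labels of the arrays in the order the readout chain serves them."""
    return [a.label for a in sorted(arrays, key=lambda a: a.rotation_deg)]

print(readout_order(arrays))  # -> [311, 312, 313, 314]
```

The point of the shared-substrate arrangement is that one timing/control block and one readout chain serve all four arrays, rather than duplicating that circuitry per sensor.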
When coupled with a matched optical system, the multiple pixel arrays can capture scenes in a very wide field of view or even a 360° panoramic view. The camera system uses one or more image sensor IC chips each having multiple pixel arrays on the same semiconductor substrate (i.e., “multiple pixel arrays on a chip”). Additional electronic components for further signal processing of the captured images are also disclosed.
In a conventional camera system with a single lens set, the single lens set and the sensor(s) have to be aligned in order to optimize the captured image quality. The conventional camera system with a single lens set has only one field of view. The system may use multiple sensors corresponding to multiple color components. In such cases, dichroic beam splitters are often used to direct light from the field of view to individual sensors. For a capsule camera with multiple lens sets coupled to multiple sensor arrays, the alignment between the multiple lens sets and the multiple sensor arrays becomes very critical. With the increasing resolution of the sensor arrays, the alignment becomes even more critical. It is desirable to develop techniques for reliably aligning the multiple lens sets and the multiple sensor arrays.
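Why higher resolution tightens the alignment requirement can be shown with a short calculation: the same physical misalignment between a lens axis and a sensor array costs more pixels of image shift as the pixel pitch shrinks. The numbers below are illustrative assumptions, not specifications from any cited patent.

```python
# Hedged sketch: express a physical lateral misalignment (in micrometers)
# as an image shift in pixels for a given pixel pitch.  As resolution rises
# (pitch shrinks), the same mechanical offset costs more pixels.

def misalignment_in_pixels(offset_um, pixel_pitch_um):
    """Physical offset (micrometers) expressed as a shift in pixels."""
    return offset_um / pixel_pitch_um

# A 5-micrometer assembly offset shifts the image by 1 pixel on a coarse
# 5-um-pitch array, but by 4 pixels on a fine 1.25-um-pitch array.
print(misalignment_in_pixels(5.0, 5.0))   # -> 1.0
print(misalignment_in_pixels(5.0, 1.25))  # -> 4.0
```

The same reasoning applies per lens set in a multi-array capsule camera, which is why alignment must be controlled for each of the four lens/sensor pairs rather than once for the whole module.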