Endoscopes are used in the medical field for imaging the inner wall of a body lumen of a patient (e.g., colon, ureter), as well as for providing the surgeon with a view of an internal region of the body during minimally invasive surgery (MIS), such as laparoscopy and brain surgery. A display can present the image of an object picked up by the endoscope, either as a two-dimensional image, or as a pair of right and left views of the object, in which case the surgeon can obtain a stereoscopic perception of the object by using stereoscopic spectacles. The stereoscopic perception adds depth to the image, as if the surgeon were viewing the object with the naked eye, thereby helping the surgeon perform a more successful operation.
Stereoscopic endoscopes are known in the art. Such endoscopes generally include an optical assembly at the tip thereof, to pick up light beams respective of a right view and a left view of the object. The endoscope further includes two image detectors, such as charge-coupled devices (CCDs), to detect a right image and a left image of the object, according to the light beams which the optical assembly projects on the respective CCD. The CCDs are connected to a processor which produces the right image and the left image according to the output of the CCDs, and directs a display to display the right image and the left image.
The small diameter of the endoscope restricts the size of the optical elements, such as the optical assembly, the objective, and the CCD, which are assembled within the endoscope, thereby limiting the resolution of the final image. Much effort has been expended to increase the resolution of the displayed image. One way to increase the resolution is to employ a CCD having a greater number of cells. To that end, the CCD is mounted at the proximal end of the endoscope, where ample room is available, and light is transmitted to the CCD from the tip by a relay lens system. Another avenue is the employment of two sets of CCDs, each set including three CCDs, one for each of the red, green, and blue colors.
U.S. Pat. No. 6,306,082 B1 issued to Takahashi et al., and entitled “Stereoendoscope wherein Images Having Passed Through Plural Incident Pupils are Transmitted by Common Relay Optical Systems”, is directed to an endoscope which employs a series of relay lenses to project two images of an object on a single image taking device, including a lenticular lens in front of the image taking device. The endoscope includes two objective optical systems, a relay lens system, an image taking device, a lenticular lens, a light source apparatus, a camera control unit (CCU), a scan converter, a color monitor, and shutter spectacles.
The two objective optical systems are identical and each of them is made of optical lenses of the same characteristics. The light source apparatus is connected to the distal end of the endoscope, by a light guide. The CCU is connected to the image taking device and to the scan converter. The scan converter is connected to the color monitor. The two objective optical systems are located at a distal end of the endoscope. The relay lens system is located between the two objective optical systems and the image taking device. The lenticular lens is located in front of the image taking device. The image taking device is located in a gripped section of the endoscope, at a proximal end of the endoscope.
The two objective optical systems form a right image and a left image of an object, at a parallax from each other, and transmit the right and left images to the relay lens system. The relay lens system multiple-transmits the right and left images to the image taking device. The lenticular lens forms the right and left images at intervals of one row or one line, on the image taking device. The CCU processes the signals received from the image taking device, the scan converter converts the signal from the CCU to a video signal, and the monitor displays the video signal. The shutter spectacles enable a user to view a stereoscopic image of the object.
U.S. Pat. No. 6,817,975 B1 issued to Farr et al., and entitled “Endoscope” is directed to an endoscope having an objective, a relay lens system, an ocular lens system and a camera. The objective is located at a distal end of the endoscope. The ocular lens system is located at a proximal end of the endoscope. The relay lens system is located between the objective and the ocular lens system. The camera is located behind the ocular lens system. The objective is constructed such that a first intermediate image of an object, falls within the glass portion of the most proximal portion of the objective lens, in close proximity to the distal end of the relay lens system.
U.S. Pat. No. 6,624,935 B2 issued to Weissman et al., and entitled “Single-Axis Stereoscopic Video Imaging System with Centering Capability”, is directed to a stereoscopic imaging system. The stereoscopic imaging system includes a single axis optical system, an electronic shutter, an aperture, and a single or multiple sensor imaging device. The single axis optical system is a video lens, photographic lens, microscope, telescope, or endoscope. The electronic shutter is a device which is electronically controlled to alternately block the transmission of light. The electronic shutter is a liquid crystal device or, alternatively, a polarization selector.
The aperture is located behind the single axis optical system. The electronic shutter is located between the single axis optical system and the aperture. The single or multiple sensor imaging device is located behind the aperture. The electronic shutter alternately blocks a right view and a left view of a target. The right view and the left view are presented to the single or multiple sensor imaging device for viewing the right view and the left view of the target stereoscopically.
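The alternating blocking of the right and left views described above amounts to time-multiplexing the two views onto one imaging device. The following sketch (illustrative only, and not part of any patent disclosure; the even/odd phase assignment is an assumption) shows how a frame stream captured behind such a shutter would be separated back into right and left sequences:

```python
def demultiplex_frames(frames):
    """Split an alternating frame stream into right and left sequences.

    Models a time-multiplexed stereo capture in which the electronic
    shutter passes the right view on even-indexed frames and the left
    view on odd-indexed frames (the actual phase depends on how the
    shutter is synchronized with the sensor readout).
    """
    right = frames[0::2]  # even-indexed frames: right view
    left = frames[1::2]   # odd-indexed frames: left view
    return right, left

frames = ["R0", "L0", "R1", "L1", "R2", "L2"]
right, left = demultiplex_frames(frames)
# right -> ["R0", "R1", "R2"], left -> ["L0", "L1", "L2"]
```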
U.S. Pat. No. 6,832,985 B2 issued to Irion et al., and entitled “Endoscopic System with Instrument Position and Orientation Display”, is directed to an endoscopic system for displaying information respective of the position and orientation of an instrument, as well as an image detected by an endoscope. The endoscopic system includes an endoscope, a video unit, a monitor, an assessment and control unit, and a position sensing device. The endoscope includes an endoscope objective and a relay lens system. The instrument serves to perform an operation such as diagnosis or therapeutic treatment. The assessment and control unit is a video processor unit.
The endoscope objective is located at a distal end of the endoscope. The video unit is located at a proximal end of the endoscope. The position sensing device is attached to the instrument. The relay lens system transmits an image detected by the endoscope objective, to the video unit. The endoscope objective is focused on an image plane of an object and is associated with a respective coverage field cone. The assessment and control unit is connected to the position sensing device and to the monitor. The position sensing device sends a signal respective of the orientation of the instrument relative to the coverage field cone, to the assessment and control unit. The monitor displays a symbol indicating the orientation of the instrument, in addition to an image detected by the endoscope.
Reference is now made to FIG. 1, which is a schematic illustration of an endoscope generally referenced 1, as known in the art. Endoscope 1 includes an elongated endoscopic housing 2, a right prism 4, a left prism 6, an aperture stop 8, an objective 10, a lenticular lens layer 12 and a light sensor array 14. Aperture stop 8 includes a right pupil 16 and a left pupil 18. Right prism 4 and left prism 6 are located in front of right pupil 16 and left pupil 18, respectively. Objective 10 is located behind aperture stop 8. Lenticular lens layer 12 is located between objective 10 and light sensor array 14. A processor 20 is connected with light sensor array 14 and with a display 22. Right prism 4, left prism 6, aperture stop 8, objective 10, lenticular lens layer 12 and light sensor array 14 are located at a distal end of elongated endoscopic housing 2.
Endoscope 1 is inserted into a body cavity 24 of a patient (not shown), in order to detect an image of an object 26. Object 26 is located in front of right prism 4 and left prism 6. Right prism 4 receives a light beam 28A respective of a right view of object 26. Left prism 6 receives a light beam 30A respective of a left view of object 26. Light beam 28A reflects within right prism 4, passes through right pupil 16 and objective 10, to strike a lenticular lens 32 of lenticular lens layer 12, as a light beam 28B. Light beam 30A reflects within left prism 6, passes through left pupil 18 and objective 10, to strike lenticular lens 32 as a light beam 30B. Lenticular lens 32 separates light beams 28B and 30B, and directs light beams 28B and 30B to adjacent cells 34R and 34L of light sensor array 14, respectively.
Processor 20 produces a video output respective of the right view and left view of object 26, according to an output of light sensor array 14, for display 22 to display a right image and a left image of object 26. A user can perceive a stereoscopic sensation of object 26, by viewing display 22 via a stereoscopic pair of spectacles (not shown).
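In the arrangement of FIG. 1, each lenticular lenslet directs the right-view and left-view beams to adjacent sensor cells, so the right and left images are spatially interleaved, column by column, on a single sensor. The sketch below (illustrative only; the even/odd column assignment is an assumption, not stated in the figure) shows how a processor such as processor 20 could separate the interleaved sensor data into two images:

```python
def split_lenticular_image(sensor):
    """Split a column-interleaved sensor image into right and left images.

    Models the lenticular arrangement of FIG. 1: each lenslet sends the
    right-view beam to one cell (34R) and the left-view beam to the
    adjacent cell (34L), so one image occupies even columns and the
    other odd columns (the assignment here is an illustrative choice).
    """
    right = [row[0::2] for row in sensor]  # even columns: right view
    left = [row[1::2] for row in sensor]   # odd columns: left view
    return right, left

sensor = [
    [1, 2, 3, 4],  # interleaved as R, L, R, L
    [5, 6, 7, 8],
]
right, left = split_lenticular_image(sensor)
# right -> [[1, 3], [5, 7]], left -> [[2, 4], [6, 8]]
```

Note that this spatial multiplexing halves the horizontal resolution available to each view, which relates to the resolution constraints discussed above.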
Reference is now made to FIG. 2, which is a schematic illustration of an endoscope generally referenced 40, as known in the art. Endoscope 40 includes an elongated endoscopic housing 42, a front aperture stop 44, an objective 46, an optical relay assembly array 48, a rear aperture stop 50, a front right lens 521R, a front left lens 521L, a rear right lens 522R, a rear left lens 522L, a right light sensor array 54R, and a left light sensor array 54L. Front aperture stop 44 includes a front right pupil 56R and a front left pupil 56L. Optical relay assembly array 48 includes a plurality of optical relay assemblies 581 and 582. Rear aperture stop 50 includes a rear right pupil 60R and a rear left pupil 60L.
An object 62 is located within a body cavity 64 of a patient (not shown). Front aperture stop 44 is located between object 62 and objective 46. Optical relay assembly array 48 is located between objective 46 and rear aperture stop 50. Front right lens 521R and front left lens 521L are located in front of rear right pupil 60R and rear left pupil 60L, respectively. Rear right lens 522R and rear left lens 522L are located behind rear right pupil 60R and rear left pupil 60L, respectively. Right light sensor array 54R and left light sensor array 54L are located behind rear right lens 522R and rear left lens 522L, respectively. Front aperture stop 44, objective 46, optical relay assembly array 48, rear aperture stop 50, front right lens 521R, front left lens 521L, rear right lens 522R, rear left lens 522L, right light sensor array 54R, and left light sensor array 54L are located within elongated endoscopic housing 42. A processor 66 is connected with right light sensor array 54R, left light sensor array 54L, and with a display 68.
Objective 46 receives light beams 70A and 72A, respective of a right view and a left view, respectively of object 62, through front right pupil 56R and front left pupil 56L, respectively. Objective 46 projects a right image 74R and a left image 74L of object 62, according to light beams 70B and 72B, respectively, on a front image plane (not shown) of optical relay assembly 481. There is a disparity δ between right image 74R and left image 74L. Optical relay assembly array 48 transmits light beams 70B and 72B there through, in a multiple manner, to project a right image 76R and a left image 76L of object 62, on a rear image plane (not shown) of optical relay assembly 482, according to light beams 70C and 72C, respectively. There is the same disparity δ between right image 76R and left image 76L.
Front right lens 521R transmits right image 76R to rear right lens 522R, through rear right pupil 60R. Front left lens 521L transmits left image 76L to rear left lens 522L, through rear left pupil 60L. Rear right lens 522R projects a light beam 70D respective of the right view of object 62, on right light sensor array 54R. Rear left lens 522L projects a light beam 72D respective of the left view of object 62, on left light sensor array 54L. Processor 66 produces a video output respective of the right view and left view of object 62, according to an output of right light sensor array 54R and left light sensor array 54L, for display 68 to display a right image and a left image of object 62. A user can perceive a stereoscopic sensation of object 62, by viewing display 68 via a stereoscopic pair of spectacles (not shown).
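The disparity δ between the right and left images, preserved through the relay assembly in FIG. 2, is what encodes depth. As an illustrative aside (this relation is the standard pinhole-stereo triangulation formula, not something quoted from the patent, and the numeric parameters below are hypothetical), depth can be recovered from disparity as follows:

```python
def depth_from_disparity(focal_length_mm, baseline_mm, disparity_mm):
    """Estimate object distance from stereo disparity.

    Standard pinhole-stereo relation Z = f * B / d, where B is the
    separation of the two pupils (e.g., pupils 56R and 56L) and d is
    the disparity between the right and left images.
    """
    if disparity_mm == 0:
        return float("inf")  # zero disparity: object at infinity
    return focal_length_mm * baseline_mm / disparity_mm

# Hypothetical values for a small endoscope optic:
z = depth_from_disparity(2.0, 4.0, 0.1)  # roughly 80 mm (2 * 4 / 0.1)
```

This is why the relay must transmit both images with the disparity intact: any relay-induced change in δ would distort the perceived depth.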
Reference is now made to FIGS. 3A, and 3B. FIG. 3A is a schematic illustration of an endoscope generally referenced 100, as known in the art. FIG. 3B is a schematic illustration of each of the right image detector assembly and the left image detector assembly, of the endoscope of FIG. 3A.
With reference to FIG. 3A, endoscope 100 includes an elongated endoscopic housing 102, a front aperture stop 104, an objective 106, an optical relay assembly array 108, a rear aperture stop 110, a front right lens 1121R, a front left lens 1121L, a rear right lens 1122R, a rear left lens 1122L, a right image detector assembly 114R, and a left image detector assembly 114L. Front aperture stop 104 includes a front right pupil 116R and a front left pupil 116L. Optical relay assembly array 108 includes a plurality of optical relay assemblies 1181 and 1182. Rear aperture stop 110 includes a rear right pupil 120R and a rear left pupil 120L.
An object 122 is located within a body cavity 124 of a patient (not shown). Front aperture stop 104 is located between object 122 and objective 106. Optical relay assembly array 108 is located between objective 106 and rear aperture stop 110. Front right lens 1121R and front left lens 1121L are located in front of rear right pupil 120R and rear left pupil 120L, respectively. Rear right lens 1122R and rear left lens 1122L are located behind rear right pupil 120R and rear left pupil 120L, respectively. Right image detector assembly 114R and left image detector assembly 114L are located behind rear right lens 1122R and rear left lens 1122L, respectively. Front aperture stop 104, objective 106, optical relay assembly array 108, rear aperture stop 110, front right lens 1121R, front left lens 1121L, rear right lens 1122R, rear left lens 1122L, right image detector assembly 114R and left image detector assembly 114L are located within elongated endoscopic housing 102. A processor 126 is connected with right image detector assembly 114R, left image detector assembly 114L, and with a display 128.
With reference to FIG. 3B, an image detector assembly 150, includes three prisms 152, 154, and 156, and three light sensor arrays 158R, 158G, and 158B. Light sensor array 158R detects an image (not shown) in a red range of wavelengths. Light sensor array 158G detects the image in a green range of wavelengths. Light sensor array 158B detects the image in a blue range of wavelengths. A first surface 160 of prism 154 makes contact with a surface 162 of prism 152. A second surface 164 of prism 154 makes contact with a surface 166 of prism 156. Light sensor array 158R is located behind a surface 168 of prism 152. Light sensor array 158G is located behind a surface 170 of prism 154. Light sensor array 158B is located behind a surface 172 of prism 156. Processor 126 is connected with light sensor arrays 158R, 158G, and 158B.
A portion of a light beam 174A reflects from surface 162 as a light beam 174B and strikes light sensor array 158R. Another portion of light beam 174A passes through prisms 152, 154, and 156, to strike light sensor array 158B. A further portion of light beam 174A passes through prism 152, and reflects from surface 164, as a light beam 174C, to strike light sensor array 158G. In this manner, light sensor arrays 158R, 158G, and 158B, detect an image (not shown) of an object (not shown), in a red, a green, and a blue range of wavelengths, respectively.
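The three monochrome sensor outputs of FIG. 3B must then be recombined into a single color image. The following sketch (illustrative only; plain nested lists stand in for sensor data, and per-pixel tuple assembly is an assumed processing model, not the patent's method) shows the combination a processor such as processor 126 would perform:

```python
def merge_rgb_planes(red, green, blue):
    """Combine three monochrome sensor outputs into one color image.

    Models the three-sensor assembly of FIG. 3B: arrays 158R, 158G, and
    158B each detect the same scene in one wavelength band, and the
    processor assembles a per-pixel (R, G, B) triple from the three
    co-registered planes.
    """
    return [
        [(r, g, b) for r, g, b in zip(row_r, row_g, row_b)]
        for row_r, row_g, row_b in zip(red, green, blue)
    ]

red = [[255, 0]]
green = [[0, 255]]
blue = [[0, 0]]
merge_rgb_planes(red, green, blue)
# -> [[(255, 0, 0), (0, 255, 0)]]
```

This per-band sensing is why the three-CCD arrangement improves color fidelity over a single mosaic sensor: each pixel location is measured in all three bands rather than interpolated.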
With reference back to FIG. 3A, right image detector assembly 114R receives a light beam 176 respective of a right view image 178R of object 122, via rear right lens 1122R, similar to the manner in which light sensor array 54R (FIG. 2) receives light beam 70D respective of right image 74R of object 62, via rear right lens 522R. Left image detector assembly 114L receives a light beam 180 respective of a left view image 178L of object 122, via rear left lens 1122L, similar to the manner in which light sensor array 54L receives light beam 72D respective of left image 74L of object 62, via rear left lens 522L.
Processor 126 produces a color video output respective of the right view and left view of object 122, according to an output of right image detector assembly 114R and left image detector assembly 114L, for display 128 to display a right color image and a left color image of object 122. A user can perceive a stereoscopic sensation of object 122, in color, by viewing display 128 via a stereoscopic pair of spectacles (not shown).