Various types of product packages, such as cigarette packages or pharmaceutical containers, are fabricated in a multi-stage process at an assembly plant. After the raw material web has been cut into an appropriate shape, the preformed packages are transported along a conveyor belt before folding and bonding commence. At this stage, it is necessary to inspect the preformed packages. The precise contour of the preformed package is determined, in order to ensure that it is properly aligned on the conveyor belt in the correct orientation. Once the preformed package is properly aligned, it can be compared with an optimal package template, to verify that the packages match within acceptable tolerance limits. In this manner, the presence of any errors or irregularities (e.g., extra fragments or missing sections) in the preformed package can be detected.
Current inspection systems include visual imaging elements operative for examining each individual preformed package. However, processing the images to determine the overall contour of the package is very time-consuming. Edge detection is a fundamental problem in the field of image processing, and refers to the ability to find, sufficiently accurately and quickly, the transition point between the object and the background. The time required to determine the contour of the preformed package establishes a limit on the processing speed of this stage of the fabrication process, thereby reducing the total output. In general, there is a tradeoff between processing speed and orientation accuracy.
U.S. Pat. No. 5,917,602 to Bonewitz et al, entitled “System and method for image acquisition for inspection of articles on a moving conveyor”, is directed to an image acquisition system and method for inspecting a container on a moving conveyor. The conveyor transports the container from a container molding apparatus (e.g., an individual section machine), where the container is formed to the desired shape. The container continues along the conveyor until it reaches an inspection station. The image acquisition system inside the inspection station includes a line scan camera, a lighting assembly, a speed monitor (i.e., a rotary encoder), and electronic controls. The lighting assembly is positioned across the conveyor opposite the camera, defining an imaging area between them.
As the container enters the imaging area, the lighting assembly illuminates the container, while the camera generates line images of the side wall of the container. The electronic control, coupled with the inspection station, includes a computer and a monitor. The computer extracts variations in shading using visual imaging techniques, to detect production defects, contamination and damage (e.g., blisters, improper annealing, embedded foreign objects and variations in glass density) in the container. The image analysis includes edge detection routines known in the art (e.g., Sobel or Prewitt algorithms), which analyze gray level changes in defined window regions for detecting the profile of the container. The computer may perform diagnostic operations to determine the cause of any detected defects, and to prescribe corrective actions to prevent further defects from occurring in subsequently produced containers. A feedback signal may be sent to the container molding apparatus, to correct the problem or to stop the molding process in order to allow further diagnosis. The rotary encoder determines the speed of the container as a function of the conveyor motion, and generates a feedback signal respective of the speed of the container. The electronic control determines the required operating speed of the camera based on this signal.
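Gray-level edge detection routines of the kind referenced above, such as the Sobel operator, can be sketched as follows. This is a minimal illustration only; the representation of the image as nested lists of intensity values, and the function name, are assumptions for the example and are not taken from the patent:

```python
# Standard 3x3 Sobel convolution kernels for horizontal and
# vertical gray-level gradients.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_magnitude(image):
    """Return a gradient-magnitude map for a grayscale image
    (list of rows of intensities); border pixels are left at 0."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Convolve the 3x3 neighborhood with each kernel
            gx = sum(SOBEL_X[j][i] * image[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(SOBEL_Y[j][i] * image[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            # Gradient magnitude; large values mark edges
            out[y][x] = (gx * gx + gy * gy) ** 0.5
    return out
```

A sharp gray-level transition, such as the profile of a container against the background, produces a pronounced magnitude peak at the transition columns, which is what the window-based analysis thresholds to locate the edge.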
U.S. Pat. No. 5,991,041 to Woodworth, entitled “Method and apparatus for measuring dimensions of objects on a conveyor”, is directed to a system and method for measuring the length, width and height of an object, such as a carton, as it is transported on a conveyor. The system includes a pair of laser light sources, a pair of charge-coupled device (CCD) cameras, a digital computer, a light curtain, and a pulse tachometer. The laser light sources are disposed on opposite ends of a U-shaped frame which traverses the width of the conveyor. The cameras are mounted on opposite ends of the top bar of the frame. The computer is mounted below one of the laser light sources. The cameras are positioned such that the center of their respective coverage areas is the center of the conveyor. The tachometer is disposed on a cross member situated below the top surface of the conveyor. The light curtain includes a beam array emitter and a beam array receiver, located on either side of the conveyor. The beam array emitter includes a plurality of photo-transmitters spaced apart at fixed distances, whereas the beam array receiver includes a plurality of photo-receivers, configured to receive light from a respective photo-transmitter.
The light sources and cameras can be considered laser triangulation rangefinders. Each light source shines light toward the side of the object, along a path perpendicular to the direction of travel of the object, and at a height slightly above the surface of the conveyor. The associated CCD camera detects the light reflected from the object, and determines the distance of the object from an edge of the conveyor. The tachometer counts wheel revolutions to measure the linear distance traveled by the conveyor, and thus the distance traveled by the object. The light curtain measures the highest point of the object, based on which emitter-receiver pairs are blocked by the object as it passes through the light curtain. The computer receives data from the laser triangulation rangefinders (i.e., two side profiles of the object) and data from the light curtain (i.e., a top profile of the object). The computer determines the four edges of the object from the side profiles, and then calculates the length and width of the object. The computer calculates the height of the object from the top profile. The calculations assume that the object is a rectangular solid.
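The dimension calculation under the rectangular-solid assumption can be sketched as follows. This is a hypothetical Python illustration; the function name, parameter names, and the representation of the profiles as per-sample lists are assumptions for the example, not details taken from the patent:

```python
def carton_dimensions(left_profile, right_profile, blocked_heights,
                      conveyor_width, distance_per_sample):
    """Estimate (length, width, height) of a rectangular carton.

    left_profile / right_profile: per-sample clearances between each
    conveyor edge and the near side of the object, one sample per
    tachometer increment while the object is in view.
    blocked_heights: per-sample height of the highest interrupted
    light-curtain beam.
    """
    # Length: number of samples in which the object was detected,
    # times the conveyor travel per sample (from the tachometer).
    length = len(left_profile) * distance_per_sample
    # Width: conveyor width minus the two side clearances, averaged
    # over the samples (valid for a rectangular solid).
    widths = [conveyor_width - l - r
              for l, r in zip(left_profile, right_profile)]
    width = sum(widths) / len(widths)
    # Height: highest light-curtain beam interrupted by the object.
    height = max(blocked_heights)
    return length, width, height
```

For example, four samples with clearances of 10 and 20 units on a 100-unit-wide conveyor, at 5 units of travel per sample, yield a length of 20 and a width of 70 units.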
U.S. Pat. No. 6,191,850 to Chiang, entitled “System and method for inspecting an object using structured illumination”, is directed to a machine vision system for inspecting a surface for defects, such as during the manufacturing process of “smart cards”. The system includes an illumination assembly, a camera, and a computer with a display. The illumination assembly includes a fiber optic cable, and a projecting element. The fiber optic cable transmits light from a remote light source toward the projecting element along a first optical axis. The projecting element includes a pair of diffusers and a beam splitter. A grid pattern is formed on the second diffuser. The grid pattern is generally a matrix of crossing horizontal and vertical lines having uniform spacing and thickness, but may include any repeating or intersecting pattern. The light from the fiber optic cable passes through each of the diffusers to the beam splitter along the first optical axis. The beam splitter directs the light toward the surface of the object to be inspected, along a second optical axis perpendicular to the first optical axis, thereby projecting a grid image on the object surface. The light from the object surface is reflected back toward the projecting element, and then directed toward the camera via the beam splitter, along an optical axis perpendicular to the optical plane of the camera. The camera transmits data to the display for viewing, via a video processor or frame grabber.
The computer implements an analysis of the projected grid pattern to determine underlying defects on the object surface. A region of the surface to project the grid is established, and the object or system elements may be moved accordingly. The grid is located by means of a fiducial having a different appearance than the grid features. Specific features of the grid are then identified and scored, using pattern recognition. The feature scores are compared with tolerance limits, and if any feature score falls outside the tolerance limits, the surface is rejected as defective. Otherwise, the grid features are ranked by row and column, to generate a matrix of feature locations. An ideal grid is constructed based on the ranking, and the locations of the actual grid features are compared with the locations of the ideal grid features. The deviations of the actual grid features from the ideal grid features are compared with tolerance limits, and the surface is rejected as defective if the tolerance limits are exceeded. Otherwise, the surface is accepted.
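The comparison of the ranked grid features against an ideal grid can be sketched as follows. This is a hypothetical illustration; the function name, the feature representation as (row, column, x, y) tuples, the uniform grid pitch, and the single tolerance value are all assumptions made for the example:

```python
def inspect_grid(features, pitch, tol):
    """Accept or reject a surface by comparing located grid features
    against an ideal grid of the given pitch.

    features: list of (row, col, x, y) tuples, where (row, col) is
    the feature's rank in the grid and (x, y) its measured location.
    Returns True (accept) or False (reject as defective).
    """
    for row, col, x, y in features:
        # Ideal location of this feature on a uniformly spaced grid
        ideal_x, ideal_y = col * pitch, row * pitch
        # Reject if the deviation exceeds the tolerance on either axis
        if abs(x - ideal_x) > tol or abs(y - ideal_y) > tol:
            return False
    return True
```

A distorted grid line, as would be produced by a warped or defective region of the surface, shifts the measured feature locations away from their ideal positions and causes the surface to be rejected.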
U.S. Pat. No. 6,348,696 to Alt et al, entitled “Method and device for detecting the position of the edge of a moving material web”, is directed to a device and method for detecting the edge of a material web. The device includes a light source, a sensor unit made up of a plurality of sensors, and a microcontroller. Each sensor is a photodiode located in the bore of a plastic panel. The sensors are distributed along the panel equidistantly and transversely to the direction of movement of the material web. The light source (i.e., a fluorescent tube) emits light toward the material web, which absorbs or reflects part of the light rays, depending on the position of the edge. The light rays pass through an optical element, which absorbs all the light rays except those that extend perpendicular to the material web. The light rays then reach the sensor unit. Each sensor is coupled to an analog multiplexer and a current/voltage converter, which converts the photo-current into a proportional voltage. The output is coupled in turn to an A/D converter, which generates a digital value respective of the voltage. These values are received by the microcontroller, and stored in a storage unit.
Each sensor generates a signal based on the extent to which that sensor is covered by the material web. Accordingly, the inner sensors, which are completely covered by the material web, generate an idle signal, whereas the outer sensors, which are not covered by the material web at all, generate a maximum signal. The sensors located in the area of the edge of the material web generate signals in between the idle signal and the maximum signal. The function of the sensor signals therefore includes two constant ranges (i.e., maximum and idle), with a transition range in between. The edge of the material web is determined from the turning point of the sensor-signal function. The microcontroller determines the turning point by twice differentiating the signal function and locating a zero crossing of the second derivative, or alternatively, by numerical differentiation and a numerical search for the maximum of the first derivative. As a further alternative, a fit function is approximated to the sensor signals, and the turning point is calculated from the fit parameters.
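The alternative of numerically differentiating the sensor signals and searching for the maximum of the first derivative can be sketched as follows. This is a minimal illustration; the function name and the use of central differences as the discretization are assumptions for the example:

```python
def edge_index(signals):
    """Locate the edge as the turning point of the sensor-signal
    function: the sample at which the numerical first derivative is
    greatest, i.e. the steepest point of the transition between the
    idle range and the maximum range."""
    # Central-difference first derivative at each interior sample
    deriv = [(signals[i + 1] - signals[i - 1]) / 2.0
             for i in range(1, len(signals) - 1)]
    # Index of the maximum slope, shifted back to signal coordinates
    return 1 + max(range(len(deriv)), key=lambda i: deriv[i])
```

For a signal that is idle over the covered sensors, rises through the transition range, and saturates over the uncovered sensors, the returned index falls on the steepest sample of the transition, which corresponds to the turning point and hence to the edge position.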
U.S. Pat. No. 6,373,520 to Cadieux, Jr. et al, entitled “System and method for visually inspecting a cigarette packaging process”, is directed to an inspection system and method for detecting and removing non-conforming cigarette packages during the fabrication process. The system includes an inspection station and an ejection station. The inspection station includes a first inspection device and a second inspection device. At a packaging machine, the cigarette packages are packed in soft or hard pack wrappers. The cigarette packages travel in single file along a conveyor path from the packaging machine to the inspection station. The packages pass through each of the inspection devices on different conveyor belts, and then toward the ejection station. The first inspection device may be a foil detection device, which determines the presence or absence of a foil wrapper on the package. The foil detection device includes a plurality of sensors (i.e., photosensor cells) which detect the presence of foil on four sides of the package. The second inspection device may be a vision inspection system, which captures images of at least one surface of the package. The vision inspection system includes three cameras, a photosensor, a light source and a controller. Each camera is oriented to view at least one different surface of the package. The photosensor detects the arrival of the package and signals the controller, which activates the light source (e.g., a group of fiberoptic bundles). The package enters a reflector housing, on which the fiberoptic bundles may be mounted. The light source provides a flash of light to illuminate the package when it is at the desired viewing position. The light is dispersed by baffles, and reflected off a flat white coating on the interior curved surface of the reflector housing. The package is thereby illuminated with a consistent and even distribution of diffused light. The cameras capture images of the respective package surfaces.
The images are processed by vision inspection software, which identifies non-conforming features in the packages. The controller receives signals to identify the non-conforming packages, and instructs the ejection station to remove the non-conforming packages from the conveyor path. The ejection station includes two ejection mechanisms, such that packages exhibiting a first non-conforming feature (i.e., detected by the first inspection device) are deflected to a first location, and packages exhibiting a second non-conforming feature (i.e., detected by the second inspection device) are deflected to a second location. Each ejection mechanism includes an air jet, which is oriented to deflect the non-conforming package into a container. The controller activates a high speed air valve to operate the air jet. The ejection station may further include photocells to detect the arrival of a package and to confirm the ejection of the package through a timed sequence programming operation.