Aircraft manufacturing processes have relied on mechanical fixtures to hold the workpieces being assembled and on mechanics to align the tools performing manufacturing operations on those workpieces. Increasingly, robots carrying tools are being used to perform manufacturing functions that previously required such manual aligning operations. However, the accuracy of a robotic work operation depends on knowing the placement of the workpiece relative to the robot and its associated tool or tools.
A tool may be aligned by a robot operator using a tool-mounted camera to locate a particular feature, such as a hole or fiducial mark. Customarily, the camera is very slowly positioned close to the workpiece using numerically controlled program commands aided by manual intervention, in order to accurately register a small feature against a cluttered background. However, the robotic arm on which the camera is mounted must be prevented from inadvertently contacting the workpiece, at the risk of damaging any or all of the camera, the robotic arm, and the workpiece. This close-proximity placement may involve the use of mechanical feelers or optical sensors, as well as time-consuming visual inspection by the operator. When enough features have been semi-autonomously identified to derive the workpiece coordinate system in three dimensions of rotation and three of translation, the workpiece can be registered to the coordinate system of the robot, and the operator can begin a fully autonomous robotic assembly operation, such as cutting, drilling, fastening, or welding. The semi-autonomous alignment operations described above are labor intensive and can add tens of minutes or more to the manufacturing operations cycle.
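The registration step described above amounts to solving for a rigid transform (rotation and translation) that maps feature locations known in the workpiece's frame to the same features as measured in the robot's frame. A minimal sketch of that computation is shown below using the standard Kabsch/SVD least-squares method; the function name, argument conventions, and use of matched point pairs are illustrative assumptions, not details from the source.

```python
# Illustrative sketch (not the source's method): least-squares rigid
# registration of a workpiece frame to a robot frame from matched
# feature points, via the Kabsch/SVD algorithm.
import numpy as np

def register_workpiece(robot_pts, workpiece_pts):
    """Return (R, t) such that robot_pts[i] ~= R @ workpiece_pts[i] + t.

    Both inputs are (N, 3) arrays of the same N features: their nominal
    positions in the workpiece frame and their measured positions in the
    robot frame (e.g., hole centers located by a tool-mounted camera).
    """
    P = np.asarray(workpiece_pts, dtype=float)   # workpiece-frame features
    Q = np.asarray(robot_pts, dtype=float)       # robot-frame measurements
    p_c, q_c = P.mean(axis=0), Q.mean(axis=0)    # centroids
    H = (P - p_c).T @ (Q - q_c)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T      # proper rotation
    t = q_c - R @ p_c                            # translation
    return R, t
```

At least three non-collinear features are needed to fix all six degrees of freedom (three of rotation and three of translation), which is why the operator must locate "enough features" before the fully autonomous operation can begin.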
3D locating devices, such as laser range-finding equipment or laser projectors, are large and expensive, and they introduce their own post-calibration residual bias errors in addition to those of the end effector camera usually relied on for machine vision measurements of workpiece features.