Gocator® Smart 3D Laser Line Profile Sensors paired with UR cobots deliver a smart 3D robot vision solution for applications such as measurement and inspection of elongated targets (e.g., cables, pipes, and conduits); applications that require larger fields of view than Gocator® 3D snapshot sensors cover (such as large-scale bin picking); and a wide range of vision guidance, flexible quality inspection, and smart pick-and-place applications. Easy UR integration with Gocator® 3D laser profilers allows production engineers to get a complete vision-guided robotic solution up and running with minimal cost and development time.

One end-user example is a Gocator® 2450 blue laser profiler mounted to a UR+ cobot performing glue bead scanning, measurement, and inspection in an inline consumer electronics assembly process. This setup is implemented at the application-specific level: the sensor is mounted to the UR robot to inspect the adhesive bead and ensure it is applied within the correct volume tolerance. The Gocator® 2450's blue laser 3D profiling quickly and accurately determines the width, position, height, and volume of the applied adhesive. 3D profiling is also contrast-invariant, so transparent or translucent glues can be detected and measured accurately. In addition, the 2450's large field of view provides maximum coverage in a single scan.
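Conceptually, the volume check described above reduces to integrating profile cross-sections along the scan direction: each laser profile gives a height slice of the bead, and summing slice areas over the travel distance approximates volume. The sketch below shows that arithmetic in plain Python; the function names, spacings, and tolerance values are all illustrative assumptions, not the Gocator® API.

```python
# Hypothetical sketch: estimating adhesive bead volume from 3D profile data.
# A laser profiler returns one height profile (Z vs. X) per encoder step;
# volume is approximated by integrating each profile's cross-sectional area
# along the travel (Y) direction. All names and numbers are illustrative.

def cross_section_area(heights, x_spacing, baseline=0.0):
    """Trapezoidal area (mm^2) of a single profile above the baseline."""
    clipped = [max(h - baseline, 0.0) for h in heights]
    return sum((a + b) / 2.0 * x_spacing
               for a, b in zip(clipped, clipped[1:]))

def bead_volume(profiles, x_spacing, y_spacing, baseline=0.0):
    """Approximate bead volume (mm^3) from a list of height profiles."""
    return sum(cross_section_area(p, x_spacing, baseline) * y_spacing
               for p in profiles)

# Example: a uniform bead, 5 profiles, 0.1 mm point spacing, 0.5 mm steps.
profiles = [[0.0, 1.0, 2.0, 1.0, 0.0]] * 5
volume = bead_volume(profiles, x_spacing=0.1, y_spacing=0.5)  # 1.0 mm^3
tolerance_ok = 0.9 <= volume <= 1.1   # pass/fail against a volume tolerance
```

In production, the sensor performs this kind of computation on board via its built-in measurement tools, so the robot program only consumes the final value and pass/fail decision.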
Gocator® 3D laser profilers are easily mounted onto the UR robot flange using a metal plate. You can then connect the sensor to a robot controller or PC application to perform sensor hand-eye calibration (using the Gocator® URCap plugin) and implement your desired robot movement.
The Gocator® Calibrate node in the URCap automatically performs hand-eye calibration between the sensor and the robot flange. When the node runs, it moves the robot-mounted sensor over the calibration target multiple times, capturing scans and recording the corresponding robot poses. After the node has run, the resulting sensor-robot transformation matrix is loaded onto the robot.
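The role of that transformation matrix can be illustrated with standard robotics math: a 4x4 homogeneous transform maps a point measured in the sensor frame into the robot flange frame, so scan results can drive robot motion. The matrix values below are made up for illustration; a real one comes from the calibration procedure.

```python
# Hypothetical sketch of how a hand-eye calibration result is used: the 4x4
# homogeneous transform T_flange_sensor maps points from the sensor frame
# into the robot flange frame. The matrix below is illustrative only.

def apply_transform(T, point):
    """Apply a 4x4 homogeneous transform to a 3D point [x, y, z]."""
    x, y, z = point
    v = [x, y, z, 1.0]
    return [sum(T[r][c] * v[c] for c in range(4)) for r in range(3)]

# Example: sensor rotated 90 degrees about Z and offset 100 mm along flange Z.
T_flange_sensor = [
    [0.0, -1.0, 0.0,   0.0],
    [1.0,  0.0, 0.0,   0.0],
    [0.0,  0.0, 1.0, 100.0],
    [0.0,  0.0, 0.0,   1.0],
]

p_sensor = [10.0, 0.0, 5.0]   # a measurement in sensor coordinates (mm)
p_flange = apply_transform(T_flange_sensor, p_sensor)  # [0.0, 10.0, 105.0]
```

Chaining this with the robot's own flange pose then yields the measured point in the robot base frame, which is what a pick or inspection move ultimately needs.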
Once hand-eye calibration is complete, you can connect to the sensor's web interface to load a job, trigger a scan, and return positional measurements in the X, Y, and Z axes.
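On the receiving side, the robot program or PC application typically parses the returned measurement values and decisions before acting on them. The sketch below parses a simple comma-separated reply into per-axis values and a pass/fail flag; the reply format shown here is an assumption for illustration only, not the sensor's actual output protocol.

```python
# Hypothetical sketch: turning a measurement reply string into X/Y/Z values
# and a pass/fail decision. The "axis,value,decision" fields separated by
# semicolons are an assumed format for illustration, not the real protocol.

def parse_measurements(reply):
    """Parse 'X,val,dec;Y,val,dec;Z,val,dec' into {axis: (value, passed)}."""
    results = {}
    for field in reply.strip().split(";"):
        axis, value, decision = field.split(",")
        results[axis] = (float(value), decision == "1")
    return results

reply = "X,12.345,1;Y,-0.210,1;Z,45.002,0"
results = parse_measurements(reply)
all_pass = all(ok for _, ok in results.values())   # False here: Z failed
```

Keeping the parsing in one small function like this makes it easy to adapt if the output format configured on the sensor changes.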