
Vision-Powered Cobots Improve Speed and Quality of Inspections

Cobots bring added flexibility to inspecting multiple sides of a product, without the need for a more complex mechanical structure.

RENATO OSAKI, OMRON AUTOMATION

Cobots assisted by AI-enabled machine vision can help manufacturers automate assembly and inspection processes, and they offer numerous advantages in industries ranging from automotive and packaging to food, electronics, and pharmaceuticals.

The first cobot was invented in the ’90s, with industrial cobots becoming available in the early 2000s. Today’s cobots perform additional functions in manufacturing, including pick-and-place and the assembly and disassembly of products. They work with machine safety systems by automatically detecting human workers in their vicinity and reducing their speed and force accordingly to prevent injuries.



A cobot inspecting automotive parts. Courtesy of Omron Automation.

Machine vision acts as the cobot’s eyes — reading, measuring, and helping the cobot understand its environment more effectively. Cobots can bring a product to a machine vision camera or bring the camera to a product. If multiple sides of a product must be inspected, the cobot can reposition itself or the product for each inspection.
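The repositioning routine can be as simple as stepping through a list of taught camera poses. The following is a minimal sketch, assuming hypothetical robot and camera driver objects with move_to() and trigger() calls; the pose values are placeholders, not real coordinates.

```python
# Minimal multi-side inspection cycle. The `robot` and `camera` objects and
# the pose values are illustrative assumptions, not a specific vendor API.

INSPECTION_POSES = {
    "front": (350.0, 0.0, 220.0, 0.0, 180.0, 0.0),      # x, y, z, rx, ry, rz
    "left":  (350.0, -150.0, 220.0, 0.0, 180.0, 90.0),
    "right": (350.0, 150.0, 220.0, 0.0, 180.0, -90.0),
    "top":   (350.0, 0.0, 320.0, 0.0, 180.0, 0.0),
}

def inspect_all_sides(robot, camera):
    """Reposition the end-of-arm camera for each side and collect results."""
    results = {}
    for side, pose in INSPECTION_POSES.items():
        robot.move_to(pose)               # reposition the camera (or the part) for this side
        results[side] = camera.trigger()  # run the vision inspection at this pose
    return results
```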

Reducing the number of cameras

Without a cobot, multiple cameras normally need to be installed to inspect large parts. The number of cameras depends on the size and number of parts within the field of view, along with the complexity of the inspections. For large-part inspection, a cobot can instead travel to multiple locations, easily manipulating either the product or the camera within the inspection field.
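As a rough sizing sketch, the number of views required to cover a large part at a target resolution can be estimated from the sensor size and desired pixels per millimeter. The sensor resolution, resolution target, and part dimensions below are illustrative assumptions.

```python
# Estimate how many camera views a large part needs at a target resolution:
# that many fixed cameras, or that many stops for one cobot-mounted camera.
import math

def views_needed(part_w_mm, part_h_mm, sensor_px=(2448, 2048), px_per_mm=5.0, overlap=0.2):
    """Number of views to cover a part at px_per_mm, with some overlap between views."""
    fov_w = sensor_px[0] / px_per_mm * (1 - overlap)   # usable field of view width, mm
    fov_h = sensor_px[1] / px_per_mm * (1 - overlap)   # usable field of view height, mm
    return math.ceil(part_w_mm / fov_w) * math.ceil(part_h_mm / fov_h)

# A 1.2 m x 0.8 m panel at 5 px/mm works out to 12 views with these assumed numbers.
print(views_needed(1200, 800))
```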

A bin picking system that incorporates a 3D vision-guided cobot. Courtesy of Omron Automation.

Cobots bring more flexibility to inspecting multiple sides of a product without the need for a more complex mechanical structure. They enable faster machine setup when a high variety of products must be inspected, and different paths and tasks can be easily programmed and selected automatically. Positions can be defined before operation using a hand-guiding function, an intuitive option in which an operator manually moves the cobot to the inspection points and records each position. The cobot software can also be used to edit paths or create new ones between two or more of these positions, and programmers can access that software from a variety of devices, including laptops, tablets, and cobot teach pendants.
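A hand-guided teaching workflow can be sketched as follows. The freedrive, pose-reading, and path calls are hypothetical names standing in for whatever the cobot SDK provides; they are not a real vendor API.

```python
# Sketch of teaching inspection points by hand guiding, assuming a hypothetical
# cobot SDK with a freedrive (hand-guiding) mode and a way to read the current pose.

def teach_inspection_points(robot, point_names):
    """Record one pose per named inspection point while the arm is hand guided."""
    taught = {}
    robot.enable_freedrive()                      # operator can now move the arm by hand
    for name in point_names:
        input(f"Move the arm to '{name}' and press Enter to record...")
        taught[name] = robot.get_current_pose()   # store the taught position
    robot.disable_freedrive()
    return taught

def build_path(taught, order):
    """Create an ordered path between two or more taught positions."""
    return [taught[name] for name in order]

# Usage:
#   points = teach_inspection_points(robot, ["approach", "side_A", "side_B"])
#   path   = build_path(points, ["approach", "side_A", "side_B", "approach"])
```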

Collaborative mode

To augment a cobot’s speed and reach, an external conveyance (usually a power-driven slide with rail bearings, a ball screw, and a servo motor) may be added as a seventh axis, extending the cobot’s work envelope. To maintain throughput, a cobot can operate at full speed when an operator is not sharing the same workspace. Safety sensors, usually safety light curtains (arrays of LED emitters paired with receiving photoelectric sensors) or a safety laser scanner, are added to monitor for the presence of humans. If a human is detected, the cobot switches to a collaborative mode that reduces torque and speed; should an operator come into accidental contact with the robot, it would be moving slowly enough that the risk of injury is minimal.
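The speed-switching behavior can be mirrored in application code along these lines. This is only an illustration: the certified safety function itself runs in the safety controller and sensors, and the object names and speed limits below are assumptions.

```python
# Illustrative speed-mode switch. The real safety-rated stop/slow logic lives in
# the certified safety controller; this sketch only mirrors that behavior at the
# application level. Sensor, robot, and limit values are assumed for illustration.

FULL_SPEED_MM_S = 1000.0    # assumed full-speed limit when the workspace is clear
COLLAB_SPEED_MM_S = 250.0   # assumed reduced speed in collaborative mode

def update_speed_mode(robot, safety_sensor):
    """Poll the light curtain / laser scanner and apply the matching speed limit."""
    if safety_sensor.human_detected():
        robot.set_speed_limit(COLLAB_SPEED_MM_S)   # collaborative mode: reduced speed and torque
    else:
        robot.set_speed_limit(FULL_SPEED_MM_S)     # workspace clear: full throughput
```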

A cobot and a vision system. Courtesy of Omron Automation.

Additional considerations for the cobot and machine vision work cell include the need for ample electrical power and, if the end-effector requires it, pressurized air or vacuum. For the vision portion of the application, the deployment area should ideally be free of direct sunlight; other types of ambient light can be managed with filters and shrouding.

For end-of-arm cameras, the cobot will generally have a programmed position at the correct height above the parts and will move to this location when it is clear of humans. For bin-picking 3D cameras, the robot will approach the bin and take a picture. If a part is not in the proper position for a pick, the robot will take another image at a slightly different angle. The bin and nearby objects are programmed into the system for obstacle avoidance, and the cobot will only move into these positions if no humans are present within its safety envelope.
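The retry behavior for bin picking can be sketched as a short loop. The 3D camera, robot driver, safety sensor, and pose values below are hypothetical placeholders used only to show the flow.

```python
# Sketch of the bin-picking retry loop: image the bin, and if no valid pick is
# found, try again from a slightly different viewing angle. All object names
# and coordinates are illustrative assumptions.

MAX_ATTEMPTS = 3

def camera_view_pose(attempt):
    """Nominal view over the bin, tilted a few degrees more on each retry."""
    x, y, z = 400.0, 0.0, 500.0                      # placeholder bin-center coordinates, mm
    return (x, y, z, 0.0, 180.0 - 5.0 * attempt, 0.0)

def pick_from_bin(robot, camera_3d, safety_sensor):
    """Locate a part in the bin and pick it, retrying with a new view if needed."""
    for attempt in range(MAX_ATTEMPTS):
        if safety_sensor.human_detected():
            return None                              # only move when the safety envelope is clear
        robot.move_to(camera_view_pose(attempt))     # shift the viewpoint slightly on each retry
        pick_pose = camera_3d.find_pick_pose()       # 3D part localization
        if pick_pose is not None:
            robot.move_to(pick_pose)                 # planner avoids the taught bin walls and fixtures
            robot.close_gripper()
            return pick_pose
    return None                                      # nothing pickable after several views
```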

A graphical user interface for programming a cobot. Courtesy of Omron Automation.

Cobots may use whatever imaging technology is best for a given application. All types of vision inspection are available, including AI-based or standard rules-based inspection, line-scan, 2D, 3D, monochromatic, color, hyperspectral, or multispectral imaging. AI vision tools are especially useful for detecting defects such as scratches on different types of surfaces. Because this kind of defect has no fixed definition, software such as Omron’s AI Scratch Detect tool identifies defects based on a pretrained model that can find contrast where little to none can be seen by the human eye. Even if the scratch contrast is inverted, from a light scratch on a dark surface to a dark scratch on a light surface, the algorithm will still find it. Other forms of inspection software look for patterns in product images based on “good” or “bad” images taught by a human programmer.
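The snippet below is not Omron’s AI Scratch Detect tool; it is a simple rules-based stand-in using OpenCV morphology that makes the contrast idea explicit, flagging both light-on-dark and dark-on-light scratches (the inversion case described above). Kernel size and threshold are assumed values.

```python
# Rules-based scratch highlighting with OpenCV morphology: top-hat picks up
# light scratches on dark backgrounds, black-hat picks up dark scratches on
# light backgrounds, so the result is insensitive to contrast inversion.
import cv2
import numpy as np

def scratch_mask(gray, kernel_size=15, thresh=25):
    """Return a binary mask of thin high-contrast marks (candidate scratches)."""
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (kernel_size, kernel_size))
    bright = cv2.morphologyEx(gray, cv2.MORPH_TOPHAT, kernel)    # light scratch on dark surface
    dark = cv2.morphologyEx(gray, cv2.MORPH_BLACKHAT, kernel)    # dark scratch on light surface
    combined = np.maximum(bright, dark)                          # keep either polarity
    _, mask = cv2.threshold(combined, thresh, 255, cv2.THRESH_BINARY)
    return mask

# Usage:
#   gray = cv2.imread("part.png", cv2.IMREAD_GRAYSCALE)
#   defects = scratch_mask(gray)
#   print("defect pixels:", int(np.count_nonzero(defects)))
```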


Automating repetitive tasks

Collaborative robots and vision systems can automate repetitive inspection tasks that may cause ergonomic issues and job dissatisfaction for their human counterparts. Consequently, such automation helps companies harness the true potential of their human resources, reducing turnover and the cost of training new hires. A fixed camera may need a much larger field of view to cover a part, which reduces its effective resolution and limits its depth of field. By mounting a camera on the end of a cobot arm, inspections can take full advantage of the sensor’s resolution and of special lighting (such as low-angle lighting to better illuminate scratches), with more precise focus to obtain a clearer image.

Surfaces can be reflective or matte. Glass, shiny metal, and shiny painted surfaces are common in the automotive industry, and flaws can be located at any angle on any surface of a car and its thousands of parts. Ambient lighting cannot effectively highlight defects, especially inside door panels, trunks, glove compartments, and center console storage areas. These areas can also be challenging for human inspectors to access for visual observation; a small camera installed on the end of a robot arm can reach these locations far more easily.
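The field-of-view tradeoff mentioned above can be put in numbers with a back-of-the-envelope calculation. The sensor width and field-of-view figures here are illustrative assumptions, not measurements from a specific camera.

```python
# Effective resolution comparison: a fixed camera covering a whole panel versus
# the same sensor carried by the cobot to image one small region at a time.

SENSOR_PX_WIDE = 2448          # assumed sensor width in pixels

def px_per_mm(field_of_view_mm):
    """Effective spatial resolution for a given horizontal field of view."""
    return SENSOR_PX_WIDE / field_of_view_mm

fixed = px_per_mm(1200.0)      # fixed camera viewing a ~1.2 m panel
on_arm = px_per_mm(120.0)      # arm-mounted camera viewing a 120 mm patch

print(f"fixed camera:  {fixed:.1f} px/mm")    # ~2.0 px/mm
print(f"on-arm camera: {on_arm:.1f} px/mm")   # ~20.4 px/mm, roughly 10x finer detail
```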

In addition to inspecting large parts in an assembly process, cobots’ use of machine vision is critical for inspecting parts for which contamination by human workers may be an issue. Common examples include inspecting parts before and after the paint process; inspecting parts inside an automobile while it is attached to a carrier; inspecting colored parts, cabling, and tinted windows; reading barcodes and 2D codes; and performing optical character recognition (OCR) to ensure proper parts are used in assemblies and subassemblies. It is unproductive to have human workers complete the hundreds of time-consuming inspections that are required during the assembly of an automobile. Automated inspection greatly improves the throughput of a production line and provides a consistent, objective level of quality for every inspection. The resulting data can also be saved for each car so there is a record of its quality before it is shipped to a customer.
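The code-reading and record-keeping step can be sketched as follows. OpenCV’s QR detector stands in here for an industrial code reader, and the per-vehicle log file layout is an assumption for illustration.

```python
# Decode a 2D code from an inspection image and append the read to a simple
# per-vehicle traceability log (one JSON record per line). The log format and
# file name are illustrative assumptions.
import cv2
import json
import datetime

def log_code_read(image_path, vehicle_id, station, log_path="traceability.jsonl"):
    """Decode a 2D code and append the result to the vehicle's quality record."""
    image = cv2.imread(image_path)
    data, _, _ = cv2.QRCodeDetector().detectAndDecode(image)
    record = {
        "vehicle_id": vehicle_id,
        "station": station,
        "decoded": data or None,                          # empty string means no read
        "timestamp": datetime.datetime.now().isoformat(),
    }
    with open(log_path, "a") as f:
        f.write(json.dumps(record) + "\n")
    return record
```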

More than ten years ago, human inspectors had to measure and verify everything themselves, which greatly slowed the process. They relied on their eyes and subjective judgment, and used calipers and other measuring equipment, often touching many parts of the car during assembly and risking damage or contamination. If steps were skipped in the interest of faster throughput, the risk of factory recalls multiplied. Recalls in various industries have greatly increased during the last ten years, but they are becoming easier to process thanks to traceability: the factory no longer needs to recall all products, only the specific ones that are affected.

The cobot and vision system combination is capable of aiding applications in nearly all industries, including automotive and electric vehicles, medical and pharmaceutical, electronics and semiconductor, logistics and warehousing, and food and commodities. In addition to standard models, special cobot configurations can meet specific requirements in certain industries. Cobots can be assembled with food-grade grease when used in the food, beverage, and pharmaceutical industries, and they can be modified to meet cleanroom standards for the medical, electronics, and semiconductor industries.

The many ways that cobots and machine vision will work together remain to be seen. Multi-arm coordinated assembly and disassembly are already functional, but they will be improved with full force feedback from the collaborative robots’ safety functions and by enhancing their 3D perception of the environment and workpieces. More advanced vision inspections augmented with AI are on the way, too. Soon, a robot arm will be able to move around an object and stitch together an image from all sides. This advancement will allow 1D and 2D codes, as well as human-readable text wrapped around curved surfaces, to be read more easily. The technique will also be important in defect detection, making it easy to verify whether holes were drilled all the way through an object or whether an adhesive was applied in the correct place and in the right quantity. In addition, the use of 3D safety vision for identifying the presence of humans in proximity to cobots is on the horizon. This will integrate well with the machine safety features of a cobot and allow it to go beyond the limitations of safety light curtains and scanners. This advanced functionality can help to avoid collisions with both stationary and moving objects in the workspace.
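The multi-view stitching idea can be illustrated with OpenCV’s generic stitcher: capture overlapping images as the arm moves around the part, then merge them so a code or text wrapped around a curved surface can be read in one pass. This is a sketch of the concept, not a shipping cobot feature.

```python
# Merge overlapping views captured from several arm positions into one image
# using OpenCV's built-in stitcher.
import cv2

def stitch_views(image_paths):
    """Stitch overlapping views of a part into a single image."""
    images = [cv2.imread(p) for p in image_paths]
    stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)   # SCANS mode suits flat or near-flat surfaces
    status, panorama = stitcher.stitch(images)
    if status != cv2.Stitcher_OK:
        raise RuntimeError(f"stitching failed with status {status}")
    return panorama
```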

Meet the author

Renato Osaki, Omron’s product manager for motion and robots, has an MBA in marketing from Fundação Getúlio Vargas in Brazil and a postgraduate degree in automation and control engineering from Centro Universitário do Instituto Mauá de Tecnologia; email: [email protected].


Published: December 2023
