Paula M. Powell, Senior Editor
With the advent of graphical programming software, engineers no longer need to write and debug countless lines of text-based code, even in the most complex multicamera environments. In the pharmaceutical industry, this software can help bring an inspection process online quickly.
The key to optimizing vision-based tasks lies in understanding the process to be inspected. Kyle Voosen, machine vision marketing engineer with National Instruments Corp. in Austin, Texas, reports that critical inspection tasks tend to fall into two categories: compound analysis and pharmaceutical packaging. In either case, multiple inspection steps are often necessary. For example, consider a blister pack of pills. Before the pills are inserted, packs are checked to ensure that no bubbles are depressed; later, the packs are rechecked before the foil backing is sealed in place, and another inspection may be necessary to check the foil seal. Several steps, Voosen said, can be monitored using gray-scale vision technology. Others, such as the final inspection, could require color camera technology to check that the pills are of the right type.
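The multistep blister pack inspection described above can be sketched as a chain of checks, each tied to a stage on the packaging line. The sketch below is purely illustrative: the function names, thresholds, and synthetic frames are assumptions, not any vendor's actual inspection code.

```python
import numpy as np

def check_no_depressed_bubbles(gray_frame, min_mean=0.5):
    """Gray-scale check before pill insertion: a depressed bubble
    lowers the mean intensity in the frame (threshold is an assumption)."""
    return gray_frame.mean() >= min_mean

def check_pill_color(color_frame, expected_rgb, tol=0.15):
    """Color check after insertion: compare the average pixel color
    to the expected pill color within a tolerance (values are assumptions)."""
    mean_rgb = color_frame.reshape(-1, 3).mean(axis=0)
    return bool(np.all(np.abs(mean_rgb - expected_rgb) <= tol))

def inspect_blister_pack(gray_frame, color_frame, expected_rgb):
    """Run the checks in packaging-line order; any failure rejects the pack."""
    return bool(check_no_depressed_bubbles(gray_frame)) and \
           check_pill_color(color_frame, expected_rgb)

# Synthetic frames standing in for camera captures:
gray = np.full((32, 32), 0.8)                   # intact bubbles
color = np.full((32, 32, 3), (0.9, 0.9, 0.9))   # white pills
print(inspect_blister_pack(gray, color, np.array([0.9, 0.9, 0.9])))  # True
```

In practice each check would operate on a region of interest per blister cell rather than the whole frame, but the pass/fail chaining is the point here.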
Blister pack inspection is often a multistep process at various stages on the packaging line.
Some observers, including Jason Mulliner, National Instruments’ product marketing manager for machine vision, also see IR camera technology gaining acceptance on the inspection line; for example, to check IV bags for fluid leaks.
Another emerging application involves monitoring the homogeneity of drug components. Bengt Lagerholm, senior engineer with AstraZeneca in Molndal, Sweden, recently used National Instruments’ LabVIEW graphical programming software to develop a real-time method for assessing the state of the blending process. Engineers visualized chemical content using a near-IR CCD camera with a spectral sensitivity of 900 to 1700 nm, relying on a bandpass optical filter in front of the camera lens to increase the selectivity for components of interest. The area of inspection in the batch container was determined with a support feature in the graphical programming toolkit.
The technique replaces another method still used in the industry, which samples a batch with a tiny spot of near-IR light and extrapolates that data to the complete mixture. The near-IR camera offers a much larger imaging area and, hence, more data. The researchers handled the image analysis burden that this data volume imposes by using vision software to correlate each pixel value in the image with the near-IR reflectance of the components and then summarizing the chemical information in histograms. Next, they extracted blending process information from the data matrix with principal component analysis, a multivariate projection method that extracts and highlights systematic variation in a data matrix. After initial start-up of the blending process, this feedback reaches a steady state; if it starts to deviate, Lagerholm and colleagues can recheck the process. The entire monitoring task can be handled with personal computers.
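As a rough illustration of the analysis pipeline described above — not AstraZeneca’s actual implementation — the following Python sketch summarizes each near-IR frame as a reflectance histogram and applies principal component analysis to the matrix of histograms collected over time. The frame sizes, histogram bins, and synthetic data are all assumptions.

```python
import numpy as np

def frame_histogram(frame, bins=32, value_range=(0.0, 1.0)):
    """Summarize one near-IR frame as a normalized reflectance histogram."""
    hist, _ = np.histogram(frame, bins=bins, range=value_range)
    return hist / hist.sum()

def pca(data_matrix, n_components=2):
    """Principal component analysis via SVD of the mean-centered matrix.
    Rows are observations (frames); columns are variables (histogram bins)."""
    centered = data_matrix - data_matrix.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    components = vt[:n_components]      # principal directions
    scores = centered @ components.T    # projection of each frame onto them
    return scores, components

# Synthetic stand-ins for a sequence of near-IR frames during blending:
rng = np.random.default_rng(0)
frames = [rng.uniform(0.3, 0.7, size=(64, 64)) for _ in range(20)]
hists = np.array([frame_histogram(f) for f in frames])
scores, _ = pca(hists)

# A blend approaching homogeneity shows the scores settling toward a
# steady state; a later drift in the scores flags a deviation to recheck.
```

The histogram step compresses each frame from thousands of pixels to a few dozen numbers, which is what makes the subsequent projection cheap enough to run on a personal computer.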
Ultimately, Voosen and Mulliner believe that success in any vision-based inspection task depends on how well one can prototype the process. “No one knows all the time which algorithms will eventually be required to get the inspection to work,” Voosen said. “The best way to do that is to set up a prototype run with the graphical programming software, then play with the system and get it to work.”
He noted that the key to easy prototyping lies in the configuration software, which should also enable modifications to the vision programming throughout the life of the process. Engineers must be able to tweak the vision system’s algorithms quickly to adjust for common variables such as changes in lighting.