
Machine Vision, Artificial Nose Combine to Monitor Cooked Chicken

MOSCOW, Jan. 22, 2021 — Skoltech (Russia) scientists are working to combine machine vision with an artificial nose to ensure the proper level of doneness for cooked chicken. The technology aims to help restaurants monitor and automate cooking processes.

In pursuit of the perfect chicken, the researchers employed an industrial camera and an array of sensors (called an e-nose) designed to detect the presence of certain components of an odor. These devices monitor the chicken as it cooks — looking and smelling to determine when it is fully cooked.

The work evolved from a student project in the lab by Ainul Yaqin, a co-author of the research paper. Yaqin traveled to Novosibirsk to test whether chemical sensors developed by the lab could monitor the effectiveness of industrial filters in restaurant ventilation. That project led to experiments with the smell profile of grilled chicken.

“At the same time, to determine the proper doneness state, one cannot rely on the ‘e-nose’ only, but has to use computer vision — these tools give you a so-called electronic panel (a panel of electronic ‘experts’),” said Albert Nasibulin, a professor at Skoltech and Aalto University (Finland). “Building on the great experience in computer vision techniques of our colleagues from Skoltech CDISE, together we tested the hypothesis that, when combined, computer vision and electronic nose provide more precise control over the cooking.”

The researchers combined the two techniques to observe the chicken, accurately and without contact, as it cooked. They chose chicken because of its global popularity, grilling it extensively to “train” their instruments to evaluate and predict how well it was cooked.

“Images of grilled chickens were obtained using the DFK 33UX250 industrial camera,” said Fedor Fedorov, senior research scientist at Skoltech’s Center for Photonics and Quantum Materials. “We employed the RGB color model in our analysis. It is a color model for 8-bit images, in which each pixel channel is assigned an integer value in the range of 0 to 255, and colors are created by mixing red (R), green (G), and blue (B). RGB values were taken as features.”
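
As a rough illustration of this feature-extraction step (not the authors' code), the Python sketch below computes per-channel color statistics from an 8-bit RGB image. The use of per-channel mean and standard deviation, and of a saved image file rather than a live camera feed, are assumptions made for the example.

```python
# A minimal sketch of RGB feature extraction, assuming per-channel mean and
# standard deviation as the feature vector (the exact features beyond
# "RGB values" are not specified in this article).
import numpy as np
from PIL import Image

def rgb_features(image_path: str) -> np.ndarray:
    """Return per-channel mean and std of an 8-bit RGB image (values 0-255)."""
    img = np.asarray(Image.open(image_path).convert("RGB"), dtype=np.float64)
    pixels = img.reshape(-1, 3)
    return np.concatenate([pixels.mean(axis=0), pixels.std(axis=0)])

# Hypothetical usage:
# features = rgb_features("grilled_chicken_frame_0042.png")
```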

The researchers tested several dimensionality reduction techniques, including linear discriminant analysis, latent Dirichlet allocation, and t-distributed stochastic neighbor embedding, for the analysis of the obtained images.
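
Under assumed placeholder data, the sketch below shows how such a dimensionality reduction step might look with scikit-learn. The feature matrix, doneness labels, and parameters such as the t-SNE perplexity are illustrative and not taken from the paper.

```python
# Illustrative dimensionality reduction on placeholder data (not the authors' code).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
X = rng.random((90, 6))          # placeholder image features, one row per image
y = np.repeat([0, 1, 2], 30)     # placeholder labels: under-, well-, overcooked

# Supervised projection: LDA keeps at most (n_classes - 1) components.
X_lda = LinearDiscriminantAnalysis(n_components=2).fit_transform(X, y)

# Unsupervised 2D embedding for visualization.
X_tsne = TSNE(n_components=2, perplexity=20, random_state=0).fit_transform(X)

print(X_lda.shape, X_tsne.shape)  # (90, 2) (90, 2)
```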


The e-nose comprised eight sensors that detected smoke, alcohol, carbon monoxide, and other compounds, as well as temperature and humidity; the researchers placed the e-nose inside the ventilation system. Photos taken by the camera were fed into an algorithm that looks for patterns in the data. To define the changes in odor consistent with the various stages of the grilling process, the researchers used thermogravimetric analysis to monitor the number of volatile particles available for the e-nose to detect, differential mobility analysis to measure the size of aerosol particles, and mass spectrometry.
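
One common form of low-level (feature-level) data fusion is to concatenate readings from both modalities into a single feature vector before classification. The hedged sketch below illustrates that idea on synthetic data; the sensor count, feature sizes, and choice of classifier are placeholders, not details from the study.

```python
# Low-level (feature-level) fusion sketch on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 120
enose = rng.random((n, 8))              # eight e-nose channels per sample
image = rng.random((n, 6))              # image-derived RGB features per sample
X = np.hstack([enose, image])           # fuse by concatenating features
y = np.repeat([0, 1, 2], n // 3)        # placeholder doneness labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # chance-level on random data
```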

The researchers recruited 16 Ph.D. students and researchers to taste-test the grilled chicken breast and rate its tenderness, juiciness, intensity of flavor, appearance, and overall doneness on a 10-point scale. This data was matched against the analytical results to compare measured doneness with human perception.

Using these techniques, the team reported that its system accurately identified undercooked, well-cooked, and overcooked chicken breast. For the system to work with other cuts of chicken, it would need to be retrained on new data.

“We believe we can use other techniques of data handling, i.e., artificial neural networks. Also, the application of a multispectral camera might help to improve the results. We can also consider high-level data fusion, while in the paper, low-level data fusion was used,” Fedorov told Photonics Media.
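
To illustrate the distinction Fedorov draws, the sketch below contrasts that low-level approach with high-level (decision-level) fusion, in which each modality gets its own classifier and their outputs are combined afterward. All data and model choices here are placeholders.

```python
# High-level (decision-level) fusion sketch on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 120
enose = rng.random((n, 8))
image = rng.random((n, 6))
y = np.repeat([0, 1, 2], n // 3)

# Train one classifier per modality, then average their class probabilities.
clf_enose = LogisticRegression(max_iter=1000).fit(enose, y)
clf_image = LogisticRegression(max_iter=1000).fit(image, y)
proba = (clf_enose.predict_proba(enose) + clf_image.predict_proba(image)) / 2
print((proba.argmax(axis=1) == y).mean())
```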

The researchers plan to test their sensors in restaurant kitchen environments. Another potential application could be in sniffing out rotten meat at its early stages of spoilage, when changes in the smell profile are too subtle for human perception.

“We believe these systems can be integrated into industrial kitchens and even in usual kitchens as a tool that can help and advise about the doneness degree of your meat, when direct temperature measurement is not possible or not effective,” Fedorov said.

The research was published in Food Chemistry (www.doi.org/10.1016/j.foodchem.2020.128747).

 


Published: January 2021
