
Human-Like Vision for Pick-and-Place Robotics

Jul 21, 2021
About This Webinar
Full SKU coverage, fewer mispicks, and completely emptying a bin of random objects are success factors that can be hard to achieve when an automation system is built on low-end or slow stereo depth sensors. Raman Sharma presents several before-and-after examples and shows how structured light improves the rates of success in detecting, picking, and placing most objects in manufacturing and logistics. With the mantra "See more. Do more," this session covers colored, reflective, shiny, small, and large parts to demonstrate how AI, detection algorithms, and a robotic piece-picking system can use human-like vision to finally achieve the goal of universal picking and placing.
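
The abstract contrasts low-end or slow stereo depth sensing with structured light as the 3D input to detection and picking algorithms. As a purely illustrative sketch (not Zivid's SDK and not the presenter's method), the Python/NumPy snippet below shows how a naive pick point might be chosen from the kind of dense, NaN-marked point cloud a structured-light camera produces; the array shape, millimeter units, and downward-looking camera are assumptions made for the example.

import numpy as np

# Illustrative only: choose a pick point from a structured-light point cloud by
# taking the topmost valid surface point in the bin. Production systems use
# trained detection networks and grasp planners; this only shows the kind of
# data a high-fidelity 3D camera hands to them.
def select_pick_point(xyz):
    """Return the XYZ coordinate (mm) of the topmost valid point.

    xyz: (H, W, 3) point cloud, camera looking down along +Z;
         missing measurements are NaN, as is typical for 3D cameras.
    """
    flat = xyz.reshape(-1, 3)
    valid = flat[~np.isnan(flat).any(axis=1)]   # drop holes in the cloud
    if valid.size == 0:
        raise ValueError("empty point cloud - nothing left to pick")
    # Smallest Z is closest to the camera, i.e. the topmost surface in the bin.
    return valid[np.argmin(valid[:, 2])]

# Synthetic 4x4 cloud: flat bin floor at 1 m, one box 150 mm proud of it,
# and one dropout such as a shiny surface might cause.
cloud = np.zeros((4, 4, 3))
cloud[:, :, 2] = 1000.0
cloud[1, 2, 2] = 850.0
cloud[3, 3, :] = np.nan
print(select_pick_point(cloud))   # the raised box: x=0, y=0, z=850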

***This presentation premiered during the 2021 Vision Spectra Conference. For more information on Photonics Media conferences, visit events.photonics.com.

About the presenter:
Raman Sharma is responsible for sales and marketing for Zivid in the Americas. He is passionate about technology and building businesses from the ground up. Zivid is his third startup in his 20-year career. Sharma holds bachelor's and master's degrees in electrical and computer engineering from Carnegie Mellon University and an MBA from the Kellogg School of Management at Northwestern University.
Tags: artificial intelligence, automation, machine vision, robotics, Vision Spectra, industrial, pick and place