Abstract
We develop a machine vision system as a key component of a robot-assisted packaging system, which guides robot arms to pack roast sauries into cans. To generate a gripping strategy, the system must not only detect the roast saury region but also estimate its geometric parameters. In addition, depending on the canning requirements, it must distinguish between types of fish parts. To address these challenges, we propose a novel rule-based matching method combined with an improved efficient graph-based image segmentation (EGIS) method for sensing the fish parts. Specifically, the matching method applies our originally designed rule-based similarity within a genetic algorithm framework combined with the deterministic crowding technique, and is used to sense one type of fish part. We further improve EGIS by introducing a shape restriction to handle leftover fish parts. Experiments were conducted on two different types of fish part in a real factory environment. Our method achieved a mean location accuracy of 93.5% with a practical average processing time of 2.6 s per image. © 2020 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
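The abstract names a genetic algorithm with deterministic crowding as the search framework for the rule-based matching, but the paper's similarity rule and solution encoding are not given here. As a minimal sketch of deterministic crowding in isolation — each child competes only against its most similar parent, which preserves population diversity — with the `fitness`, `crossover`, `mutate`, and `distance` operators as hypothetical placeholders:

```python
import random

def deterministic_crowding(population, fitness, crossover, mutate,
                           distance, generations=100):
    """Genetic algorithm with deterministic crowding.

    Each generation, parents are paired at random; every child then
    competes only against the parent it is closest to (under `distance`)
    and replaces that parent only if strictly fitter.
    """
    pop = list(population)
    for _ in range(generations):
        random.shuffle(pop)
        next_pop = []
        for p1, p2 in zip(pop[0::2], pop[1::2]):
            c1, c2 = crossover(p1, p2)
            c1, c2 = mutate(c1), mutate(c2)
            # Match children to parents so total parent-child distance
            # is minimized, then hold the two local tournaments.
            if (distance(p1, c1) + distance(p2, c2)
                    <= distance(p1, c2) + distance(p2, c1)):
                next_pop.append(c1 if fitness(c1) > fitness(p1) else p1)
                next_pop.append(c2 if fitness(c2) > fitness(p2) else p2)
            else:
                next_pop.append(c2 if fitness(c2) > fitness(p1) else p1)
                next_pop.append(c1 if fitness(c1) > fitness(p2) else p2)
        pop = next_pop
    return pop
```

Because a parent survives unless its matched child is strictly fitter, the best fitness in the population never decreases, while the distance-based pairing keeps individuals near different optima from displacing one another — the property that makes the scheme useful for locating multiple fish parts in one image.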
IEEJ Transactions on Electrical and Electronic Engineering