A flexible, adaptive, and programmable sensory control system for robotic acquisition of jumbled parts is described. The system consists of independent modules for vision and gripper control that guide a robot to acquire unoriented parts from a bin. The acquired parts are transported to a fixture and placed according to a desired alignment. The parts in the bin have either elongated or spherical symmetry, and all parts in a given bin are identical except for minor dimensional and textural variations. These objects may be colored or carry graphics, possess variable cross sections along their length, and have mixed material content and reflectivities. The system operates by taking a high-resolution gray-scale image of the bin of parts with an overhead camera under ambient illumination. Overlapping parts in the image are separated by selectively suppressing the high spatial frequencies at the touching or occluding edges; these portions of the image are removed by dynamic range expansion, histogram-based adaptive image enhancement, and nonlinear homomorphic filtering. Algorithms are developed for fast, iterative data compression and segmentation and for estimation of the location and orientation of a cluster, irrespective of its shape. The photobeam, collision, and pressure sensors on the parallel-jaw gripper are monitored during acquisition and transport of experimental workpieces such as fuel filters, curved plier blanks, multicolored felt pens, L-shaped fuel links, ball bearings, and a mixture of shiny, semirusted, threaded, and unthreaded bolts of various lengths. In one image-processing cycle of 2 s or less, the system can compute the locations and orientations of three objects, with a successful acquisition rate of approximately 95%.
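The abstract names a pipeline of contrast enhancement, nonlinear homomorphic filtering, segmentation, and moment-based pose estimation. The following is a minimal sketch of those generic steps, not the authors' implementation: the filter cutoff, gains, threshold, and function names are illustrative assumptions, and the input image is synthetic.

```python
# Hedged sketch of the abstract's image-processing stages: histogram-based
# enhancement, homomorphic low-pass filtering to suppress high spatial
# frequencies at touching/occluding edges, thresholding, and estimation of a
# cluster's location and orientation from image moments. All parameters are
# assumptions for illustration only.
import numpy as np

def equalize_histogram(img):
    """Histogram-based enhancement: remap gray levels through the image CDF."""
    hist, _ = np.histogram(img.ravel(), bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())       # normalize to [0, 1]
    return (cdf[img.astype(np.uint8)] * 255).astype(np.uint8)

def homomorphic_lowpass(img, cutoff=0.05, eps=1.0):
    """Nonlinear homomorphic filtering: log -> FFT -> low-pass -> exp.
    Attenuating high spatial frequencies softens the sharp transitions where
    parts touch or occlude one another."""
    log_img = np.log(img.astype(np.float64) + eps)
    spectrum = np.fft.fftshift(np.fft.fft2(log_img))
    rows, cols = img.shape
    y, x = np.ogrid[:rows, :cols]
    radius = np.hypot(y - rows / 2, x - cols / 2) / max(rows, cols)
    mask = np.exp(-(radius / cutoff) ** 2)                  # Gaussian low-pass
    filtered = np.fft.ifft2(np.fft.ifftshift(spectrum * mask)).real
    return np.exp(filtered) - eps

def cluster_pose(binary):
    """Centroid and orientation of a segmented cluster from first and second
    central image moments, independent of the cluster's shape."""
    ys, xs = np.nonzero(binary)
    cx, cy = xs.mean(), ys.mean()
    mu20 = ((xs - cx) ** 2).mean()
    mu02 = ((ys - cy) ** 2).mean()
    mu11 = ((xs - cx) * (ys - cy)).mean()
    theta = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return (cx, cy), theta

# Illustrative run on a synthetic gray-scale "bin" image.
image = (np.random.rand(256, 256) * 255).astype(np.uint8)
enhanced = equalize_histogram(image)
smoothed = homomorphic_lowpass(enhanced)
segmented = smoothed > smoothed.mean()                      # assumed threshold
centroid, angle = cluster_pose(segmented)
```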