Abstract

Three striking and fundamental characteristics of human shape recognition are its invariance with viewpoint in depth (including scale), its tolerance of unfamiliarity, and its robustness to the particular contours present in an image (as long as the same convex parts [geons] can be activated). These characteristics are expressed in an implemented neural network model (Hummel & Biederman, 1992) that takes a line drawing of an object as input and generates a structural description of geons and their relations, which is then used for object classification. The model's capacity for structural description derives from its solution to the dynamic binding problem of neural networks: independent units representing an object's parts (in terms of their shape attributes and interrelations) are bound temporarily when those attributes occur in conjunction in the system's input. Temporary conjunctions of attributes are represented by synchronized activity among the units representing those attributes. Specifically, the model induces temporal correlation in the firing of activated units to (1) parse images into their constituent parts, (2) bind together the attributes of a part, and (3) determine the relations among the parts and bind those relations to the parts to which they apply. Because it conjoins independent units only temporarily, dynamic binding allows tremendous economy of representation and permits the representation to reflect an object's attribute structure. The model's recognition performance conforms well to recent results from shape priming experiments. Moreover, the manner in which the model's performance degrades due to accidental synchrony produced by an excess of phase sets suggests a basis for a theory of visual attention.
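The binding-by-synchrony idea described above can be illustrated with a minimal sketch (not the authors' implementation; all names, the number of phase slots, and the toy attribute vocabulary are assumptions for illustration only): attribute units belonging to the same part fire in a shared phase, different parts occupy different phases, and two attributes count as conjoined only if their phases coincide.

```python
# Hypothetical sketch of binding by temporal synchrony, loosely inspired by the
# abstract above. Units that fire in the same phase are treated as bound; when
# there are more parts than available phase slots, phases collide, giving the
# "accidental synchrony" the abstract mentions.

N_PHASE_SLOTS = 4  # assumed number of distinct phase sets available

def assign_phases(parts):
    """Give each part its own phase; all of that part's attribute units fire in it."""
    bindings = {}
    for i, (part, attributes) in enumerate(parts.items()):
        phase = i % N_PHASE_SLOTS  # collisions occur once parts exceed phase slots
        for attr in attributes:
            bindings[(part, attr)] = phase
    return bindings

def bound_together(bindings, unit_a, unit_b):
    """Two attribute units are conjoined iff they fire in synchrony (same phase)."""
    return bindings[unit_a] == bindings[unit_b]

# Toy example: a mug described as two geons with made-up shape attributes.
parts = {
    "body":   ["curved-cross-section", "vertical-axis", "large"],
    "handle": ["curved-axis", "attached-to-side", "small"],
}
phases = assign_phases(parts)
print(bound_together(phases, ("body", "curved-cross-section"), ("body", "vertical-axis")))  # True
print(bound_together(phases, ("body", "vertical-axis"), ("handle", "curved-axis")))         # False
```

Because each binding is just a transient phase assignment rather than a dedicated conjunction unit, the same attribute units can be reused across objects, which is the economy of representation the abstract refers to.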
