Abstract

An information-based design principle is presented that provides a framework for the design of both parallel and sequential algorithms. In this presentation, the notions of information (data) organization and canonical separation are examined and used in the design of an iterative line method for pattern grouping. In addition, this technique is compared to the Winner Take All (WTA) method and shown to have many advantages.
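As a point of reference for that comparison, here is a minimal sketch of a generic winner-take-all grouping step (an illustration of the standard WTA rule, not the paper's specific formulation):

```python
import numpy as np

def winner_take_all(weights, x):
    """Generic WTA rule: assign pattern x to the single unit whose
    weight vector responds most strongly. Illustrative only."""
    activations = weights @ x           # one activation per unit
    return int(np.argmax(activations))  # index of the winning unit

# Example: three units competing for a 2-D pattern
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.7, 0.7]])
print(winner_take_all(W, np.array([0.9, 0.8])))  # -> 2
```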

Highlights

  • Information is the basic building block of all processes, whether biological or physical in nature

  • Although the neural network community has moved quite far from the early anticipation that the science of neural networks might solve the fascinating mystery of the brain's functional operation, the introduction of artificial neural networks (ANNs) into the science of optimization techniques has had a serious impact on the solution of intractable problems

  • The information operator associated with the perceptron-learning algorithm is separated into two independent components and used in a non-adaptive formulation that defines an ANN architecture with an unambiguous number of nodes in its translation and rotation layers (see the sketch after this list)
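
A minimal sketch of the last highlight, assuming the standard perceptron update rule (the split into a weight-direction "rotation" part and a bias "translation" part is an illustrative reading, not the paper's exact operator):

```python
import numpy as np

def perceptron_step(w, b, x, y, lr=1.0):
    """One standard perceptron update on a misclassified pattern,
    written so the two components named in the highlight are explicit
    (illustrative reading, not the paper's exact decomposition):
      - rotation:    w tilts toward (y=+1) or away from (y=-1) x
      - translation: b shifts the separating hyperplane
    """
    if y * (w @ x + b) <= 0:      # pattern is misclassified
        w = w + lr * y * x        # rotation component
        b = b + lr * y            # translation component
    return w, b
```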

Summary

INTRODUCTION

Information is the basic building block of all processes, whether biological or physical in nature. Although there have been some disagreements [2,3] about the contribution of information-based complexity (IBC) from the point of view of some in the numerical analysis (NA) community, IBC introduces the notion of information operators, where partial information is derived and used by a computation (an algorithm A that defines the information-based solution method) to solve a problem. Within the context of the IBC representation, the introduction of the information operator and information operations represents a novel and attractive approach to algorithm analysis and design in general, and speaks to a broader possible application than originally intended. We explore this question, and in so doing provide an example where the analysis of information flow, or the use of information operators placed in the form of a canonically mapped information flow, may yield better algorithmic designs.
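The following is a textbook-style IBC illustration (not taken from the paper): the information operator N extracts only partial information about the problem element f, and the algorithm A must produce its answer from that partial information alone.

```python
import numpy as np

def N(f, n):
    """Information operator: n point evaluations of f on [0, 1]."""
    nodes = (np.arange(n) + 0.5) / n
    return f(nodes)

def A(info):
    """Algorithm: approximate the integral of f over [0, 1] using
    only the partial information supplied by N (midpoint rule)."""
    return info.mean()

# The computation sees f only through the information operator N.
print(A(N(np.sin, 100)))  # ~ 1 - cos(1) = 0.4597...
```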

CANONICAL INFORMATION FLOW
BACKGROUND
PERCEPTRON
CANONICAL PERCEPTRON MODEL
PARALLEL ARCHITECTURE
ORTHOGONALITY
HIGHER-DIMENSIONAL PROPERTIES
ARCHITECTURAL STRUCTURE
CONCLUSION
REFERENCES