Problems that seem to be encountered, time and time again, by the practicing OR analyst include: (i) resource allocation (i.e. continuous and discrete optimization); (ii) classification (i.e. pattern recognition, or discriminant analysis); (iii) prediction/estimation; and (iv) clustering. When faced with such problems, the OR analyst has a number of conventional methods to choose from, such as linear programming, discrete optimization, statistically based discriminant analysis, regression, and cluster analysis. In this paper we will address yet another alternative: the employment of neural networks. While certainly no panacea, the neural network approach may, in certain instances, offer advantages ranging from minor to substantial. Further, a neural network approach permits solution by means of parallel processing, thus providing certain unique and significant advantages inherent to distributed computing. As such, the OR analyst who remains unfamiliar with this approach cannot, we believe, consider himself or herself fully prepared for the most effective solution of a variety of problems, both now and in the future. In this paper, we shall introduce the neural network approach from an OR perspective, and indicate just where and how such a tool might find application.