Abstract

This paper presents a study of timeout handling in distributed processing systems. The timeout interval is calculated online and adapted to varying system conditions, and decision making after a timeout condition is improved. The proposed environment supports both centralized and decentralized client-server systems. The work comprises two phases. In the first phase, a multiple regression model is built to estimate the parameters of the regression equations for timeouts corresponding to the different phases of a client-server scenario: probing, execution, and termination. In the second phase, timeout intervals are calculated online from current state information and the pre-calculated regression parameters. Decision making at a timeout instant applies Bayesian decision theory (decision making under risk), with the state information used to dynamically calculate posterior state probabilities as well as payoff values for the different actions and states of the scenario. A simulation study shows that better decisions are reached through continual dynamic calculation of system parameters. The approach is practical and can be applied in any time-critical or decision-based application.
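As an illustration of the two-phase approach described above, the minimal Python sketch below pairs an offline multiple-regression fit with an online timeout calculation and a Bayesian expected-payoff decision at a timeout instant. The feature set (load, network delay), the training data, the priors, likelihoods, and payoff values are all hypothetical placeholders chosen for the example, not values from the paper.

```python
import numpy as np

# ---- Phase 1 (offline): fit a multiple regression model for one scenario phase ----
# Hypothetical training data: rows are observed (load, network_delay) conditions,
# y holds measured response times (ms) for one phase (e.g. "execution").
X_train = np.array([[0.2, 10.0], [0.5, 25.0], [0.8, 40.0], [0.4, 15.0], [0.9, 60.0]])
y_train = np.array([120.0, 210.0, 340.0, 170.0, 480.0])

# Ordinary least squares with an intercept term gives the regression parameters.
A = np.column_stack([np.ones(len(X_train)), X_train])
beta, *_ = np.linalg.lstsq(A, y_train, rcond=None)

def timeout_interval(state, beta, margin=1.5):
    """Phase 2 (online): predict the response time for the current state
    and scale it by a safety margin to obtain the timeout interval."""
    load, delay = state
    predicted = beta[0] + beta[1] * load + beta[2] * delay
    return margin * predicted

# ---- Decision making at a timeout instant (Bayesian decision under risk) ----
# Hypothetical states and actions; in the proposed system the probabilities and
# payoffs are recalculated dynamically from monitored state information.
states = ["slow_but_alive", "failed"]
actions = ["wait", "retransmit", "abort"]

prior = np.array([0.7, 0.3])        # prior state probabilities
likelihood = np.array([0.4, 0.9])   # P(timeout observed | state)

# Posterior state probabilities via Bayes' rule.
posterior = prior * likelihood
posterior /= posterior.sum()

# Payoff matrix: payoff[action][state] (placeholder values).
payoff = np.array([
    [ 8.0, -10.0],   # wait
    [ 5.0,   2.0],   # retransmit
    [-4.0,   6.0],   # abort
])

expected_payoff = payoff @ posterior
best_action = actions[int(np.argmax(expected_payoff))]

current_state = (0.6, 30.0)
print("timeout interval (ms):", round(timeout_interval(current_state, beta), 1))
print("posterior:", dict(zip(states, np.round(posterior, 3))))
print("chosen action:", best_action)
```

The sketch fixes the priors, likelihoods, and payoff matrix only to stay self-contained; in the approach described in the abstract these quantities are recomputed online from the current state information at each timeout instant.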
