John S. Baras

1981

Stochastic Control of Two Partially Observed Competing Queues

J.S. Baras and A.J. Dorsey

IEEE Transactions on Automatic Control, AC-26, Special Issue in honor of R. Bellman, pp. 1105-1117, October 1981.

Full Text Paper (PDF)

Abstract

We consider the dynamic control of two queues competing for the services of one server. The problem is to design a server time-allocation strategy when the sizes of the queues are not observable. The performance criterion is the total expected aggregate delay. The server is assumed to observe arrivals but not departures.

This problem is formulated as a stochastic optimal control problem with partial observations. The framework we adopt is that of stochastic control in discrete time with a countable "state space." The observations are modeled as discrete-time, 0-1 point processes whose rates are influenced by a Markov chain. Examples from the computer control of urban traffic are given to illustrate the practical motivation behind the present work and to relate it to our earlier work on the subject. A particular feature of the formulation is that the observations are influenced by transitions of the state of the Markov chain. The classical tools of elementary Bayes' rule and dynamic programming suffice for the analysis. In particular, we show that the "one-step" predicted density for the state of the Markov chain, given the point-process observations, is a sufficient statistic for control.
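
As a rough indication of the kind of recursion involved (the notation below is ours, not the paper's), suppose the controlled Markov chain $x_k$ has transition probabilities $p_{ij}(u)$ and the 0-1 observation attached to the transition from $x_k=i$ to $x_{k+1}=j$ under control $u_k=u$ has conditional law $\rho_{ij}(y;u)=\Pr(y_{k+1}=y \mid x_k=i,\,x_{k+1}=j,\,u_k=u)$. A conditional density $\alpha_k(i)=\Pr(x_k=i \mid y_1,\dots,y_k)$ then propagates by Bayes' rule as

$$
\alpha_{k+1}(j) \;=\; \frac{\sum_{i}\alpha_k(i)\,p_{ij}(u_k)\,\rho_{ij}(y_{k+1};u_k)}{\sum_{i,\ell}\alpha_k(i)\,p_{i\ell}(u_k)\,\rho_{i\ell}(y_{k+1};u_k)},
$$

a recursion of the same family as the one-step predicted density that the paper identifies as a sufficient statistic for control.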

This framework is then applied to the specific problem of two queues competing for the services of one server. We obtain explicit solutions for the finite-time expected aggregate delay problem. The implications of these results for practical applications, as well as implementation aspects of the resulting optimal control laws, are discussed.
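
Purely as an illustration of the machinery described above, and not of the paper's explicit solutions, the following Python sketch builds a toy version of the problem: Bernoulli arrivals that are observed, Bernoulli service completions that are not, and finite queue capacities. It propagates the conditional density of the two queue lengths by Bayes' rule and runs a short-horizon dynamic program over that information state to decide which queue to serve; the parameters, the capacity truncation, and the cumulative queue-length cost used as a proxy for aggregate delay are all assumptions of the sketch.

```python
# Illustrative sketch only -- the model, parameters, and cost below are assumptions
# of this example, not the model or the optimal control law of the paper.
#
# Two finite-capacity queues share one server. In each slot, queue i receives an
# arrival with probability LAM[i] (arrivals are observed; an arrival to a full
# queue is lost), and the queue being served completes a service with probability
# MU[i] (departures are unobserved). The hidden state is the pair of queue
# lengths; its conditional density is propagated by Bayes' rule from the observed
# arrival indicators, and a short-horizon dynamic program over that information
# state picks which queue to serve so as to minimize the cumulative expected
# queue length (a proxy for aggregate delay).
import itertools

CAP = 3                              # queue capacity (illustrative truncation)
LAM = (0.4, 0.3)                     # per-slot arrival probabilities (assumed)
MU = (0.8, 0.6)                      # per-slot service probabilities (assumed)
STATES = list(itertools.product(range(CAP + 1), repeat=2))   # (n1, n2)


def step_prob(state, serve, arrivals, nxt):
    """P(next lengths = nxt, observed arrivals | current lengths, served queue)."""
    prob = 1.0
    for i in range(2):
        n = state[i]
        # Departures (unobserved): only the served, non-empty queue can lose one.
        dep = {1: MU[i], 0: 1.0 - MU[i]} if (i == serve and n > 0) else {0: 1.0}
        # Arrivals (observed): a full queue cannot admit a new customer.
        p_arrive = LAM[i] if n < CAP else 0.0
        p_obs = p_arrive if arrivals[i] == 1 else 1.0 - p_arrive
        p_move = sum(pd for d, pd in dep.items() if n - d + arrivals[i] == nxt[i])
        prob *= p_obs * p_move
    return prob


def belief_update(belief, serve, arrivals):
    """Bayes-rule update of the density over queue lengths given observed arrivals."""
    joint = {nxt: sum(belief[s] * step_prob(s, serve, arrivals, nxt) for s in STATES)
             for nxt in STATES}
    z = sum(joint.values())          # probability of this observation
    if z == 0.0:
        return None, 0.0             # observation impossible under current belief
    return {nxt: p / z for nxt, p in joint.items()}, z


def dp(belief, horizon):
    """Finite-horizon DP over the information state; returns (cost, first action)."""
    if horizon == 0:
        return 0.0, None
    holding = sum(p * (s[0] + s[1]) for s, p in belief.items())   # delay proxy
    best_cost, best_act = float("inf"), None
    for serve in (0, 1):
        future = 0.0
        for arrivals in itertools.product((0, 1), repeat=2):
            nb, z = belief_update(belief, serve, arrivals)
            if z > 0.0:
                future += z * dp(nb, horizon - 1)[0]
        if future < best_cost:
            best_cost, best_act = future, serve
    return holding + best_cost, best_act


if __name__ == "__main__":
    prior = {s: 1.0 / len(STATES) for s in STATES}   # uniform prior over lengths
    cost, act = dp(prior, horizon=4)
    print(f"serve queue {act + 1} first; 4-slot expected queue-length cost = {cost:.3f}")
```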
