A time delay model for load balancing with processor resource constraints

Zhong Tang, J. Douglas Birdwell, John Chiasson, Chaouki T. Abdallah, Majeed M. Hayat

Research output: Contribution to journal › Conference article › peer-review


Abstract

A deterministic, dynamic, nonlinear time-delay system is developed to model load balancing in a cluster of computer nodes used for parallel computations. This model refines a model previously proposed by the authors by accounting for the fact that the load balancing operation consumes processor time that cannot then be used to process tasks. Consequently, there is a trade-off between the processor time and network bandwidth consumed by load balancing and the benefit of distributing the load evenly among the nodes to reduce overall processing time. The new model is shown to be self-consistent in that the queue lengths cannot become negative and the total number of tasks in all the queues is conserved (i.e., load balancing can neither create nor lose tasks). It is shown that the proposed model is (Lyapunov) stable for any input, but not necessarily asymptotically stable. Experimental results are presented and compared with predictions from the analytical model. In particular, simulations of the models are compared with an experimental implementation of the load balancing algorithm on a parallel computer network.
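
The paper's model is a continuous-time nonlinear delay-differential system; the exact equations are given in the article. As a rough illustration of the dynamics the abstract describes, the following discrete-time Python sketch simulates delayed load balancing among a few nodes. The node count, delay, balancing gain, processing rate, and initial queues are hypothetical values chosen for the sketch, and the processor-time cost of the balancing operation itself (the paper's central refinement) is not modeled.

```python
# Illustrative discrete-time sketch of delayed load balancing.
# This is NOT the authors' continuous-time model; all parameters below
# (node count, delay, gain, processing rate, initial queues) are assumptions.
import numpy as np

N = 3                 # number of compute nodes (assumed)
delay = 4             # communication delay, in time steps (assumed)
gain = 0.3            # fraction of local excess transferred per step (assumed)
mu = 1.0              # tasks processed per node per step (assumed)
steps = 200

q = np.array([300.0, 60.0, 0.0])                # initial queue lengths (assumed)
history = [q.copy() for _ in range(delay + 1)]  # delayed queue observations
in_transit = []                                 # (arrival_step, dest, amount)

for t in range(steps):
    delayed_q = history[0]          # each node acts on delay-old information
    avg = delayed_q.mean()

    # Local processing: queue lengths cannot become negative.
    new_q = np.maximum(q - mu, 0.0)

    # Load balancing: a node above the (delayed) average ships a fraction of
    # its excess to the node it believes is least loaded.
    for i in range(N):
        excess = delayed_q[i] - avg
        if excess > 0.0:
            amount = min(gain * excess, new_q[i])   # cannot send more than held
            new_q[i] -= amount
            dest = int(np.argmin(delayed_q))
            in_transit.append((t + delay, dest, amount))

    # Deliver tasks whose network transfer has completed.
    arrived = [x for x in in_transit if x[0] == t]
    in_transit = [x for x in in_transit if x[0] != t]
    for _, dest, amount in arrived:
        new_q[dest] += amount

    q = new_q
    history.append(q.copy())
    history.pop(0)

# Balancing neither creates nor destroys tasks: every task is in a queue,
# in transit, or has been processed. Queues stay non-negative throughout.
print("final queues:", np.round(q, 2),
      "total in transit:", round(sum(a for _, _, a in in_transit), 2))
```

In this sketch, tasks leave the system only through processing; the transfers themselves conserve the total task count and the clipping keeps queues non-negative, which mirrors the self-consistency properties stated in the abstract.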

Original language: English
Article number: FrA03.1
Pages (from-to): 4193-4198
Number of pages: 6
Journal: Proceedings of the IEEE Conference on Decision and Control
Volume: 4
State: Published - 2004
Event: 2004 43rd IEEE Conference on Decision and Control (CDC) - Nassau, Bahamas
Duration: 14 Dec 2004 - 17 Dec 2004

Keywords

  • Load balancing
  • Networks
  • Time delays
