Linear time delay model for studying load balancing instabilities in parallel computations

C. T. Abdallah, N. Alluri, J. D. Birdwell, J. Chiasson, V. Chupryna, Z. Tang, T. Wang

Research output: Contribution to journal › Article › peer-review

23 Scopus citations

Abstract

A linear time-delay system is proposed to model load balancing in a cluster of computer nodes used for parallel computations. The linear model is analysed for stability in terms of the delays in the transfer of information between nodes and the gains in the load balancing algorithm. This model is compared with an experimental implementation of the algorithm on a parallel computer network.
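The stability trade-off described above (between the feedback gains of the load balancing algorithm and the delays in inter-node communication) can be illustrated with a toy simulation. The following is a minimal sketch, not the paper's model: it uses a hypothetical discrete-time loop in which each node observes the other nodes' queue lengths with a fixed delay and sheds a fraction (the gain) of its load in excess of that delayed average; all parameter values are invented for illustration.

```python
import numpy as np

def simulate(n_nodes=3, gain=0.2, delay=2, steps=200, seed=0):
    """Toy delayed load-balancing loop (illustrative only).

    Each node compares its current queue length with the network
    average as observed `delay` steps ago, sheds `gain` times its
    excess, and the shed work is redistributed evenly. Total load
    is conserved at every step.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(50.0, 150.0, size=n_nodes)    # initial queue lengths
    hist = [x.copy() for _ in range(delay + 1)]   # buffer of past states
    for _ in range(steps):
        delayed = hist[0]                         # loads seen `delay` steps ago
        avg = delayed.mean()
        excess = np.maximum(x - avg, 0.0)         # work above the delayed average
        shed = gain * excess
        # shed work leaves each overloaded node and is spread evenly
        x = x - shed + shed.sum() / n_nodes
        hist.pop(0)
        hist.append(x.copy())
    return x

final = simulate()
```

With a small gain the loads converge toward the common average; raising the gain while the delay is large makes nodes react to stale information and can drive the kind of oscillatory instability the paper analyses in the linear setting.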

Original language: English
Pages (from-to): 563-573
Number of pages: 11
Journal: International Journal of Systems Science
Volume: 34
Issue number: 10-11
State: Published - 15 Aug 2003

