Optimal ambulance dispatching

Author(s)
Jagtenberg, C.J.; Bhulai, S.; Mei, R.D. van der
Year

This chapter considers the ambulance dispatch problem, in which one must decide in real time which ambulance to send to an incident. In practice as well as in the literature, it is commonly believed that the closest idle ambulance is the best choice. This chapter describes alternatives to the classical closest-idle rule. The first method is based on a Markov decision problem (MDP), which constitutes the first known MDP model for ambulance dispatching. Moreover, in the broader field of dynamic ambulance management, this is the first MDP that captures more than just the number of idle vehicles while remaining computationally tractable for reasonably sized ambulance fleets. The authors analyze the policy obtained from this MDP and transform it into a heuristic for ambulance dispatching that can handle the real-time situation more accurately than the MDP states can describe. They evaluate the policies by simulating a realistic emergency medical services region in the Netherlands. For this region, they show that the heuristic reduces the fraction of late arrivals by 13% compared to the 'closest idle' benchmark policy. This result sheds new light on the popular belief that deviating from the closest-idle dispatch policy cannot greatly improve the objective.
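To illustrate the kind of decision rule the chapter contrasts, the sketch below shows the closest-idle benchmark next to a generic value-based dispatch heuristic. This is not the authors' code or their cost formula; the names (Ambulance, travel_time, marginal_cost) and the Euclidean travel-time proxy are assumptions made for illustration only.

```python
# Illustrative sketch, not the chapter's implementation: the 'closest idle'
# benchmark rule versus a generic MDP-style heuristic that also accounts for
# the expected future cost of removing an ambulance from its coverage area.
from dataclasses import dataclass

@dataclass
class Ambulance:
    id: int
    location: tuple   # (x, y) coordinates of the base or current position
    idle: bool

def travel_time(origin, destination):
    """Placeholder travel-time estimate (Euclidean distance as a proxy)."""
    return ((origin[0] - destination[0]) ** 2 + (origin[1] - destination[1]) ** 2) ** 0.5

def closest_idle_dispatch(ambulances, incident):
    """Benchmark policy: send the idle ambulance with the shortest travel time."""
    idle = [a for a in ambulances if a.idle]
    return min(idle, key=lambda a: travel_time(a.location, incident), default=None)

def value_based_dispatch(ambulances, incident, marginal_cost):
    """Generic value-based heuristic: trade off the immediate response time against
    an estimate of the future cost (e.g. expected late arrivals) of committing this
    ambulance. `marginal_cost` is a hypothetical user-supplied function, not the
    chapter's formula."""
    idle = [a for a in ambulances if a.idle]
    return min(
        idle,
        key=lambda a: travel_time(a.location, incident) + marginal_cost(a),
        default=None,
    )
```

Under this sketch, the two policies coincide whenever the marginal-cost term is constant across ambulances; they differ exactly when sending the closest vehicle would leave a busy area poorly covered.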

Pages
269-291
Published in
Markov decision processes in practice
Series
International Series in Operations Research & Management Science ; 248
ISBN
978-3-319-47764-0 (hbk) / 978-3-319-83817-5 (pbk)
ISSN
0884-8289
Library number
20220236 ST [electronic version only]
Published by
Springer Cham

Our collection

This publication belongs to the other publications that we hold in our collection alongside the SWOV publications.