Optimal ambulance dispatching

Author(s)
Jagtenberg, C.J.; Bhulai, S.; Mei, R.D. van der
Year

This chapter considers the ambulance dispatch problem, in which one must decide in real time which ambulance to send to an incident. In practice as well as in the literature, it is commonly believed that the closest idle ambulance is the best choice. This chapter describes alternatives to the classical closest-idle dispatch rule. The first method is based on a Markov decision problem (MDP), which constitutes the first known MDP model for ambulance dispatching. Moreover, in the broader field of dynamic ambulance management, this is the first MDP that captures more than just the number of idle vehicles while remaining computationally tractable for reasonably sized ambulance fleets. The authors analyze the policy obtained from this MDP and transform it into a heuristic for ambulance dispatching that can handle the real-time situation more accurately than the MDP states can describe. They evaluate their policies by simulating a realistic emergency medical services region in the Netherlands. For this region, they show that their heuristic reduces the fraction of late arrivals by 13% compared to the 'closest idle' benchmark policy. This result sheds new light on the popular belief that deviating from the closest-idle dispatch policy cannot greatly improve the objective.
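The chapter's MDP model and the heuristic derived from it are not reproduced here. For reference, the short Python sketch below only illustrates the 'closest idle' benchmark rule that those policies are compared against, under simplifying assumptions (straight-line travel times at a fixed speed); the names Ambulance, travel_time and closest_idle_dispatch are hypothetical and not taken from the chapter.

# Illustrative sketch of the 'closest idle' benchmark dispatch rule only;
# the chapter's MDP-based policy and heuristic are not shown here.
from dataclasses import dataclass
import math

@dataclass
class Ambulance:
    ident: int
    x: float      # current coordinates (km); hypothetical fields
    y: float
    idle: bool

def travel_time(amb: Ambulance, incident: tuple[float, float], speed_kmh: float = 80.0) -> float:
    """Straight-line travel time in minutes (a crude stand-in for real road travel times)."""
    dist = math.hypot(amb.x - incident[0], amb.y - incident[1])
    return dist / speed_kmh * 60.0

def closest_idle_dispatch(fleet: list[Ambulance], incident: tuple[float, float]) -> Ambulance | None:
    """Benchmark policy: send the idle ambulance with the shortest travel time to the incident."""
    idle = [a for a in fleet if a.idle]
    if not idle:
        return None  # in practice the call would be queued until an ambulance frees up
    return min(idle, key=lambda a: travel_time(a, incident))

# Example: three ambulances, one of which is busy, and an incident at (4, 4)
fleet = [Ambulance(1, 0.0, 0.0, True), Ambulance(2, 5.0, 5.0, True), Ambulance(3, 1.0, 1.0, False)]
chosen = closest_idle_dispatch(fleet, (4.0, 4.0))
print(chosen.ident if chosen else "no idle ambulance")  # prints 2

The policies studied in the chapter replace the min-travel-time selection above with a choice informed by the MDP's value of the resulting system state, which is what allows them to deviate from the closest idle vehicle when that improves the fraction of on-time arrivals.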

Pages
269-291
Published in
Markov decision processes in practice
Series
International Series in Operations Research & Management Science ; 248
ISBN
978-3-319-47764-0 (hbk) / 978-3-319-83817-5 (pbk)
ISSN
0884-8289
Library number
20220236 ST [electronic version only]
Publisher
Springer Cham

Our collection

This publication is one of our other publications and part of our extensive collection of road safety literature, which also includes the SWOV publications.