Optimal ambulance dispatching

Research output: Chapter in Book/Report/Conference proceeding › Chapter › Academic › peer-review

Abstract

This chapter considers the ambulance dispatch problem, in which one must decide in real time which ambulance to send to an incident. In practice as well as in the literature, it is commonly believed that the closest idle ambulance is the best choice. This chapter describes alternatives to the classical closest-idle ambulance rule. Our first method is based on a Markov decision problem (MDP), which constitutes the first known MDP model for ambulance dispatching. Moreover, in the broader field of dynamic ambulance management, this is the first MDP that captures more than just the number of idle vehicles while remaining computationally tractable for reasonably sized ambulance fleets. We analyze the policy obtained from this MDP and transform it into a heuristic for ambulance dispatching that can handle the real-time situation more accurately than our MDP states can describe. We evaluate our policies by simulating a realistic emergency medical services region in the Netherlands. For this region, we show that our heuristic reduces the fraction of late arrivals by 13% compared to the “closest idle” benchmark policy. This result sheds new light on the popular belief that deviating from the closest-idle dispatch policy cannot greatly improve the objective.
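As a rough, informal illustration of why deviating from the closest-idle rule can pay off, the sketch below contrasts a closest-idle dispatcher with a toy lookahead that also weighs the coverage left for future incidents. All names (closest_idle, lookahead_dispatch), the Euclidean travel-time proxy, the demand points, and the weight parameter are hypothetical assumptions for this example; they do not reproduce the MDP model or heuristic described in the chapter.

    # Illustrative sketch only: toy dispatch rules, not the chapter's model.
    from math import hypot

    def travel_time(ambulance_pos, incident_pos, speed=1.0):
        """Euclidean travel-time proxy for this toy example."""
        return hypot(ambulance_pos[0] - incident_pos[0],
                     ambulance_pos[1] - incident_pos[1]) / speed

    def closest_idle(idle_ambulances, incident):
        """Benchmark policy: dispatch the idle ambulance nearest to the incident."""
        return min(idle_ambulances, key=lambda a: travel_time(a, incident))

    def coverage_loss(idle_ambulances, dispatched, demand_points):
        """Sum of distances from demand points to the nearest remaining idle
        ambulance (lower is better); a crude stand-in for future performance."""
        remaining = [a for a in idle_ambulances if a != dispatched]
        if not remaining:
            return float("inf")
        return sum(min(travel_time(a, d) for a in remaining) for d in demand_points)

    def lookahead_dispatch(idle_ambulances, incident, demand_points, weight=1.0):
        """Toy alternative: trade off immediate response time against lost coverage."""
        def score(a):
            return (travel_time(a, incident)
                    + weight * coverage_loss(idle_ambulances, a, demand_points))
        return min(idle_ambulances, key=score)

    if __name__ == "__main__":
        idle = [(0.0, 0.0), (4.0, 0.0)]
        incident = (3.0, 0.0)
        demand = [(5.0, 0.0)]
        print("closest idle:", closest_idle(idle, incident))        # (4.0, 0.0)
        print("lookahead   :", lookahead_dispatch(idle, incident, demand))  # (0.0, 0.0)

In this contrived instance the lookahead rule sends the farther ambulance because dispatching the nearest one would leave a high-demand area poorly covered, which is the kind of trade-off the chapter's MDP-based policies exploit in a far more principled way.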
Original language: English
Title of host publication: Markov Decision Processes in Practice
Editors: Nico van Dijk, Richard Boucherie
Publisher: Springer-Verlag
Pages: 269-291
Number of pages: 23
ISBN (Electronic): 978-3-319-47766-4
ISBN (Print): 978-3-319-47764-0
DOIs
Publication status: Published - 2017
