Research output per year
Ralf Josef Johanna Beerens, Henrik Tehler, Ben Pelzer
Research output: Contribution to journal › Article › peer-review
The evaluation of simulated disasters (for example, exercises) and real responses is an important activity. However, little attention has been paid to how reports documenting such events should be written. A key issue is how to make them as useful as possible to professionals working in disaster risk management. Here, we focus on three aspects of a written evaluation: how the object of the evaluation is described, how the analysis is described, and how the conclusions are described. In this empirical experiment, based on real evaluation documents, 84 Dutch mayors and crisis management professionals were asked to rate the perceived usefulness of the three aspects noted above. The results showed that how evaluations are written does matter. Specifically, the usefulness of an evaluation intended for learning purposes improves when its analysis and conclusions are clearer. In contrast, evaluations used for accountability purposes are improved only by the clarity of the conclusions. These findings have implications for the way disaster management evaluations should be documented.
Original language | English |
---|---|
Pages (from-to) | 578-591 |
Number of pages | 14 |
Journal | International Journal of Disaster Risk Science |
Volume | 11 |
Issue number | 5 |
Early online date | 2020 Jul 10 |
DOIs | |
Publication status | Published - 2020 Oct |