Reporting guidelines
This page evaluates the extent to which the journal article meets the criteria from two reporting guidelines for discrete-event simulation (DES) studies:
- Monks et al. (2019) - STRESS-DES: Strengthening The Reporting of Empirical Simulation Studies (Discrete-Event Simulation) (Version 1.0).
- Zhang, Lhachimi, and Rogowski (2020) - The generic reporting checklist for healthcare-related discrete event simulation studies, derived from the International Society for Pharmacoeconomics and Outcomes Research and Society for Medical Decision Making (ISPOR-SDM) Modeling Good Research Practices Task Force reports.
STRESS-DES
Of the 24 items in the checklist:
- 18 were met fully (✅)
- 2 were partially met (🟡)
- 3 were not met (❌)
- 1 was not applicable (N/A)
Item | Recommendation | Met by study? | Evidence |
---|---|---|---|
Objectives | |||
1.1 Purpose of the model | Explain the background and objectives for the model | ✅ Fully | 1 Introduction and 2 Background - e.g. “catastrophic public health emergency”… “Fast, efficient and large-scale dispensing of such critical medical countermeasures”… “Points of Dispensing (PODs)”… “how to develop staffing plans for each individual POD” (Hernandez et al. 2015) |
1.2 Model outputs | Define all quantitative performance measures that are reported, using equations where necessary. Specify how and when they are calculated during the model run along with how any measures of error such as confidence intervals are calculated. | ✅ Fully | Outputs are throughput, waiting time and number of staff members. Details are given in 4.4 Mathematical programming representation - “The total waiting time for the Designees is the sum of all the waiting time of all the stations they had to visit before exiting the POD. After the simulation concludes we obtain the throughput in term of forms per hour, the average waiting time of the Designees and the total staff that was allocated.” (Hernandez et al. 2015) |
1.3 Experimentation aims | If the model has been used for experimentation, state the objectives that it was used to investigate. (A) Scenario based analysis – Provide a name and description for each scenario, providing a rationale for the choice of scenarios and ensure that item 2.3 (below) is completed. (B) Design of experiments – Provide details of the overall design of the experiments with reference to performance measures and their parameters (provide further details in data below). (C) Simulation Optimisation – (if appropriate) Provide full details of what is to be optimised, the parameters that were included and the algorithm(s) that were used. Where possible provide a citation of the algorithm(s). | ✅ Fully | Experiments 1 to 5, described throughout 5. Experimental results. |
Logic | |||
2.1 Base model overview diagram | Describe the base model using appropriate diagrams and description. This could include one or more process flow, activity cycle or equivalent diagrams sufficient to describe the model to readers. Avoid complicated diagrams in the main text. The goal is to describe the breadth and depth of the model with respect to the system being studied. | ✅ Fully | Figure 4 |
2.2 Base model logic | Give details of the base model logic. Give additional model logic details sufficient to communicate to the reader how the model works. | ✅ Fully | Figure 4 and 2.1 PODs - e.g. “…Upon arrival at the POD, the Designee is received by a Line Manager who reviews forms to determine if they are Pre-Screened. If they are Pre-Screened, the Line Manager directs the Designee to the Express Line…” (Hernandez et al. 2015) |
2.3 Scenario logic | Give details of the logical difference between the base case model and scenarios (if any). This could be incorporated as text or where differences are substantial could be incorporated in the same manner as 2.2. | ✅ Fully | Experiments 1 to 5, described throughout 5. Experimental results |
2.4 Algorithms | Provide further detail on any algorithms in the model that (for example) mimic complex or manual processes in the real world (i.e. scheduling of arrivals/ appointments/ operations/ maintenance, operation of a conveyor system, machine breakdowns, etc.). Sufficient detail should be included (or referred to in other published work) for the algorithms to be reproducible. Pseudo-code may be used to describe an algorithm. | ✅ Fully | Several algorithms mentioned relate to optimisation performed using evolutionary algorithms prior to the DES. However, DES algorithms are also mentioned - e.g. the arrival rate following a Poisson distribution in 4.1.4 Arrival rate (see the sketch after this table). |
2.5.1 Components - entities | Give details of all entities within the simulation including a description of their role in the model and a description of all their attributes. | ✅ Fully | 4.4 Mathematical programming representation - “The Designees are the entities of the DES.” 2.1 PODs - “A Designee is a member of the public that arrives at a POD with one to six antibiotic screening forms representing him or herself and up to five other individuals” (Hernandez et al. 2015) |
2.5.2 Components - activities | Describe the activities that entities engage in within the model. Provide details of entity routing into and out of the activity. | ✅ Fully | Implicit in Figure 4 and 2.1 PODs |
2.5.3 Components - resources | List all the resources included within the model and which activities make use of them. | ✅ Fully | Implicit in Figure 4 and 2.1 PODs |
2.5.4 Components - queues | Give details of the assumed queuing discipline used in the model (e.g. First in First Out, Last in First Out, prioritisation, etc.). Where one or more queues have a different discipline from the rest, provide a list of queues, indicating the queuing discipline used for each. If reneging, balking or jockeying occur, etc., provide details of the rules. Detail any delays or capacity constraints on the queues. | ✅ Fully | As indicated in 4.4 Mathematical programming representation, entities queue for each station. No queueing discipline is mentioned, but first-in-first-out is the conventional default and it is clear that entities simply wait, so nothing is considered “missing” here. |
2.5.5 Components - entry/exit points | Give details of the model boundaries i.e. all arrival and exit points of entities. Detail the arrival mechanism (e.g. ‘thinning’ to mimic a non-homogenous Poisson process or balking) | ✅ Fully | Evident in Figure 4. |
Data | |||
3.1 Data sources | List and detail all data sources. Sources may include: • Interviews with stakeholders, • Samples of routinely collected data, • Prospectively collected samples for the purpose of the simulation study, • Public domain data published in either academic or organisational literature. Provide, where possible, the link and DOI to the data or reference to published literature. All data source descriptions should include details of the sample size, sample date ranges and use within the study. | ✅ Fully | Mentioned throughout 4.1 Inputs - for example… 4.1.1 Splits - “Research indicating proportion of the population with certain allergies and/or contraindications helped determine the percentage of people who may move through various paths in the POD”; 4.1.2 Number of forms - “The percentage of Designees that arrive with one to six forms was based on 2000 census data on household sizes in New York City (Census 2000 Public Use Microdata Sample 5% Sample files, NY and NJ)” (Hernandez et al. 2015) |
3.2 Pre-processing | Provide details of any data manipulation that has taken place before its use in the simulation, e.g. interpolation to account for missing data or the removal of outliers. | N/A | None mentioned and, as this cannot be determined from the paper, assumed to be not applicable. |
3.3 Input parameters | List all input variables in the model. Provide a description of their use and include parameter values. For stochastic inputs provide details of any continuous, discrete or empirical distributions used along with all associated parameters. Give details of all time dependent parameters and correlation. Clearly state: • Base case data • Data use in experimentation, where different from the base case. • Where optimisation or design of experiments has been used, state the range of values that parameters can take. • Where theoretical distributions are used, state how these were selected and prioritised above other candidate distributions. | ✅ Fully | 4.1 Inputs, Table 1, and Table 2 |
3.4 Assumptions | Where data or knowledge of the real system is unavailable what assumptions are included in the model? This might include parameter values, distributions or routing logic within the model. | ✅ Fully | 4.1.2 Number of forms - “we assume that no Designee picks up countermeasures for anyone outside of his/her household as defined by the US Census”; 4.6 Model assumptions - e.g. “It is assumed that POD staff do not make any mistakes filling out the forms or dispensing the MCM…” (Hernandez et al. 2015) |
Experimentation | |||
4.1 Initialisation | Report if the system modelled is terminating or non-terminating. State if a warm-up period has been used, its length and the analysis method used to select it. For terminating systems state the stopping condition. State what if any initial model conditions have been included, e.g., pre-loaded queues and activities. Report whether initialisation of these variables is deterministic or stochastic. | ❌ Not met | Not reported explicitly. The simulation is run for a fixed one-hour period, and it is not clear whether any warm-up period or initial conditions were used. |
4.2 Run length | Detail the run length of the simulation model and time units. | ✅ Fully | 4.3 Processing - “The Discrete Event Simulation time is set to an hour”, with “minutes” used throughout as the time unit |
4.3 Estimation approach | State the method used to account for the stochasticity: For example, two common methods are multiple replications or batch means. Where multiple replications have been used, state the number of replications and for batch means, indicate the batch length and whether the batch means procedure is standard, spaced or overlapping. For both procedures provide a justification for the methods used and the number of replications/size of batches. | ✅ Fully | 4.3 Processing - “three simulation runs to obtain reliable estimates”, with justification for the number given by Figure 10 from Experiment 5 (Hernandez et al. 2015); the replication approach is illustrated in the sketch after this table |
Implementation | |||
5.1 Software or programming language | State the operating system and version and build number. State the name, version and build number of commercial or open source DES software that the model is implemented in. State the name and version of general-purpose programming languages used (e.g. Python 3.5). Where frameworks and libraries have been used provide all details including version numbers. | 🟡 Partially | Doesn’t mention the operating system, but does give the other information in Figure 3 - Python 2.7 with inspyred 1.0 and SimPy 2.3.1, and R 2.15.3 with ggplot 0.9.3 |
5.2 Random sampling | State the algorithm used to generate random samples in the software/programming language used e.g. Mersenne Twister. If common random numbers are used, state how seeds (or random number streams) are distributed among sampling processes. | ❌ Not met | Not mentioned |
5.3 Model execution | State the event processing mechanism used e.g. three phase, event, activity, process interaction. Note that in some commercial software the event processing mechanism may not be published. In these cases authors should adhere to item 5.1 software recommendations. State all priority rules included if entities/activities compete for resources. If the model is parallel, distributed and/or uses grid or cloud computing, etc., state and preferably reference the technology used. For parallel and distributed simulations, state the time management algorithms used. If the HLA is used then state the version of the standard, which run-time infrastructure (and version), and any supporting documents (FOMs, etc.) | ❌ Not met | None of this appears to be mentioned |
5.4 System specification | State the model run time and specification of hardware used. This is particularly important for large scale models that require substantial computing power. For parallel, distributed and/or grid or cloud computing, etc. state the details of all systems used in the implementation (processors, network, etc.) | 🟡 Partially | Doesn’t describe hardware, but does mention some of the run times - 5.3 Experiment 3 - “The model with a population of 50 and 25 generations took 1.8 hours. The model with a population of 100 and 50 generations took 6.5 hours, whereas the one with a population of 200 and 100 generations took 27 hours.” (Hernandez et al. 2015) |
Code access | |||
6.1 Computer model sharing statement | Describe how someone could obtain the model described in the paper, the simulation software and any other associated software (or hardware) needed to reproduce the results. Provide, where possible, the link and DOIs to these. | ✅ Fully | 4.4 Mathematical programming representation - “The complete source code used to perform these experiments can be found on github (https://github.com/ivihernandez/staff-allocation).” (Hernandez et al. 2015) |
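To make some of the logic and estimation items above concrete, below is a minimal, hypothetical sketch of the kind of model the paper describes: Poisson arrivals, a first-in-first-out station queue served by a small staff pool, a fixed one-hour run, and three seeded replications (cf. items 2.4, 2.5.4, 4.2, 4.3 and 5.2). It is not the authors’ implementation - their code used Python 2.7 with SimPy 2.3.1 and is available at the GitHub link above - and it uses the modern SimPy 4 API with made-up parameter values.

```python
"""Minimal single-station POD sketch (not the authors' implementation).

Assumptions: SimPy 4 API; a single station; illustrative parameter
values. SimPy Resource queues are first-in-first-out by default.
"""
import random
import statistics

import simpy

ARRIVALS_PER_MIN = 5.0  # illustrative Poisson arrival rate
SERVICE_MIN = 0.5       # illustrative mean service time (minutes)
N_STAFF = 3             # illustrative number of staff at the station
RUN_MINUTES = 60        # paper: simulation time is set to an hour


def designee(env, station, waits, completed):
    """One entity: queue for the station, get served, exit."""
    arrived = env.now
    with station.request() as req:  # FIFO queue for a staff member
        yield req
        waits.append(env.now - arrived)  # waiting time before service
        yield env.timeout(random.expovariate(1.0 / SERVICE_MIN))
    completed.append(env.now)  # completions counted for throughput


def arrivals(env, station, waits, completed):
    """Poisson arrival process via exponential inter-arrival times."""
    while True:
        yield env.timeout(random.expovariate(ARRIVALS_PER_MIN))
        env.process(designee(env, station, waits, completed))


def replicate(seed):
    """Run one seeded replication; return (throughput, mean wait)."""
    random.seed(seed)  # one seed per replication
    env = simpy.Environment()
    station = simpy.Resource(env, capacity=N_STAFF)
    waits, completed = [], []
    env.process(arrivals(env, station, waits, completed))
    env.run(until=RUN_MINUTES)
    return len(completed), statistics.mean(waits)


for seed in range(3):  # paper: "three simulation runs to obtain reliable estimates"
    served, mean_wait = replicate(seed)
    print(f"rep {seed}: {served} served/hour, mean wait {mean_wait:.2f} min")
```

Each replication prints an hourly throughput and a mean waiting time - two of the three outputs (throughput, waiting time, staff count) that the paper reports after each simulation in 4.4.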
DES checklist derived from ISPOR-SDM
Of the 18 items in the checklist:
- 10 were met fully (✅)
- 7 were not met (❌)
- 1 was not applicable (N/A)
Item | Assessed if… | Met by study? | Evidence/location |
---|---|---|---|
Model conceptualisation | |||
1 Is the focused health-related decision problem clarified? | …the decision problem under investigation was defined. DES studies included different types of decision problems, eg, those listed in previously developed taxonomies. | ✅ Fully | 1 Introduction and 2 Background - e.g. “catastrophic public health emergency”… “Fast, efficient and large-scale dispensing of such critical medical countermeasures”… “Points of Dispensing (PODs)”… “how to develop staffing plans for each individual POD” (Hernandez et al. 2015) |
2 Is the modeled healthcare setting/health condition clarified? | …the physical context/scope (eg, a certain healthcare unit or a broader system) or disease spectrum simulated was described. | ✅ Fully | 1 Introduction and 2 Background - New York City |
3 Is the model structure described? | …the model’s conceptual structure was described in the form of either graphical or text presentation. | ✅ Fully | Figure 4 and 2.1 PODs - e.g. “…Upon arrival at the POD, the Designee is received by a Line Manager who reviews forms to determine if they are Pre-Screened. If they are Pre-Screened, the Line Manager directs the Designee to the Express Line…” (Hernandez et al. 2015) |
4 Is the time horizon given? | …the time period covered by the simulation was reported. | ✅ Fully | 4.3 Processing - “The Discrete Event Simulation time is set to an hour” |
5 Are all simulated strategies/scenarios specified? | …the comparators under test were described in terms of their components, corresponding variations, etc | ✅ Fully | Experiments 1 to 5, described throughout 5. Experimental results. |
6 Is the target population described? | …the entities simulated and their main attributes were characterized. | ✅ Fully | The scenario is largely hypothetical, so there is little population detail to describe. The attributes of interest for the entities are how many forms each Designee carries and whether the forms were completed beforehand, which is given in e.g. 4.1.2 based on census data on household sizes. 1 Introduction describes the situation relevant to this population and the plans from the Cities Readiness Initiative. |
Parameterisation and uncertainty assessment | |||
7 Are data sources informing parameter estimations provided? | …the sources of all data used to inform model inputs were reported. | ✅ Fully | Mentioned throughout 4.1 Inputs - for example… 4.1.1 Splits - “Research indicating proportion of the population with certain allergies and/or contraindications helped determine the percentage of people who may move through various paths in the POD”; 4.1.2 Number of forms - “The percentage of Designees that arrive with one to six forms was based on 2000 census data on household sizes in New York City (Census 2000 Public Use Microdata Sample 5% Sample files, NY and NJ)” (Hernandez et al. 2015) |
8 Are the parameters used to populate model frameworks specified? | …all relevant parameters fed into model frameworks were disclosed. | ✅ Fully | 4.1 Inputs , Table 1 , and Table 2 |
9 Are model uncertainties discussed? | …the uncertainty surrounding parameter estimations and adopted statistical methods (eg, 95% confidence intervals or possibility distributions) were reported. | ✅ Fully | The scatter plots include all results from the runs, and where averaged results are reported (e.g. Table 3, Figure 10) they are accompanied by confidence intervals (see the sketch after this table). |
10 Are sensitivity analyses performed and reported? | …the robustness of model outputs to input uncertainties was examined, for example via deterministic (based on parameters’ plausible ranges) or probabilistic (based on a priori-defined probability distributions) sensitivity analyses, or both. | ❌ Not met | None mentioned. |
Validation | |||
11 Is face validity evaluated and reported? | …it was reported that the model was subjected to the examination on how well model designs correspond to the reality and intuitions. It was assumed that this type of validation should be conducted by external evaluators with no stake in the study. | ❌ Not met | Not mentioned. |
12 Is cross validation performed and reported | …comparison across similar modeling studies which deal with the same decision problem was undertaken. | ❌ Not met | Not mentioned. |
13 Is external validation performed and reported? | …the modeler(s) examined how well the model’s results match the empirical data of an actual event modeled. | ❌ Not met | Not mentioned. |
14 Is predictive validation performed or attempted? | …the modeler(s) examined the consistency of a model’s predictions of a future event and the actual outcomes in the future. If this was not undertaken, it was assessed whether the reasons were discussed. | N/A | Only relevant to forecasting. |
Generalisability and stakeholder involvement | |||
15 Is the model generalizability issue discussed? | …the modeler(s) discussed the potential of the resulting model for being applicable to other settings/populations (single/multiple application). | ❌ Not met | The model is developed for New York City planners. No mention is made of applying it outside this context. |
16 Are decision makers or other stakeholders involved in modeling? | …the modeler(s) reported in which part throughout the modeling process decision makers and other stakeholders (eg, subject experts) were engaged. | ❌ Not met | Not mentioned. |
17 Is the source of funding stated? | …the sponsorship of the study was indicated. | ❌ Not met | Was not able to find any funding statements. |
18 Are model limitations discussed? | …limitations of the assessed model, especially limitations of interest to decision makers, were discussed. | ✅ Fully | 8. Limitations - e.g. “Functional exercises like those conducted to gather very specific data on the Screening, Dispensing and Flow Monitor functions are not indicative of an actual POD. In real-world operations, POD stations do not operate independently of each other. POD operations are heavily affected by how Stations operate together to facilitate smooth, efficient movement of people” (Hernandez et al. 2015) |
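Relating to item 9 above (and to STRESS-DES item 4.3), the following is a minimal sketch of one standard way to compute a 95% confidence interval around replication averages such as those in Table 3 and Figure 10. It is not the authors’ analysis code, and the replication means are placeholder values.

```python
"""Sketch: t-based 95% confidence interval across replications.

Assumption: the three replication means below are placeholders,
not values from the paper.
"""
import math
import statistics

from scipy import stats

reps = [4.2, 3.8, 4.5]  # placeholder mean waiting times (minutes)

n = len(reps)
mean = statistics.mean(reps)
sem = statistics.stdev(reps) / math.sqrt(n)  # standard error of the mean
half_width = stats.t.ppf(0.975, df=n - 1) * sem  # two-sided 95%, n-1 df

print(f"mean wait: {mean:.2f} ± {half_width:.2f} minutes (95% CI)")
```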
References
Hernandez, Ivan, Jose E. Ramirez-Marquez, David Starr, Ryan McKay, Seth Guthartz, Matt Motherwell, and Jessica Barcellona. 2015. “Optimal Staffing Strategies for Points of Dispensing.” Computers & Industrial Engineering 83 (May): 172–83. https://doi.org/10.1016/j.cie.2015.02.015.
Monks, Thomas, Christine S. M. Currie, Bhakti Stephan Onggo, Stewart Robinson, Martin Kunc, and Simon J. E. Taylor. 2019. “Strengthening the Reporting of Empirical Simulation Studies: Introducing the STRESS Guidelines.” Journal of Simulation 13 (1): 55–67. https://doi.org/10.1080/17477778.2018.1442155.
Zhang, Xiange, Stefan K. Lhachimi, and Wolf H. Rogowski. 2020. “Reporting Quality of Discrete Event Simulations in Healthcare—Results From a Generic Reporting Checklist.” Value in Health 23 (4): 506–14. https://doi.org/10.1016/j.jval.2020.01.005.