Evaluated the study against guidelines (badges, sharing of artefacts) and started on STRESS-DES reporting.
Work log
Badges
Evaluating the artefacts shared at https://zenodo.org/records/3760626 and https://git.exeter.ac.uk/tmwm201/dialysis-service-delivery-covid19.
Uncertainties:
execute: “Scripts can be successfully executed” (and then regenerated: “Independent party regenerated results using the authors’ research artefacts”)
- Required some minor changes to the environment in order for the scripts to run (see the sketch after this list).
- However, I’m not certain whether the badges would allow minor (or major) troubleshooting in order to get a script to run.
- By my criteria (which allow troubleshooting), this would be fine. The step up from this criterion is reproduction (for which I also allow troubleshooting), so it makes sense that this badge is about getting the code to run, while the next step is getting sufficiently similar results. I may just need to add a caveat that troubleshooting was allowed (which may not be journal policy), alongside the existing caveat that this is my interpretation of the badges from the available information, and I cannot guarantee they would or would not be awarded.
- Chat with Tom: Fine to just caveat.
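For illustration, a minimal sketch of the kind of environment check this troubleshooting involved; the package names and pinned versions below are hypothetical, not the study’s actual dependency list:

```python
# Hypothetical example: compare installed package versions against those a
# repository expects. Names and versions here are illustrative only.
from importlib.metadata import PackageNotFoundError, version

expected = {"simpy": "4.0.1", "numpy": "1.18.1", "pandas": "1.0.3"}

for pkg, want in expected.items():
    try:
        got = version(pkg)
        status = "OK" if got == want else f"MISMATCH (installed {got})"
    except PackageNotFoundError:
        status = "MISSING"
    print(f"{pkg}=={want}: {status}")
```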
hour: “Reproduced within approximately one hour (excluding compute time)”
- Took longer than an hour, but I wasn’t trying to get it done within that time.
- If I hadn’t spent time reading and documenting and fiddling with the seeds, then I anticipate I could’ve run it within an hour
- However, to follow our process, I am failing it on this criterion (for consistency in how we are working and timing ourselves).
- Chat with Tom: Fine to just caveat.
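The seed fiddling amounted to experimenting with how replications are seeded. A minimal sketch of one common pattern, not the study’s actual code; the base seed and distribution are assumptions:

```python
# Illustrative only: one child seed per replication, derived from a fixed
# base seed, keeps stochastic runs repeatable.
import numpy as np

BASE_SEED = 42  # hypothetical value

def run_replications(n_reps: int) -> list[float]:
    results = []
    for child in np.random.SeedSequence(BASE_SEED).spawn(n_reps):
        rng = np.random.default_rng(child)
        # Stand-in for one simulation replication using this generator
        results.append(rng.exponential(scale=10.0, size=100).mean())
    return results

print(run_replications(5))
```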
documentation_readme: “Artefacts are clearly documented and accompanied by a README file with step-by-step instructions on how to reproduce results in the manuscript”
- I wouldn’t say it explicitly meets this criterion.
- Although it was simple enough to manage anyway: the README directed me to the notebook, I ran that, job done (see the sketch after this list).
- Uncertain on this one
- Uncertainty is fine - just make a choice, and justify and document that choice in the logbook
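For reference, “run the notebook” can also be done non-interactively. A minimal sketch using papermill (my choice, not necessarily what the repository suggests), with a hypothetical notebook name:

```python
# One way to execute a notebook end-to-end without opening Jupyter.
# The notebook filename is hypothetical.
import papermill as pm

pm.execute_notebook(
    "analysis.ipynb",           # hypothetical input notebook
    "analysis_executed.ipynb",  # copy with all cells executed
)
```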
Evaluation against reporting guidelines (STRESS-DES pt. 1)
These were evaluated based ONLY on the journal article and supplementary materials (and not on the linked code repository).
Started work on evaluating against STRESS-DES.
Tom mentioned that for this study, the authors provided the filled-out STRESS checklist to the journal, but it wasn’t published alongside the article; it would have been beneficial if it had been. This is a good suggestion to think about.
Also, it has proven quite time-consuming to evaluate against the checklist, which is all the more reason to include it with the article, making it easier for readers to spot these things.
Suggested changes for protocol/template
✅ = Made the change.
💬 = Noted to discuss with team.
Protocol:
- ✅ Suggest that uncertainties over whether a badge criterion is met could be discussed within the STARS team, and that those uncertainties be detailed in the logbook.
- ✅ If there are multiple code locations (e.g. a code repository and an archive), then refer to both when assessing against the criteria.
- ✅ Recommended sources for each evaluation (e.g. reporting guidelines are assessed against the article; badges and sharing of research artefacts are assessed against the code).
- 💬 Given how long it is taking, would it be relevant to time these steps? (A rough timing sketch is below.)
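If we do time the steps, something as simple as the following would do; the step name is illustrative:

```python
# Rough sketch for timing evaluation steps and printing minutes elapsed,
# which could then be copied into the logbook.
import time
from contextlib import contextmanager

@contextmanager
def timed(step: str):
    start = time.perf_counter()
    try:
        yield
    finally:
        minutes = (time.perf_counter() - start) / 60
        print(f"{step}: {minutes:.1f} min")

with timed("Evaluate against STRESS-DES"):
    ...  # perform the evaluation step here
```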