The MIDAS Coordination Center is supporting scientists who have developed a set of guidelines for epidemic forecast reporting, the EPIFORGE checklist.
These guidelines are being submitted for formal journal publication and, thereafter, for consideration by the EQUATOR Network.
Feedback and questions are welcome! Please see the files below for the paper and supplementary material.
Comments may be sent to firstname.lastname@example.org.
Abstract: The importance of infectious disease epidemic forecasting and prediction research has been underscored by decades of communicable disease outbreaks, including COVID-19. Unlike other fields of medical research, such as clinical trials and systematic reviews, epidemic forecasting and prediction research has no reporting guidelines, despite their utility. We therefore developed the EPIFORGE checklist, the first known guideline for standardized reporting of epidemic forecasting research. We developed this checklist using a best-practice process for reporting-guideline development, involving a Delphi process and broad consultation with an international panel of infectious disease modelers and model end-users. The objectives of these guidelines are to improve the consistency, reproducibility, comparability, and quality of epidemic forecast reporting. The guidelines are not designed to advise scientists on how to perform epidemic forecasting and prediction research, but rather to serve as a standard for reporting critical methodological details of such studies. These guidelines will be submitted to the EQUATOR Network and hosted on other dedicated webpages to facilitate feedback and journal endorsement.
| Section of manuscript | Item | Checklist item |
| --- | --- | --- |
| Title / Abstract | 1 | Study described as forecast or prediction research in at least the title or abstract |
| Introduction | 2 | Purpose of study and forecasting targets defined |
| Methods | 3 | Methods fully documented |
| Methods | 4 | Identify whether the forecast was performed prospectively, in real time, and/or retrospectively |
| Methods | 5 | Origin of input source data explicitly described with reference |
| Methods | 6 | Source data made available, or reasons why this was not possible documented |
| Methods | 7 | Input data processing procedures described in detail |
| Methods | 8 | Statement and description of model type, with model assumptions documented with references |
| Methods | 9 | Model code made available, or reasons why this was not possible documented |
| Methods | 10 | Description of model validation, with justification of approach |
| Methods | 11 | Description of forecast accuracy evaluation method, with justification |
| Methods | 12 | Where possible, comparison of model results to a benchmark or other comparator model, with justification of comparator choice |
| Methods | 13 | Description of forecast horizon, and justification of its length |
| Results | 14 | Uncertainty of forecasting results presented and explained |
| Results | 15 | Results briefly summarized in lay terms, including a lay interpretation of forecast uncertainty |
| Results | 16 | If results are published as a data object, a time-stamped version number is encouraged |
| Discussion | 17 | Limitations of forecast described, including limitations specific to data quality and methods |
| Discussion | 18 | If the research is applicable to a specific epidemic, comment on its potential implications and impact for public health action and decision making |
| Discussion | 19 | If the research is applicable to a specific epidemic, comment on how generalizable it may be across populations |