Enhancing the QUAlity and Transparency Of health Research
Reporting guidelines under development for other study designs or clinical areas:
(year of registration in brackets)
Checklist for the conduct and reporting of micro-costing studies in health care (registered 9 June 2014)
A protocol for this guideline was published in 2016: PMID 27707687, full text here.
In April 2019, the group informed EQUATOR that the reporting guideline was still under development.
Designing Economic Evaluation Protocols (DEEP) to adhere to CHEERS (registered 7 October 2016)
A guide to designing economic evaluations to adhere to the CHEERS reporting guideline.
Guideline for reporting the long-term impacts of genocide and war (GESUQ) (registered 1 August 2016)
This guideline is being developed to help researchers to report the long-term impact of genocides, wars and mass atrocities on mental health.
Contact: Dr Jutta Lindert, firstname.lastname@example.org
The guideline contains international consensus on the minimum items that need to be included in a forensic medical expert opinion and how they should be reported.
Read the study protocol
The CRIS statement is a checklist for uniform reporting across in-vitro studies involving dental materials. CRIS will also support effective systematic reviews and meta-analyses of specific topics, thus bringing out the best evidence. It includes sections not previously reported in in-vitro studies, such as sample size calculation, sample preparation and handling, randomisation and blinded assessment, as well as improved reporting of statistical methods and results. This checklist aims to promote transparency, reproducibility, validity and completeness when reporting in-vitro studies involving dental materials.
The group finished compiling the Delphi results in 2019.
A website has been created, with the description of the core group, the chronology of activities and publications, here: www.cris-statement.org.
Read the concept note: Krithikadatta J, Gopikrishna V, Datta M. CRIS Guidelines (Checklist for Reporting In-vitro Studies): A concept note on the need for standardized guidelines for improving quality and transparency in reporting in-vitro studies in experimental dental research. J Conserv Dent. 2014;17(4):301-304. PMID: 25125839
DEVELOPTOOLS (registered 21 June 2017)
This reporting guideline addresses the development of tools such as medical devices, physical assistive devices, eHealth and mHealth applications, patient decision aids, and other tools developed for patients to use.
Read the protocol for the systematic review on which this reporting guideline is based.
Intervention Development Reporting Guideline (INDEX) (registered 5 July 2017)
The INDEX reporting guideline aims to improve the quality and completeness of reporting of different approaches to intervention development.
Reporting Items for Public Version of Guidelines: RIGHT-PVG (registered 1 November 2017)
The objectives of this reporting guideline development are: 1) To identify and describe currently published literature on the reporting of patient versions of guidelines; 2) To develop essential reporting items for the public, including patient, versions of guidelines for healthcare; 3) To identify the characteristics of high-quality PVGs.
Read the study protocol
Reporting recommendations for psychometric investigations of patient-reported outcome measures (registered 4 April 2017)
This guideline will cover all studies exploring any psychometric property of a patient-reported outcome measure.
DevelOpment of Curricula: Tools for Reporting INnovations in Education (DOCTRINE) (registered 16 March 2018)
The group is developing a reporting checklist for curriculum development in health professions education. The purpose of this checklist is to ensure that prospective authors report all essential elements of an educational project to allow future readers to replicate their work. The group’s review of other reporting checklists in the EQUATOR network database showed them that some checklists are specific to certain content areas (such as evidence-based practice or team-based learning), whereas this new checklist would be generalisable to any health professions educator describing their curriculum as an educational innovation.
Consensus Reporting Items for Research in Primary Care (CRISP) (registered 28 August 2018)
The CRISP working group is an international, interprofessional, interdisciplinary team working to improve the reporting of primary care research. Globally, there are no guidelines on the reporting of primary care research. There are specific factors related to primary care research and research translation that warrant guidance in reporting. We will perform a needs assessment to identify common and important areas for improvement in reporting of primary care research. Based on this review, we will propose either extensions of existing guidelines (one or more) or a new guideline. We plan to use a transparent, explicit, iterative group process to develop guidelines that will be useful across the many methods, populations and settings where primary care research is performed, reported and applied.
CHecklist for conversation Analysis reporTing (CHAT) (registered 29 August 2018)
The aim of this checklist is to enable researchers to assess the quality of conversation analytic studies. It can be used to:
(1) assess the quality of reporting, therefore facilitating systematic review and synthesis
(2) provide clear guidelines for authors to ensure that results are presented in a standardised way which maximises their external applicability
This checklist is being developed via the Delphi Consensus process, in collaboration with experts in the field of conversation analysis.
Publications reporting on health interventions or other health research that involves human-centred design or design thinking often omit detailed information on methodology, ethical considerations, and evaluation. These omissions make it difficult to assess, review, and catalogue such research across the disciplines of health and biomedicine. This dearth of information, in a research area increasingly funded by public agencies and carried out by and with health stakeholders, hinders the building of an evidence base, inhibits the diffusion of innovations, and reduces the dissemination of critically important information back to the public. Several issues have contributed to this absence, including barriers to disseminating interdisciplinary research in public health and biomedical journals, a lack of incentive for organisations conducting such work (many of which exist in the private or non-academic sectors) to publish it in the scholarly literature, unfamiliarity of the wider health community with the process of research grounded in design, and a lack of rigorous guidance appropriate for scholarly work in this field.
The proposed draft set of guidelines is intended to support researchers and practitioners in reporting on the planning, writing, reviewing, and interpretation of research that has used a human-centred design or design thinking approach. A draft checklist is intended to serve as a starting point for the collaborative development of definitive guidelines representing a thorough overview of consensus-based best practices for using design in research to improve health outcomes. Without more frequent reporting and documentation of transparent and evaluable design-based practices in the scholarly literature, such research will not fulfil its potential to complement existing approaches and advance the goal of global health equity.
The draft guidance would provide an initial step in a consensus-building process for the development of a more definitive set of guidelines for conduct and review of health research using design.
Background: Bazzano AN, Martin J, Hicks E, Faughnan M, Murphy L. Human-centred design in global health: A scoping review of applications and contexts. PLoS One. 2017;12(11):e0186744. PMID: 29091935
BIAS: Transparent reporting of biomedical image analysis challenges (registered 23 January 2019)
The number of biomedical image analysis challenges organised per year is steadily increasing. These international competitions have the purpose of benchmarking algorithms on common data sets, typically to identify the best method for a given problem. Recent research, however, revealed “that common practice related to challenge reporting is poor and does not allow for adequate interpretation and reproducibility of results”. To address the discrepancy between the impact of challenges and the quality (control), the Biomedical Image Analysis ChallengeS (BIAS) initiative is developing a set of recommendations for the reporting of challenges. The BIAS statement aims to improve the transparency of the reporting of a biomedical image analysis challenge regardless of the field of application, image modality or algorithm category assessed.
Read a background paper in Nature Communications presenting a critical analysis of common research practice related to the design, reporting and execution of biomedical image analysis challenges.
Reporting Guidelines for Public Health Non-Communicable Diseases Modelling Studies (registered 28 January 2019)
Public health non-communicable diseases (NCD) modelling studies are concerned with quantifying the health impact of changes in risk factors for NCDs without directly measuring health impact. They include both health impact models and futures models. Health impact modelling estimates the impact of changes in risk factors that relate to scenarios (e.g. policy goals), arise from interventions (e.g. policies, regulatory initiatives) or arise from public health preventative programmes (either single or multi-component programmes). The interventions or programmes being studied operate by affecting wider socio-economic determinants, environmental, behavioural or medical risk factors for NCDs. Futures models estimate the future burden of disease based on changes in risk factors. These studies include, as a direct or intermediate input to the model, changing one or more risk factors for one or more NCDs within a population according to a specified scenario or intervention. They model the change in exposures through to at least one NCD-specific health outcome and/or to a general measure of health outcome (e.g. quality-adjusted life years, all-cause mortality).
The Reporting Guidelines for Public Health Non-Communicable Diseases Modelling Studies should apply to all public health models concerned with quantifying the relationship between a risk factor and health (i.e. health impact models or futures models). Models that focus on describing disease transmission (e.g. infectious disease modelling, social contagion) will be out of scope. The World Health Organization (WHO) definition of non-communicable diseases, which recognises the five major NCDs are cardiovascular diseases, diabetes, chronic respiratory diseases, cancer and mental health conditions, will be used in the development of this guideline.
The group is producing these guidelines, in part, in response to a call by the Medical Research Council in the UK. An initial consultation was taken to agree on the scope of the guidelines and establish a network of participants to support their development globally. The group will follow the five-step process outlined by EQUATOR: literature review; online consensus Delphi; expert consensus meeting (face-to-face); writing of guidelines by a small committee; dissemination.
The group is currently seeking funding to undertake the full work and plans to start the systematic review in September 2019, with a view to publishing the computer-based modelling reporting guideline, as an open-access document, in the summer or autumn of 2021.
Updates will be published at http://www.mrc-epid.cam.ac.uk/ph-modelling-guidelines/
METRIC – METhodological Review reportIng Checklist – guidelines for reporting methodological reviews in health research (registered 25 February 2019)
Methodological reviews (MRs) are an efficient way of assessing research methods and summarising methodological issues in the conduct and reporting of health research. As part of the ongoing development of this concept and the most appropriate nomenclature, the group has established a working definition for MRs as “studies that appraise the design, conduct, analysis and reporting of other studies”. As such, MRs are seen as highly informative because they allow researchers to generate empirical evidence on the quality, completeness and accuracy of reporting; document the variety of methods used in health research studies; investigate adherence to guidelines; assess approaches to analyses; demonstrate changes in reporting over time; determine consistency between abstracts/trial registries and published articles; and address many other issues. Despite an increase in their development and usage, there is limited published guidance on the conduct and reporting of MRs. The current body of literature shows wide inconsistencies in MR nomenclature and methodology. The aim of this guideline is to standardise the nomenclature and reporting of MRs in the context of health research. Based on consensus by expert stakeholders, this guideline and accompanying statement will highlight the necessary methodological features that should be included when designing and reporting high-quality MRs.
The development of this guideline started in November 2017 as follows:
Phase I – Literature Review: Conduct a full scoping review to identify and describe MRs, and report on current methods, summarise the variety of questions being addressed, and develop a preliminary conceptual framework for MRs;
Phase II – Delphi Study & Face-to-Face Consensus Meeting: Establish a working group of expert stakeholders (e.g. methodologists, epidemiologists, biostatisticians, journals, guideline groups) to validate and refine the conceptual framework for MR nomenclature and methodology;
Phase III – Publication and Dissemination: Incorporate expert feedback from the consensus meeting to finalise the MR guideline as a checklist. This document will include the consensus statement and outline standardised criteria for appropriate MR terminology, methodology and reporting. The group intends to publish the reporting guideline as an open-access document in May 2021.
Reporting of Artificial Intelligence and Machine Learning Studies (registered 7 May 2019)
This guideline will be an extension to the TRIPOD Statement for prediction model studies (including imaging studies) using artificial intelligence or machine learning.
An announcement marking this initiative was published in The Lancet.
Reporting Recommendations for Research on Human Tissues (REPORT) (registered 17 June 2019)
From a literature review, the authors identified the need for a simple guideline to improve the reporting of studies using human tissue, especially in cardiology (aortic tissue). They aim to address aspects of reproducibility and data interpretation in the new reporting guideline, especially regarding a clear description of the cohort used, the source and site of tissue sampling and sample division, patient metadata, availability of raw and processed data and study limitations.
The multidisciplinary team involved in the reporting guideline development includes computational biologists, surgeons, engineers, biobank managers, and basic scientists. They plan a two-day symposium in the summer of 2020 in Liverpool to discuss and develop preliminary guidelines, which will be collated and circulated amongst the group for feedback, revisions and iterations.
The group plans to publish the reporting guideline as an open-access document by the end of 2020.
Working group: Dr Hannah Davies, Dr Jill Madine, Dr Riaz Akhtar, Mr Mark Field, Dr Hannah Levis, Dr Vijay Sharma, Dr Eva Caamaño-Gutiérrez, Dr Marie Phelan, Prof. Robert Moots, Dr Nicola Tempest
Protocols and results of interventions involving the training of healthcare staff need to be reported in transparent and reproducible ways. The evidence as to whether complex team training interventions ultimately affect patient outcomes is equivocal or weak. Reproducible and transparent scientific reports are needed to enhance the understanding of effective design, deployment and evaluation, and to inform wise decisions on investing in such interventions. The group intends to produce a checklist for reporting complex multi-professional healthcare teamwork training and plans to publish this checklist in the Fall of 2019. Then, in a second phase, an expert panel will evaluate the checklist. The adjusted checklist and the explanation and elaboration document are planned to be published in the Fall of 2020. The checklist will encompass any study design describing healthcare professional training interventions.
Reporting guideline for health research priority setting with stakeholders (REPRISE) (registered 6 August 2019)
The group of researchers from Australia, United Kingdom, South Africa and India has compiled a checklist with 31 items that cover 10 domains of reporting health research priority setting studies: context and scope, governance and team, framework for priority setting, stakeholders/participants, identification and collection of priorities, prioritisation of research topics, output, evaluation and feedback, translation and implementation, and funding and conflict of interest. Each reporting item includes a descriptor and examples. They intend to publish the reporting guideline in August 2020, as an open-access document.
Guidelines for Large-Scale, Applied Clinical Informatics Research (GLACIeR) (registered 3 October 2019)
Organisations adopting various commercial health information technology (HIT) applications choose different configurations, use systems in distinct ways, and learn from mistakes during such processes. Most studies that do get published lack details on the implementation context, which prevents healthcare organisations and HIT producers from adopting best practices. A research group from Vanderbilt University Medical Center plans to develop a guideline for reporting on collaborative applied clinical informatics projects across multiple institutions and HIT applications, such as electronic health records, computerised provider order entry systems, or clinical decision support systems.
The group plans to perform a systematic literature review, including reporting guidelines and has applied for funding. The guideline is expected to be published in 2023, as an open-access document.
Extension for the RIGHT statement for Reporting Adapted Practice Guidelines in Health Care: the RIGHT-Ad@pt Checklist (registered 30 October 2019)
RIGHT-Ad@pt is an extension of the RIGHT reporting guideline, which was developed to help authors improve the completeness and quality of reporting of healthcare practice guidelines. RIGHT-Ad@pt will cover adapted clinical guidelines, i.e. those that are not produced de novo but developed as adaptations of existing guidelines. This is frequent in low- and middle-income settings, where there are fewer resources to develop high-quality de novo guidelines.
The working group plans to use Delphi consensus methods to develop the checklist, and to publish the reporting guideline as an open-access document around December 2020. The protocol for this reporting guideline development has been published in BMJ Open.
Reporting guidelines for Whole Body Vibration studies in humans, animals and cell cultures (registered 16 December 2019)
A group of experts from several nations gathered to discuss the development or updating of reporting guidelines for studies on whole-body vibration, a therapeutic modality for the improvement of neuromuscular performance in which subjects are exposed to vibrations through a vibrating platform. A reporting guideline on this topic was published in 2010. However, the group feels that it needs to be updated and expanded, with new outcomes, a better description of the parameters used (such as direction, frequency, magnitude and duration of vibrations), and items on animal and cell culture studies and on study protocols.
The Executive Group has 13 participants from several professional backgrounds. The experts have met twice to review the literature and reach an initial set of items for the reporting guideline. Three rounds of Delphi consensus were completed with 51 respondents. Another meeting is planned for June 2020 and the publication of the reporting guideline, as an open-access paper, is planned for 2021.
Contact: Dr. Marieke van Heuvelen. E-mail: email@example.com
Standards for reporting qualitative research: extension for multi-centre, multinational and multi-language studies (EX-QUAL) (registered 18 December 2019)
A wide range of stakeholders with differing areas of expertise in qualitative research in multi-centre, multinational and/or multi-language studies, including invited members of the development teams of the original reporting guidelines, aim to develop an extension of the Standards for Reporting Qualitative Research (SRQR) and the Consolidated Criteria for Reporting Qualitative Research (COREQ) for reporting qualitative multi-centre, multinational and multi-language studies.
The group believes that the existing standards/criteria for reporting qualitative research (SRQR and COREQ) are excellent and increasingly used guidelines, but that they are not specific enough to sufficiently cover multi-centre, multinational and multi-language issues in qualitative studies. Transparent and standardised reporting of results from qualitative studies conducted in several countries and languages will increase methodological quality and publication priority.
They plan to write a protocol, build an international task force including the developers of SRQR and COREQ, conduct a systematic review, carry out Delphi exercises and a consensus meeting. The group plans to publish the new extension in 2021 as an open-access document.
STARD-AI Extension: Reporting Guidelines for Diagnostic Accuracy Studies Evaluating Artificial Intelligence Interventions (registered 18 December 2019)
The STARD (Standards for Reporting Diagnostic Accuracy) statement, published in 2003, is a minimum set of reporting standards for studies that evaluate the diagnostic accuracy of medical interventions, intended to ensure that such studies are sufficiently informative. However, studies using AI are not adequately covered by the STARD reporting guideline. There are discrepancies in the quality and size of the training datasets employed, in the metrics used to report diagnostic performance, and in terminology, which hinder the generalisability and real-world applicability of diagnostic accuracy studies using AI.
A group of experts in AI and prognostic modelling is developing an international, multidisciplinary, consensus-based, AI-specific extension to the STARD statement (STARD-AI) that will focus specifically on AI-centric clinical trials reporting diagnostic accuracy.
The group has already undertaken a systematic review and gathered a steering group of experts, and now they plan to start an electronic Delphi process to achieve consensus on items that should be included in the STARD-AI extension. They plan to publish the reporting guideline, as an open-access document, in several journals, in the first quarter of 2020.
This reporting guideline is intended to provide guidance on reporting the development of competency frameworks in the healthcare professions. It will outline the key items that should be reported by those who develop competency frameworks, in order to improve the standard of reporting and bring consistency to the reporting process.
The research team developing the reporting guideline has already published a scoping review on the need for competency framework development in healthcare professions.
They will now review the published checklists and guidelines for conducting, reporting and publishing qualitative and mixed-methods studies. An initial pool of items will be evaluated by an expert panel in a modified Delphi process. A pilot test will be conducted on a selection of the articles. The group plans to publish the reporting guideline in early 2021 at the latest, as an open-access document.
No reporting guidelines are available to guide the reporting of domains specific to the speciality of paediatric dentistry. To address this issue, the ‘Reporting stAndards for research in PedIatric Dentistry’ (RAPID) group has been formed.
The group plans to do a literature review, then conduct a Delphi exercise with around 60 participants (20 academicians, 12 paediatric dentists, 4 epidemiologists, 3 trialists, 3 journal editors, 3 specialists in dental public health, 3 dental practitioners, 3 paediatricians, 3 dental nurses, 3 child patients and 3 parent representatives), run anonymously on an online platform until group consensus is achieved, with a maximum of 3 rounds. The executive group will use the feedback from the RDG to refine the Delphi study. In a third phase, the group will organise a face-to-face consensus meeting with approximately 20 members to finalise the list of reporting items. The guideline will then be piloted among five researchers and five paediatric dentists and published in an open-access journal in June 2021, in addition to being presented at conferences. Feedback after publication will inform future updates.
The protocol of the RAPID reporting guideline has been published.
The reporting guideline website is at www.rapid-statement.org
Page last updated on 14 January 2020
Clinical practice guidelines: AGREE, RIGHT
Animal pre-clinical studies: ARRIVE
Quality improvement studies: SQUIRE