Background: The formulation of expert-opinion guidelines is subject to several sources of bias that may adversely affect their quality. To minimize bias, guideline developers must employ rigorous methodology. The methodological quality of basic critical care echocardiography (BCCE) training/education guidelines has not previously been appraised.
Research Question: What is the methodological quality of expert guidelines/recommendations on BCCE training?
Study Design and Methods: The review was performed by a multidisciplinary team comprising intensive care specialists, a hospital scientist, a trainee, a nurse sonographer, and a public health expert. Four databases (PubMed, OVID-Embase, Web of Science, Google Scholar) were searched on 31/07/2020 to identify guidelines on BCCE training/education. Each guideline was assessed subjectively for the degree of detail of its recommendations, and objectively with the AGREE-II critical appraisal tool for clinical practice guidelines to generate scaled domain scores. A scaled score >75% in every domain was the cut-off for a guideline to be recommended for use without modification.
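For context, the AGREE-II scaled domain score is calculated as follows (per the AGREE II user's manual; the item counts and appraiser numbers in the example below are illustrative only and are not drawn from this review):

scaled domain score = (obtained score − minimum possible score) / (maximum possible score − minimum possible score) × 100%

where the maximum possible score is 7 (strongly agree) × number of items × number of appraisers, and the minimum possible score is 1 (strongly disagree) × number of items × number of appraisers. For example, if four appraisers rate the three items of the Scope and Purpose domain and the summed rating is 60, the scaled score is (60 − 12) / (84 − 12) = 66.7%.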
Results: Of the 4288 abstracts screened, 24 guidelines met the inclusion criteria. Very few guidelines made clear recommendations regarding introductory courses: physics (n=6, 25%), instrumentation (n=5, 20.8%), image-acquisition theory (n=6, 25%), course curriculum (n=5, 20.8%), pre/post-course tests (n=1, 4.2%), minimum course duration (n=6, 25%), or trainer qualifications (n=5, 20.8%). Very few provided clear recommendations for longitudinal competence programs: clinically indicated scans (n=8, 33.3%), logbook (n=14, 58.3%), image storage (n=9, 37.5%), formative assessment (n=6, 25%), minimum scan numbers (n=14, 58.3%), image-acquisition competence (n=3, 12.5%), image-interpretation competence (n=2, 8.3%), and credentialing/certification (n=3, 12.5%). Five guidelines (20.8%) attained a scaled overall AGREE-II score >75%. One guideline (4.2%) attained scores >75% in every domain.
