
Validation of ultrasound examinations performed by general practitioners

Bibliographic Details
Main Authors: Lindgaard, Karsten; Riisgaard, Lars
Format: Online Article Text
Language: English
Published: Taylor & Francis, 2017
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5592352/
https://www.ncbi.nlm.nih.gov/pubmed/28776457
http://dx.doi.org/10.1080/02813432.2017.1358437
Description
Summary:
OBJECTIVE: The aim of this study was to evaluate the diagnostic agreement when a general practitioner and subsequently a specialist (radiologist/gynecologist) performed point-of-care ultrasound examinations for certain abdominal and gynecological conditions of low to moderate complexity.
DESIGN: A prospective study of inter-rater reliability and agreement.
SETTING: Patients were recruited and initially scanned in general practice. The validation examinations were conducted in a hospital setting.
SUBJECTS: A convenience sample of 114 patients presenting with abdominal pain or discomfort, possible pregnancy, or known risk factors for abdominal aortic aneurysm was included.
MAIN OUTCOME MEASURES: Inter-rater agreement (Kappa statistic and percentage agreement) between ultrasound examinations performed by the general practitioner and the specialist for the following conditions: gallstones, ascites, abdominal aorta >5 cm, intrauterine pregnancy, and gestational age.
RESULTS: An overall Kappa value of 0.93 (95% confidence interval (CI): 0.87–0.98) was obtained. Ascites, abdominal aortic diameter >5 cm, and intrauterine pregnancy showed Kappa values of 1.
CONCLUSION: Our study showed that general practitioners performing point-of-care ultrasound examinations of low-to-moderate complexity had a very high level of inter-rater agreement with specialists.
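
For context (a standard definition, not taken from the article itself): the Cohen's kappa coefficient reported above adjusts raw percentage agreement for agreement expected by chance,

\kappa = \frac{p_o - p_e}{1 - p_e}

where p_o is the observed proportion of agreement between the two raters and p_e the proportion of agreement expected by chance. Kappa equals 1 for perfect agreement, so the values of 1 reported for ascites, aortic diameter >5 cm, and intrauterine pregnancy indicate complete agreement between general practitioner and specialist in those categories.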