
Reliability of Dutch Obstetric Telephone Triage


Bibliographic Details
Main Authors: Engeltjes, Bernice, Rosman, Ageeth, Bertens, Loes C M, Wouters, Eveline, Cronie, Doug, Scheele, Fedde
Format: Online Article Text
Language: English
Published: Dove 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8357617/
https://www.ncbi.nlm.nih.gov/pubmed/34393531
http://dx.doi.org/10.2147/RMHP.S319564
Description
Summary:

BACKGROUND: Safety and efficiency of emergency care can be optimized with a triage system that uses urgency to prioritize care. The Dutch Obstetric Telephone Triage System (DOTTS) was developed to provide a basis for assessing the urgency of unplanned obstetric care requests made by telephone. Reliability and validity are important components in evaluating such (obstetric) triage systems.

OBJECTIVE: To determine the reliability of Dutch Obstetric Telephone Triage by calculating inter-rater and intra-rater reliability.

METHODS: To evaluate the urgency levels of DOTTS by testing inter-rater and intra-rater reliability, 90 vignettes of possible care requests were developed. Nineteen participants from hospitals where DOTTS had been implemented rated a set of ten vignettes in two rounds. The five urgency levels and five presenting symptoms were spread equally across the vignettes and had to be entered per vignette in accordance with DOTTS. Urgency levels were dichotomized into high urgency and intermediate urgency. Inter-rater reliability was defined as the degree of agreement between two different participants rating the same vignette; intra-rater reliability as the agreement of the same participant at different moments in time. The degree of inter-rater and intra-rater reliability was tested using weighted Cohen's Kappa and the intraclass correlation coefficient (ICC).

RESULTS: Agreement of participants' urgency levels with the predefined urgency level per vignette was 90.5% (95% CI 87.5–93.6) [335 of 370]. Agreement of urgency levels between participants was 88.5% (95% CI 84.9–93.0) [177 of 200] and 84.9% (95% CI 78.3–91.4) after re-rating [101 of 119]. Inter-rater reliability of DOTTS expressed as Cohen's Kappa was 0.77 and as ICC 0.87; intra-rater reliability expressed as Cohen's Kappa was 0.70 and as ICC 0.82.

CONCLUSION: Inter-rater and intra-rater reliability of DOTTS showed substantial agreement, comparable to that reported in other studies. Therefore, DOTTS is considered reliable.
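The study quantifies inter-rater reliability with a weighted Cohen's Kappa. For illustration, the sketch below computes the simpler unweighted Cohen's kappa for two raters: observed agreement corrected for the agreement expected by chance from each rater's marginal label frequencies (the weighted variant additionally scales disagreements by their distance between ordered categories). The rating data here are hypothetical, not from the study.

```python
def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters' categorical labels."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    categories = sorted(set(rater_a) | set(rater_b))
    # Observed agreement: fraction of items both raters labelled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement under independence, from each rater's marginal frequencies.
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical example: two raters assigning dichotomized urgency
# (high vs intermediate) to ten vignettes, as in the study design.
a = ["high", "high", "mid", "mid", "high", "mid", "high", "mid", "mid", "high"]
b = ["high", "high", "mid", "high", "high", "mid", "high", "mid", "mid", "mid"]
print(round(cohens_kappa(a, b), 2))  # 8/10 observed agreement -> kappa 0.6
```

A kappa of 0.61–0.80 is conventionally read as "substantial" agreement, which is the band the reported inter-rater value of 0.77 falls into.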