
Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records

The extraction of patient signs and symptoms recorded as free text in electronic health records is critical for precision medicine. Once extracted, signs and symptoms can be made computable by mapping to signs and symptoms in an ontology. Extracting signs and symptoms from free text is tedious and time-consuming. Prior studies have suggested that inter-rater agreement for clinical concept extraction is low. We have examined inter-rater agreement for annotating neurologic concepts in clinical notes from electronic health records. After training on the annotation process, the annotation tool, and the supporting neuro-ontology, three raters annotated 15 clinical notes in three rounds. Inter-rater agreement between the three annotators was high for text span and category label. A machine annotator based on a convolutional neural network had a high level of agreement with the human annotators but one that was lower than human inter-rater agreement. We conclude that high levels of agreement between human annotators are possible with appropriate training and annotation tools. Furthermore, more training examples combined with improvements in neural networks and natural language processing should make machine annotators capable of high throughput automated clinical concept extraction with high levels of agreement with human annotators.
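The record itself contains no code, but since the abstract centers on measuring inter-rater agreement over text spans and category labels, a minimal sketch may help. The following Python example is illustrative only and not taken from the paper; the annotations, category names, and scoring conventions are hypothetical. It computes exact-match span F1 between two raters and Cohen's kappa on the category labels of matched spans:

```python
# Illustrative sketch only (not from the paper): pairwise inter-rater
# agreement between two hypothetical annotators who mark (start, end, label)
# spans in a clinical note. Span agreement is scored as exact-match F1;
# label agreement on matched spans is scored with Cohen's kappa.
from collections import Counter

def span_f1(spans_a, spans_b):
    """Exact-match F1 between two sets of (start, end) character spans."""
    a, b = set(spans_a), set(spans_b)
    if not a and not b:
        return 1.0
    tp = len(a & b)                       # spans both raters marked
    precision = tp / len(b) if b else 0.0
    recall = tp / len(a) if a else 0.0
    return 2 * precision * recall / (precision + recall) if tp else 0.0

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa between two label sequences over the same spans."""
    n = len(labels_a)
    observed = sum(x == y for x, y in zip(labels_a, labels_b)) / n
    ca, cb = Counter(labels_a), Counter(labels_b)
    expected = sum(ca[k] * cb[k] for k in ca) / (n * n)  # chance agreement
    return 1.0 if expected == 1.0 else (observed - expected) / (1.0 - expected)

# Hypothetical (start, end, category) annotations from two raters.
rater1 = [(0, 8, "gait disorder"), (15, 24, "tremor"), (30, 41, "aphasia")]
rater2 = [(0, 8, "gait disorder"), (15, 24, "dyskinesia"), (30, 41, "aphasia")]

f1 = span_f1([(s, e) for s, e, _ in rater1], [(s, e) for s, e, _ in rater2])
matched = [(l1, l2) for (s1, e1, l1) in rater1 for (s2, e2, l2) in rater2
           if (s1, e1) == (s2, e2)]
kappa = cohens_kappa([a for a, _ in matched], [b for _, b in matched])
print(f"span F1: {f1:.2f}, label kappa on matched spans: {kappa:.2f}")
```

Note that exact-span matching is only one common convention; scoring schemes that give partial credit for overlapping spans are also widely used, and the paper's actual agreement metrics may differ.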


Bibliographic Details
Main Authors: Oommen, Chelsea, Howlett-Prieto, Quentin, Carrithers, Michael D., Hier, Daniel B.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10294690/
https://www.ncbi.nlm.nih.gov/pubmed/37383943
http://dx.doi.org/10.3389/fdgth.2023.1075771
author Oommen, Chelsea
Howlett-Prieto, Quentin
Carrithers, Michael D.
Hier, Daniel B.
author_facet Oommen, Chelsea
Howlett-Prieto, Quentin
Carrithers, Michael D.
Hier, Daniel B.
author_sort Oommen, Chelsea
collection PubMed
description The extraction of patient signs and symptoms recorded as free text in electronic health records is critical for precision medicine. Once extracted, signs and symptoms can be made computable by mapping to signs and symptoms in an ontology. Extracting signs and symptoms from free text is tedious and time-consuming. Prior studies have suggested that inter-rater agreement for clinical concept extraction is low. We have examined inter-rater agreement for annotating neurologic concepts in clinical notes from electronic health records. After training on the annotation process, the annotation tool, and the supporting neuro-ontology, three raters annotated 15 clinical notes in three rounds. Inter-rater agreement between the three annotators was high for text span and category label. A machine annotator based on a convolutional neural network had a high level of agreement with the human annotators but one that was lower than human inter-rater agreement. We conclude that high levels of agreement between human annotators are possible with appropriate training and annotation tools. Furthermore, more training examples combined with improvements in neural networks and natural language processing should make machine annotators capable of high throughput automated clinical concept extraction with high levels of agreement with human annotators.
format Online
Article
Text
id pubmed-10294690
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-102946902023-06-28 Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records Oommen, Chelsea Howlett-Prieto, Quentin Carrithers, Michael D. Hier, Daniel B. Front Digit Health Digital Health The extraction of patient signs and symptoms recorded as free text in electronic health records is critical for precision medicine. Once extracted, signs and symptoms can be made computable by mapping to signs and symptoms in an ontology. Extracting signs and symptoms from free text is tedious and time-consuming. Prior studies have suggested that inter-rater agreement for clinical concept extraction is low. We have examined inter-rater agreement for annotating neurologic concepts in clinical notes from electronic health records. After training on the annotation process, the annotation tool, and the supporting neuro-ontology, three raters annotated 15 clinical notes in three rounds. Inter-rater agreement between the three annotators was high for text span and category label. A machine annotator based on a convolutional neural network had a high level of agreement with the human annotators but one that was lower than human inter-rater agreement. We conclude that high levels of agreement between human annotators are possible with appropriate training and annotation tools. Furthermore, more training examples combined with improvements in neural networks and natural language processing should make machine annotators capable of high throughput automated clinical concept extraction with high levels of agreement with human annotators. Frontiers Media S.A. 2023-06-13 /pmc/articles/PMC10294690/ /pubmed/37383943 http://dx.doi.org/10.3389/fdgth.2023.1075771 Text en © 2023 Oommen, Howlett-Prieto, Carrithers and Hier. https://creativecommons.org/licenses/by/4.0/This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY) (https://creativecommons.org/licenses/by/4.0/) . The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
spellingShingle Digital Health
Oommen, Chelsea
Howlett-Prieto, Quentin
Carrithers, Michael D.
Hier, Daniel B.
Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records
title Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records
title_full Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records
title_fullStr Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records
title_full_unstemmed Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records
title_short Inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records
title_sort inter-rater agreement for the annotation of neurologic signs and symptoms in electronic health records
topic Digital Health
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10294690/
https://www.ncbi.nlm.nih.gov/pubmed/37383943
http://dx.doi.org/10.3389/fdgth.2023.1075771
work_keys_str_mv AT oommenchelsea interrateragreementfortheannotationofneurologicsignsandsymptomsinelectronichealthrecords
AT howlettprietoquentin interrateragreementfortheannotationofneurologicsignsandsymptomsinelectronichealthrecords
AT carrithersmichaeld interrateragreementfortheannotationofneurologicsignsandsymptomsinelectronichealthrecords
AT hierdanielb interrateragreementfortheannotationofneurologicsignsandsymptomsinelectronichealthrecords