Repeatability and reproducibility of various 4D Flow MRI postprocessing software programs in a multi-software and multi-vendor cross-over comparison study


Bibliographic Details
Main Authors: Oechtering, Thekla H., Nowak, André, Sieren, Malte M., Stroth, Andreas M., Kirschke, Nicolas, Wegner, Franz, Balks, Maren, König, Inke R., Jin, Ning, Graessner, Joachim, Kooijman-Kurfuerst, Hendrik, Hennemuth, Anja, Barkhausen, Jörg, Frydrychowicz, Alex
Format: Online Article Text
Language: English
Published: BioMed Central 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10052852/
https://www.ncbi.nlm.nih.gov/pubmed/36978131
http://dx.doi.org/10.1186/s12968-023-00921-4
_version_ 1785015253521661952
author Oechtering, Thekla H.
Nowak, André
Sieren, Malte M.
Stroth, Andreas M.
Kirschke, Nicolas
Wegner, Franz
Balks, Maren
König, Inke R.
Jin, Ning
Graessner, Joachim
Kooijman-Kurfuerst, Hendrik
Hennemuth, Anja
Barkhausen, Jörg
Frydrychowicz, Alex
author_sort Oechtering, Thekla H.
collection PubMed
description BACKGROUND: Different software programs are available for the evaluation of 4D Flow cardiovascular magnetic resonance (CMR). A good agreement of the results between programs is a prerequisite for the acceptance of the method. Therefore, the goal was to compare quantitative results from a cross-over comparison in individuals examined on two scanners of different vendors analyzed with four postprocessing software packages. METHODS: Eight healthy subjects (27 ± 3 years, 3 women) were each examined on two 3T CMR systems (Ingenia, Philips Healthcare; MAGNETOM Skyra, Siemens Healthineers) with a standardized 4D Flow CMR sequence. Six manually placed aortic contours were evaluated with Caas (Pie Medical Imaging, SW-A), cvi42 (Circle Cardiovascular Imaging, SW-B), GTFlow (GyroTools, SW-C), and MevisFlow (Fraunhofer Institute MEVIS, SW-D) to analyze seven clinically used parameters including stroke volume, peak flow, peak velocity, and area as well as typically scientifically used wall shear stress values. Statistical analysis of inter- and intrareader variability, inter-software and inter-scanner comparison included calculation of absolute and relative error (E(R)), intraclass correlation coefficient (ICC), Bland–Altman analysis, and equivalence testing based on the assumption that inter-software differences needed to be within 80% of the range of intrareader differences. RESULTS: SW-A and SW-C were the only software programs showing agreement for stroke volume (ICC = 0.96; E(R) = 3 ± 8%), peak flow (ICC: 0.97; E(R) = −1 ± 7%), and area (ICC = 0.81; E(R) = 2 ± 22%). Results from SW-A/D and SW-C/D were equivalent only for area and peak flow. Other software pairs did not yield equivalent results for routinely used clinical parameters. Especially peak maximum velocity yielded poor agreement (ICC ≤ 0.4) between all software packages except SW-A/D that showed good agreement (ICC = 0.80). 
Inter- and intrareader consistency for clinically used parameters was best for SW-A and SW-D (ICC = 0.56–0.97) and worst for SW-B (ICC = -0.01–0.71). Of note, inter-scanner differences per individual tended to be smaller than inter-software differences. CONCLUSIONS: Of all tested software programs, only SW-A and SW-C can be used equivalently for the determination of stroke volume, peak flow, and vessel area. Irrespective of the applied software and scanner, the high intra- and interreader variability for all parameters has to be taken into account before introducing 4D Flow CMR into clinical routine. Especially in multicenter clinical trials, a single image evaluation software should be applied.
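The abstract quantifies inter-software agreement with relative error E(R), intraclass correlation, and Bland–Altman analysis. As a minimal sketch of the two simplest of these metrics (this is not the study's actual analysis pipeline, and the stroke-volume values below are invented for illustration):

```python
import numpy as np

def relative_error(reference, measured):
    """Pairwise relative error in percent, (measured - reference) / reference * 100.
    Returns its mean and sample standard deviation, matching the E(R) = mean ± SD
    style reported in the abstract."""
    er = (measured - reference) / reference * 100.0
    return er.mean(), er.std(ddof=1)

def bland_altman(reference, measured):
    """Bland-Altman statistics for paired measurements: mean difference (bias)
    and the 95% limits of agreement, bias +/- 1.96 * SD of the differences."""
    diff = measured - reference
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical stroke-volume readings (mL) for 8 subjects from two software packages
sw_a = np.array([85.0, 92.0, 78.0, 101.0, 88.0, 95.0, 82.0, 90.0])
sw_c = np.array([87.0, 90.0, 80.0, 104.0, 86.0, 97.0, 83.0, 92.0])

er_mean, er_sd = relative_error(sw_a, sw_c)
bias, loa_low, loa_high = bland_altman(sw_a, sw_c)
```

A small bias with narrow limits of agreement (relative to the clinically acceptable difference) is what supports statements like the SW-A/SW-C equivalence reported above; the ICC computation additionally requires a variance-components model and is typically taken from a statistics package rather than written by hand.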
format Online
Article
Text
id pubmed-10052852
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-100528522023-03-30 Repeatability and reproducibility of various 4D Flow MRI postprocessing software programs in a multi-software and multi-vendor cross-over comparison study J Cardiovasc Magn Reson Research BioMed Central 2023-03-28 /pmc/articles/PMC10052852/ /pubmed/36978131 http://dx.doi.org/10.1186/s12968-023-00921-4 Text en © The Author(s) 2023. This article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title Repeatability and reproducibility of various 4D Flow MRI postprocessing software programs in a multi-software and multi-vendor cross-over comparison study
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10052852/
https://www.ncbi.nlm.nih.gov/pubmed/36978131
http://dx.doi.org/10.1186/s12968-023-00921-4