
Real-time intraoperative glioma diagnosis using fluorescence imaging and deep convolutional neural networks

Bibliographic Details
Main Authors: Shen, Biluo, Zhang, Zhe, Shi, Xiaojing, Cao, Caiguang, Zhang, Zeyu, Hu, Zhenhua, Ji, Nan, Tian, Jie
Format: Online Article Text
Language: English
Published: Springer Berlin Heidelberg 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8440289/
https://www.ncbi.nlm.nih.gov/pubmed/33904984
http://dx.doi.org/10.1007/s00259-021-05326-y
Description
Summary:
PURPOSE: Surgery is the predominant treatment modality for human glioma, but clearly identifying tumor boundaries in the clinic remains difficult. Conventional practice involves the neurosurgeon's visual evaluation and intraoperative histological examination of dissected tissues using frozen sections, which is time-consuming and complex. The aim of this study was to develop a fluorescence imaging technique coupled with artificial intelligence to quickly and accurately identify glioma in real time during surgery.
METHODS: Glioma patients (N = 23) were enrolled and injected with indocyanine green for fluorescence image–guided surgery. Tissue samples (N = 1874) were harvested during surgery from these patients, and fluorescence images in the second near-infrared window (NIR-II, 1000–1700 nm) were obtained. Deep convolutional neural networks (CNNs) combined with NIR-II fluorescence imaging (termed FL-CNN) were explored to automatically provide a pathological diagnosis of glioma in situ in real time during surgery. The pathological examination results were used as the gold standard.
RESULTS: The developed FL-CNN achieved an area under the curve (AUC) of 0.945. Compared to the neurosurgeons' judgment, at the same specificity level of >80%, FL-CNN achieved a much higher sensitivity (93.8% versus 82.0%, P < 0.001) with zero time overhead. Further experiments demonstrated that FL-CNN corrected >70% of the errors made by neurosurgeons. FL-CNN was also able to rapidly predict the grade and Ki-67 level of tumor specimens intraoperatively (AUC 0.810 and 0.625, respectively).
CONCLUSION: Our study demonstrates that deep CNNs are better at capturing important information from fluorescence images than surgeons' visual evaluation during surgery. FL-CNN holds great promise for providing intraoperative pathological diagnosis and assisting neurosurgeons in achieving maximum safe resection.
TRIAL REGISTRATION: ChiCTR ChiCTR2000029402. Registered 29 January 2020, retrospectively registered.
SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s00259-021-05326-y.
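
As a rough illustration of the kind of analysis the abstract describes, the sketch below trains a small CNN to classify NIR-II fluorescence image patches as tumor versus non-tumor and then reports AUC and sensitivity at a specificity above 80%. The model choice (ResNet-18), the 224×224 single-channel input size, and the synthetic placeholder data are assumptions for illustration only, not the authors' published FL-CNN configuration.

```python
# Minimal sketch: binary tumor classification of NIR-II fluorescence patches,
# evaluated by AUC and sensitivity at specificity > 80% (mirroring the metrics
# in the abstract). Architecture and data are hypothetical stand-ins.
import numpy as np
import torch
import torch.nn as nn
from torchvision.models import resnet18
from sklearn.metrics import roc_auc_score, roc_curve

device = "cuda" if torch.cuda.is_available() else "cpu"

# Single-channel NIR-II image -> one logit (tumor vs. non-tumor).
model = resnet18(weights=None)
model.conv1 = nn.Conv2d(1, 64, kernel_size=7, stride=2, padding=3, bias=False)
model.fc = nn.Linear(model.fc.in_features, 1)
model = model.to(device)

criterion = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Placeholder tensors standing in for labeled fluorescence image patches.
images = torch.randn(64, 1, 224, 224)
labels = torch.randint(0, 2, (64,)).float()

model.train()
for epoch in range(2):  # illustrative; real training would run much longer
    for i in range(0, len(images), 16):
        x = images[i:i + 16].to(device)
        y = labels[i:i + 16].to(device)
        optimizer.zero_grad()
        loss = criterion(model(x).squeeze(1), y)
        loss.backward()
        optimizer.step()

# Evaluation: AUC plus sensitivity at the operating point where specificity > 0.8.
model.eval()
with torch.no_grad():
    scores = torch.sigmoid(model(images.to(device)).squeeze(1)).cpu().numpy()

y_true = labels.numpy()
auc = roc_auc_score(y_true, scores)
fpr, tpr, _ = roc_curve(y_true, scores)
mask = (1 - fpr) > 0.8                      # specificity = 1 - false positive rate
sensitivity = tpr[mask].max() if mask.any() else float("nan")
print(f"AUC={auc:.3f}, sensitivity at specificity>80%={sensitivity:.3f}")
```

In a real pipeline, the placeholder tensors would be replaced by the harvested tissue-sample images with pathology labels as ground truth, and the held-out evaluation would be done per patient rather than on the training patches shown here.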