AxoNet 2.0: A Deep Learning-Based Tool for Morphometric Analysis of Retinal Ganglion Cell Axons

Bibliographic Details
Main Authors: Goyal, Vidisha, Read, A. Thomas, Ritch, Matthew D., Hannon, Bailey G., Rodriguez, Gabriela Sanchez, Brown, Dillon M., Feola, Andrew J., Hedberg-Buenz, Adam, Cull, Grant A., Reynaud, Juan, Garvin, Mona K., Anderson, Michael G., Burgoyne, Claude F., Ethier, C. Ross
Format: Online Article Text
Language: English
Published: The Association for Research in Vision and Ophthalmology, 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10020950/
https://www.ncbi.nlm.nih.gov/pubmed/36917117
http://dx.doi.org/10.1167/tvst.12.3.9
Description
Summary: PURPOSE: Assessment of glaucomatous damage in animal models is facilitated by rapid and accurate quantification of retinal ganglion cell (RGC) axonal loss and morphologic change. However, manual assessment is extremely time- and labor-intensive. Here, we developed AxoNet 2.0, an automated deep learning (DL) tool that (i) counts normal-appearing RGC axons and (ii) quantifies their morphometry from light micrographs.

METHODS: A DL algorithm was trained to segment the axoplasm and myelin sheath of normal-appearing axons using manually annotated rat optic nerve (ON) cross-sectional micrographs. Performance was quantified by various metrics (e.g., soft-Dice coefficient between predicted and ground-truth segmentations). We also quantified axon counts, axon density, and axon size distributions in hypertensive and control eyes and compared these results to literature reports.

RESULTS: AxoNet 2.0 performed very well when compared to manual annotations of rat ON (R² = 0.92 for automated versus manual counts, soft-Dice coefficient = 0.81 ± 0.02, mean absolute percentage error in axonal morphometric outcomes < 15%). AxoNet 2.0 also showed promise for generalization, performing well on other animal models (R² = 0.97 between automated and manual counts for mice and 0.98 for non-human primates). As expected, the algorithm detected decreased axon density in hypertensive rat eyes (P ≪ 0.001) with preferential loss of large axons (P < 0.001).

CONCLUSIONS: AxoNet 2.0 provides a fast and nonsubjective tool to quantify both RGC axon counts and morphological features, thus assisting with the assessment of axonal damage in animal models of glaucomatous optic neuropathy.

TRANSLATIONAL RELEVANCE: This deep learning approach will increase the rigor of basic science studies designed to investigate RGC axon protection and regeneration.
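
The abstract reports agreement with manual annotations using a soft-Dice coefficient and a mean absolute percentage error (MAPE). The exact formulations used by AxoNet 2.0 are not given in this record; the sketch below is a minimal Python illustration of the standard definitions of these two metrics, with hypothetical function names (soft_dice, mape) and placeholder arrays standing in for a predicted segmentation and a manual mask.

    import numpy as np

    def soft_dice(pred, truth, eps=1e-7):
        # Soft-Dice coefficient between a predicted probability map and a
        # binary ground-truth mask of the same shape (standard definition).
        pred = pred.astype(np.float64).ravel()
        truth = truth.astype(np.float64).ravel()
        intersection = np.sum(pred * truth)
        return (2.0 * intersection + eps) / (np.sum(pred) + np.sum(truth) + eps)

    def mape(measured, reference):
        # Mean absolute percentage error between automated and manual
        # morphometric outcomes (e.g., per-axon areas), in percent.
        measured = np.asarray(measured, dtype=np.float64)
        reference = np.asarray(reference, dtype=np.float64)
        return 100.0 * np.mean(np.abs(measured - reference) / reference)

    # Placeholder example: a random "network output" versus a random "manual mask".
    pred = np.random.rand(64, 64)            # predicted axoplasm probabilities in [0, 1]
    truth = np.random.rand(64, 64) > 0.5     # binary manual annotation
    print(f"soft-Dice = {soft_dice(pred, truth):.3f}")
    print(f"MAPE = {mape([10.2, 8.9], [10.0, 9.3]):.1f}%")

This is only a sketch under the usual definitions of these metrics, not the authors' implementation; the published values (soft-Dice = 0.81 ± 0.02, MAPE < 15%) come from comparing AxoNet 2.0 outputs against manually annotated optic nerve micrographs.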