To pretrain or not? A systematic analysis of the benefits of pretraining in diabetic retinopathy
There is an increasing number of medical use cases where classification algorithms based on deep neural networks reach performance levels that are competitive with human medical experts. To alleviate the challenges of small dataset sizes, these systems often rely on pretraining. In this work, we aim...
Main Authors: Srinivasan, Vignesh; Strodthoff, Nils; Ma, Jackie; Binder, Alexander; Müller, Klaus-Robert; Samek, Wojciech
Format: Online Article Text
Language: English
Published: Public Library of Science, 2022
Online Access:
  https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9578637/
  https://www.ncbi.nlm.nih.gov/pubmed/36256665
  http://dx.doi.org/10.1371/journal.pone.0274291
Similar Items

- Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models
  by: Alam, Minhaj Nur, et al.
  Published: (2023)

- An Improved Math Word Problem (MWP) Model Using Unified Pretrained Language Model (UniLM) for Pretraining
  by: Zhang, Dongqiu, et al.
  Published: (2022)

- Medical image captioning via generative pretrained transformers
  by: Selivanov, Alexander, et al.
  Published: (2023)

- Pretrained Transformer Language Models Versus Pretrained Word Embeddings for the Detection of Accurate Health Information on Arabic Social Media: Comparative Study
  by: Albalawi, Yahya, et al.
  Published: (2022)

- Critical Analysis of Deconfounded Pretraining to Improve Visio-Linguistic Models
  by: Cornille, Nathan, et al.
  Published: (2022)