Heuristic Attention Representation Learning for Self-Supervised Pretraining
Recently, self-supervised learning methods have proven powerful and efficient at yielding robust representations by maximizing the similarity across different augmented views in embedding vector space. However, the main challenge is generating different views with random crop...
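The objective mentioned in the abstract, maximizing the similarity between embeddings of two augmented views of the same image, can be illustrated with a minimal sketch, assuming a generic negative-cosine-similarity loss in the style of BYOL/SimSiam; it is not the paper's exact method, and the function name `view_similarity_loss` and the toy 128-dimensional embeddings are illustrative assumptions.

```python
# Minimal sketch of a view-similarity objective (assumed, not the paper's exact loss):
# embeddings of two augmented views of the same image are pulled together by
# maximizing their cosine similarity, i.e. minimizing its negative.
import torch
import torch.nn.functional as F

def view_similarity_loss(z1: torch.Tensor, z2: torch.Tensor) -> torch.Tensor:
    """Negative mean cosine similarity between two batches of view embeddings."""
    z1 = F.normalize(z1, dim=-1)  # unit-length embedding of view 1
    z2 = F.normalize(z2, dim=-1)  # unit-length embedding of view 2
    return -(z1 * z2).sum(dim=-1).mean()

# Toy usage: two random "views" embedded into a 128-d space.
if __name__ == "__main__":
    z1, z2 = torch.randn(4, 128), torch.randn(4, 128)
    print(view_similarity_loss(z1, z2))  # scalar loss; lower = views more similar
```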
Main Authors: Tran, Van Nhiem; Liu, Shen-Hsuan; Li, Yung-Hui; Wang, Jia-Ching
Format: Online Article Text
Language: English
Published: MDPI, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9320898/
https://www.ncbi.nlm.nih.gov/pubmed/35890847
http://dx.doi.org/10.3390/s22145169
Similar Items
- Self-Supervised Learning-Based General Laboratory Progress Pretrained Model for Cardiovascular Event Detection
  Published: (2023)
- Self-supervised pretraining improves the performance of classification of task functional magnetic resonance imaging
  by: Shi, Chenwei, et al.
  Published: (2023)
- MAE-Based Self-Supervised Pretraining Algorithm for Heart Rate Estimation of Radar Signals
  by: Xiang, Yashan, et al.
  Published: (2023)
- Self-Supervised Learning Framework toward State-of-the-Art Iris Image Segmentation
  by: Putri, Wenny Ramadha, et al.
  Published: (2022)
- Contrastive learning-based pretraining improves representation and transferability of diabetic retinopathy classification models
  by: Alam, Minhaj Nur, et al.
  Published: (2023)