
Heuristic Attention Representation Learning for Self-Supervised Pretraining

Recently, self-supervised learning methods have been shown to be very powerful and efficient for yielding robust representation learning by maximizing the similarity across different augmented views in embedding vector space. However, the main challenge is generating different views with random crop...


Bibliographic Details

Main Authors: Tran, Van Nhiem; Liu, Shen-Hsuan; Li, Yung-Hui; Wang, Jia-Ching
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9320898/
https://www.ncbi.nlm.nih.gov/pubmed/35890847
http://dx.doi.org/10.3390/s22145169