
What does Chinese BERT learn about syntactic knowledge?

Pre-trained language models such as Bidirectional Encoder Representations from Transformers (BERT) have been applied to a wide range of natural language processing (NLP) tasks and have achieved significantly positive results. A growing body of research has investigated why BERT is so effective...


Bibliographic Details
Main Authors: Zheng, Jianyu; Liu, Ying
Format: Online Article Text
Language: English
Published: PeerJ Inc., 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10403162/
https://www.ncbi.nlm.nih.gov/pubmed/37547407
http://dx.doi.org/10.7717/peerj-cs.1478