Research on Domain-Specific Knowledge Graph Based on the RoBERTa-wwm-ext Pretraining Model
This study aims to find an effective way to construct a domain-specific knowledge graph, moving from information to knowledge. We propose a deep learning algorithm to extract entities and relationships from open-source intelligence using the RoBERTa-wwm-ext pretraining model and a knowledge fu...
Main Authors: Liu, Xingli; Zhao, Wei; Ma, Haiqun
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9581622/ https://www.ncbi.nlm.nih.gov/pubmed/36275948 http://dx.doi.org/10.1155/2022/8656013
Similar Items
- Multi-Label Classification in Patient-Doctor Dialogues With the RoBERTa-WWM-ext + CNN (Robustly Optimized Bidirectional Encoder Representations From Transformers Pretraining Approach With Whole Word Masking Extended Combining a Convolutional Neural Network) Model: Named Entity Study
  by: Sun, Yuanyuan, et al.
  Published: (2022)
- A sui generis QA approach using RoBERTa for adverse drug event identification
  by: Jain, Harshit, et al.
  Published: (2021)
- RoBERTa-Assisted Outcome Prediction in Ovarian Cancer Cytoreductive Surgery Using Operative Notes
  by: Laios, Alexandros, et al.
  Published: (2023)
- First person – Roberta Besio
  Published: (2019)
- First person – Roberta Azzarelli
  Published: (2021)