Fabrication and errors in the bibliographic citations generated by ChatGPT
Although chatbots such as ChatGPT can facilitate cost-effective text generation and editing, factually incorrect responses (hallucinations) limit their utility. This study evaluates one particular type of hallucination: fabricated bibliographic citations that do not represent actual scholarly works....
Main Authors: Walters, William H.; Wilder, Esther Isabelle
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10484980/
https://www.ncbi.nlm.nih.gov/pubmed/37679503
http://dx.doi.org/10.1038/s41598-023-41032-5
Similar Items
- To ChatGPT or not to ChatGPT: the use of artificial intelligence in writing scientific papers
  by: Marescotti, Manuela
  Published: (2023)
- ChatGPT: these are not hallucinations – they’re fabrications and falsifications
  by: Emsley, Robin
  Published: (2023)
- The potential role of ChatGPT and artificial intelligence in anatomy education: a conversation with ChatGPT
  by: Totlis, Trifon, et al.
  Published: (2023)
- ChatGPT in Clinical Toxicology
  by: Sabry Abdel-Messih, Mary, et al.
  Published: (2023)
- Authorship Policy and ChatGPT
  by: Kleebayoon, Amnuay, et al.
  Published: (2023)