Research Preprint: “AI Hallucinations: A Misnomer Worth Clarifying”
The article (preprint) linked below was recently shared on arXiv.
Title
AI Hallucinations: A Misnomer Worth Clarifying
Authors
Negar Maleki
University of South Florida
Balaji Padmanabhan
University of Maryland
Kaushik Dutta
University of South Florida
Source
via arXiv
DOI: 10.48550/arXiv.2401.06796
Abstract
As large language models continue to advance in Artificial Intelligence (AI), text generation systems have been shown to suffer from a problematic phenomenon often termed “hallucination.” However, with AI’s increasing presence across various domains, including medicine, concerns have arisen regarding the use of the term itself. In this study, we conducted a systematic review to identify papers defining “AI hallucination” across fourteen databases. We present and analyze definitions obtained across all databases, categorize them based on their applications, and extract key points within each category. Our results highlight a lack of consistency in how the term is used, but also help identify several alternative terms in the literature. We discuss implications of these and call for a more unified effort to bring consistency to an important contemporary AI issue that can affect multiple domains significantly.
Direct to Full Text Article
33 pages; PDF.
Filed under: Journal Articles, News
About Gary Price
Gary Price (gprice@gmail.com) is a librarian, writer, consultant, and frequent conference speaker based in the Washington, D.C. metro area. He earned his MLIS degree from Wayne State University in Detroit. Price has won several awards, including the SLA Innovations in Technology Award and Alumnus of the Year from the Wayne State University Library and Information Science Program. From 2006 to 2009 he was Director of Online Information Services at Ask.com.