Volokh Conspiracy: Apparent AI Hallucinations in AI Misinformation Expert’s Court Filing Supporting Anti-AI-Misinformation Law

Eugene Volokh at the Volokh Conspiracy covers HLLI’s motion to exclude testimony from an “AI and misinformation” expert in Kohls v. Ellison. Ironically, the expert’s declaration cited fictional journal articles, apparently “hallucinated” by an AI model such as ChatGPT.

Minnesota recently enacted a law restricting misleading AI deepfakes intended to influence elections; the law is now being challenged on First Amendment grounds in Kohls v. Ellison. To support the law, the government defendants introduced an expert declaration written by a scholar of AI and misinformation who is the Faculty Director of the Stanford Internet Observatory.

But the plaintiffs’ memorandum in support of their motion to exclude the expert declaration alleges—apparently correctly—that a study cited in the declaration “does not exist”:

No article by the title exists. The publication exists, but the cited pages belong to unrelated articles. Likely, the study was a “hallucination” generated by an AI large language model like ChatGPT….

The “doi” url is supposed to be a “Digital Object Identifier,” which academics use to provide permanent links to studies. Such links normally redirect users to the current location of the publication, but a DOI Foundation error page appears for this link: “DOI NOT FOUND.” … The title of the alleged article, and even a snippet of it, does not appear anywhere on the internet as indexed by Google and Bing, the most commonly used search engines. Searching Google Scholar, a specialized search engine for academic papers and patent publications, reveals no articles matching the description of the citation authored by “Hwang” that includes the term “deepfake.” …

This sort of citation—with a plausible-sounding title, alleged publication in a real journal, and fictitious “doi”—is characteristic of an artificial intelligence “hallucination,” which academic researchers have warned their colleagues about. See Goddard, J, Hallucinations in ChatGPT: A Cautionary Tale for Biomedical Researchers (2023) ….

Volokh also identified a second citation in the expert declaration that does not exist.

Read more at Reason.com.