Law & Crime reported on our case, Kohls v. Ellison, in which the State of Minnesota’s expert submitted a declaration citing AI-generated studies that do not exist.
That expert, Prof. Jeff Hancock, filed a declaration supporting the Minnesota statute that cited numerous academic works, but according to a motion filed by Kohls and Franson, at least one of the cited studies was likely a “hallucination” generated by an AI model such as ChatGPT.
According to the filing, “the Declaration of Prof. Jeff Hancock cites a study that does not exist. No article by the title exists. The publication exists, but the cited pages belong to unrelated articles. Likely, the study was a ‘hallucination’ generated by an AI large language model like ChatGPT.”
Read more at Law & Crime.