I am just leaving the Association of Health Care Journalists meeting in LA, and I have a feeling that all those good editors would not approve of using AI to produce made-up facts and citations. The Washington Post reports this AM:

Some of the citations that underpin the science in the White House’s sweeping “MAHA Report” appear to have been generated using artificial intelligence, resulting in numerous garbled scientific references and invented studies, AI experts said Thursday.

Of the 522 footnotes to scientific research in an initial version of the report sent to The Washington Post, at least 37 appear multiple times, according to a review of the report by The Post. Other citations include the wrong author, and several studies cited by the extensive health report do not exist at all, a fact first reported by the online news outlet NOTUS on Thursday morning.

Some references include “oaicite” attached to URLs — a definitive sign that the research was collected using artificial intelligence. The presence of “oaicite” is a marker indicating use of OpenAI, a U.S. artificial intelligence company.
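For the curious, here is a minimal sketch, in Python, of how an editor might screen a reference list for the two signals described above: the “oaicite” marker and citations repeated verbatim. The function name and the sample entries are my own invention, purely for illustration; this is not how The Post or NOTUS conducted their reviews.

```python
from collections import Counter

def flag_suspect_citations(references):
    """Flag references containing the 'oaicite' marker or repeated verbatim."""
    # References whose text or URL contains the "oaicite" marker.
    oaicite_hits = [ref for ref in references if "oaicite" in ref]

    # References that appear more than once, word for word.
    counts = Counter(ref.strip() for ref in references)
    duplicates = {ref: n for ref, n in counts.items() if n > 1}

    return oaicite_hits, duplicates

if __name__ == "__main__":
    # Hypothetical sample entries, invented for this example.
    sample = [
        "Smith J. Vaccine safety review. https://example.org/study#oaicite:1",
        "Jones A. Nutrition and chronic disease. J Health. 2021.",
        "Jones A. Nutrition and chronic disease. J Health. 2021.",
    ]
    hits, dupes = flag_suspect_citations(sample)
    print("oaicite markers:", hits)
    print("repeated citations:", dupes)
```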

A common hallmark of AI chatbots, such as ChatGPT, is unusually repetitive content that does not sound human or is inaccurate — as well as the tendency to “hallucinate” studies or answers that appear to make sense but are not real.

AI technology can be used legitimately to quickly survey the research in a field. But Oren Etzioni, a professor emeritus at the University of Washington who studies AI, said he was shocked by the sloppiness in the MAHA Report.

“Frankly, that’s shoddy work,” he said. “We deserve better.”