
Former Trump Attorney Michael Cohen Used AI to Generate Fake Case Citations

Former Trump attorney Michael Cohen admitted that he submitted fake citations to the court.

The citations were presented as legal precedent in his motion to end his supervised release early.

Google’s AI, Bard, created the citations.

Cohen has been under court supervision since 2021, when he completed his three-year prison sentence for tax evasion and campaign finance violations.

In a December 28 email to Judge Jesse Furman, who questioned the validity of the citations, a representative for Cohen wrote, “To summarize: Mr. Cohen provided Mr. Schwartz with citations (and case summaries) he had found online and believed to be real. Mr. Schwartz added them to the motion but failed to check those citations or summaries. As a result, Mr. Schwartz mistakenly filed a motion with three citations that—unbeknownst to either Mr. Schwartz or Mr. Cohen at the time—referred to nonexistent cases.”

“Upon later appearing in the case and reviewing the previously-filed motion, I discovered the problem and, in Mr. Cohen’s reply letter supporting that motion, I alerted the Court to likely issues with Mr. Schwartz’s citations and provided (real) replacement citations supporting the very same proposition. ECF No. 95 at 3,” the email continued. “To be clear, Mr. Cohen did not know that the cases he identified were not real and, unlike his attorney, had no obligation to confirm as much. While there has been no implication to the contrary, it must be emphasized that Mr. Cohen did not engage in any misconduct.”

“The invalid citations at issue—and many others that Mr. Cohen found but were not used in the motion—were produced by Google Bard, which Mr. Cohen misunderstood to be a supercharged search engine, not a generative AI service like Chat-GPT,” the statement added.

The representative then appeared to shift blame to Cohen’s lawyer, David Schwartz, who initially failed to catch the fake citations.

“Mr. Cohen is not a practicing attorney and has no concept of the risks of using AI services for legal research—nor does he have an ethical obligation to verify the accuracy of his research,” the representative wrote. “Mr. Schwartz, conversely, did have an obligation to verify the legal representations being made in a motion he filed. Unfortunately, Mr. Schwartz did not fulfill that obligation—as he was quick to admit, to his credit.”

Schwartz said in a statement that he “failed to review what I thought was the research of another attorney.”

He noted that he “never contemplated that the cases were non-existent.”
