Golden Gavel

One of Wall Street’s oldest and most prestigious law firms found itself in an uncomfortable spotlight, not for a high-stakes deal or landmark case, but for submitting a court filing riddled with AI-generated errors, including fabricated case citations. Sullivan & Cromwell (S&C), a firm known for representing major corporations and high-profile clients (including President Trump in certain appeals) and for handling complex restructurings, had to issue a formal apology to a federal bankruptcy judge after opposing counsel caught the mistakes.

The incident is a stark reminder: no matter how respected or well-resourced a firm is, artificial intelligence is not a substitute for human diligence. When lawyers cut corners or fail to follow basic training protocols, AI’s notorious “hallucinations” can lead to professional embarrassment, potential sanctions, and wasted court time.

What Happened?

The errors surfaced in an emergency motion filed on or around April 8–9, 2026, in a Chapter 15 bankruptcy case in the U.S. Bankruptcy Court for the Southern District of New York (Manhattan).

On April 18, partner Andrew Dietderich (co-head of the firm’s global restructuring group) sent a letter to Chief Judge Martin Glenn. In it, the firm admitted the motion contained “inaccurate citations and other errors,” many of which were AI “hallucinations” — instances where the tool fabricated case citations, misquoted legal authorities, or invented non-existent sources.

The mistakes spanned roughly three pages and totaled around three dozen errors. Some were clerical and unrelated to AI, but many involved imaginary passages from real (or entirely made-up) cases. Opposing lawyers from Boies Schiller Flexner spotted the issues in a public filing and brought them to the court’s attention.

S&C quickly reviewed all other filings in the case, confirmed the problems were isolated to this one document, filed a corrected version, and expressed deep regret. “We deeply regret that this has occurred,” Dietderich wrote. Yeah . . . I bet they do!

Why This Matters: It’s Not Just a “Small Firm” Problem Anymore

This isn’t the first time lawyers have been burned by AI. Back in 2023, two attorneys were fined $5,000 after submitting a brief full of ChatGPT-invented cases in a New York federal court. Since then, a steady stream of sanctions, fines (sometimes reaching tens of thousands of dollars), and disciplinary actions have hit lawyers across the country for similar AI slop in filings.

What makes the Sullivan & Cromwell case notable is the firm’s stature. If a powerhouse like S&C, with its presumed rigorous standards, deep resources, and high-caliber talent, can fall into this trap, it underscores a broader truth: AI hallucinations are an industry-wide risk, not limited to solo practitioners or overwhelmed small firms.

The legal profession is in the middle of a reckoning. Generative AI promises to speed up research, drafting, and document review in an industry drowning in voluminous materials. But it also has a well-documented tendency to confidently invent facts, quotes, and citations when it doesn’t “know” the answer.

The Real Culprit: Laziness and Failure to Follow Training

At this time, no reporting has identified the specific AI tool used by S&C in this incident. We don’t know whether the tool was a general-purpose model (e.g., ChatGPT, Claude, Gemini) or a legal-specific platform (e.g., CoCounsel, Harvey, or an internal/custom tool).

That would be interesting information, but that’s not the real issue here.

Here’s the most telling detail from Dietderich’s letter: Sullivan & Cromwell requires its lawyers to complete training before accessing the firm’s AI tools. One of the core instructions in that training? “Trust nothing and verify everything.”

Yet, according to the firm, those policies “were not followed” in preparing this motion. This points directly to human factors:

  • Over-reliance on AI — Treating the tool as a magic research assistant instead of a first draft that demands rigorous checking.
  • Laziness or time pressure — Skipping the verification step because “AI got most of it right” or deadlines loomed.
  • Poor supervision — Failing to have senior lawyers or dedicated reviewers double-check AI-assisted work with the same scrutiny applied to traditional research.

The American Bar Association (ABA) has been clear on this. In its Formal Opinion 512 (issued in 2024), the ABA stresses that lawyers must maintain technological competence under Model Rule 1.1. This includes understanding the benefits and limitations of AI tools, especially their propensity for hallucinations. Lawyers remain fully responsible for the accuracy of everything they file — AI output doesn’t get a pass. Supervisory responsibilities (Rules 5.1 and 5.3) also apply: partners must ensure anyone using AI (lawyers or support staff) does so ethically and accurately.

Lessons for Lawyers, Firms, and Anyone Using AI

The Sullivan & Cromwell incident offers several clear takeaways:

  • Always verify — “Trust but verify” isn’t enough. For legal citations, treat AI like an unreliable intern: assume it might be wrong until you’ve independently confirmed every reference in primary sources.
  • Training alone isn’t enough — Firms must enforce policies with real accountability, audits, or second-review processes for AI-generated content.
  • AI is a tool, not a lawyer — It can summarize, brainstorm, or draft, but it cannot replace professional judgment, ethical duties, or candor to the tribunal.
  • The cost of cutting corners is rising — Courts are increasingly impatient with AI slop. Sanctions, fines, reputational damage, and even bar discipline are becoming more common.

This case also highlights a growing “vigilante” dynamic: opposing counsel are actively scanning filings for AI red flags and calling them out publicly.

Final Thoughts

AI is an incredible tool. Any experienced lawyer who has used AI to assist in drafting and editing contracts can see its immense power. But contracts generally don’t have case citations, and for the most part they aren’t filed for judicial review.

Briefs and motions filed in court, on the other hand, present real risks for lawyers. Hallucinations happen, particularly with respect to case citations and quotes. And this is where you’ll get into trouble if you’re lazy.

So do the work, and don’t let this happen to you.