EOIR Clarifies Rules on Generative AI Use in Immigration Courts

The Executive Office for Immigration Review (EOIR) issued Policy Memorandum PM-25-40 (effective August 8, 2025), which provides guidance on parties’ use of generative artificial intelligence (AI) in immigration court and Board of Immigration Appeals (BIA) proceedings. EOIR did not adopt a blanket ban, nor did it impose a universal disclosure requirement. However, the memo warns attorneys that improper use of generative AI (for example, submitting hallucinated citations or fabricated facts) can implicate professional-conduct rules and lead to discipline.

What EOIR’s memorandum says 

First, EOIR states that parties may use generative AI tools in preparing filings and other materials; however, they remain responsible for the accuracy and reliability of any content submitted to the court. Next, EOIR expressly notes that it is not imposing a mandatory, nationwide disclosure requirement about whether a filing or brief was prepared using generative AI. Nevertheless, individual adjudicators (immigration judges or the BIA) retain authority to adopt local or case-specific standing orders that require disclosure or limit AI use.


By Brian D. Lerner — Plain-English summary of EOIR Policy Memorandum PM-25-40 on generative AI, what attorneys must know about disclosure and ethical duties, and practical steps for counsel and judges.

Key practical warnings for attorneys

EOIR emphasizes professional-responsibility risks: attorneys who submit AI-generated material that contains fabrications (“hallucinations”), fake citations, or incorrectly summarized evidence may violate rules of professional conduct and face referral to EOIR’s Attorney Discipline Program or Anti-Fraud Program. Therefore, the memo cautions counsel to verify all AI outputs, confirm citations and authorities exist, and not to rely on generative tools as a substitute for independent legal research and fact-checking. 

What judges and courts can do

Although EOIR did not impose a single mandatory rule, it expressly left room for immigration judges and the BIA to adopt standing orders or local rules imposing disclosure or other limitations on AI use in their courts. In practice, expect some immigration courts to issue standing orders requiring parties to disclose whether a pleading or exhibit was prepared with AI, while others may take a more permissive approach until further rulemaking. EOIR also flagged the possibility of future rulemaking and anticipated Department-wide guidance before the end of 2025. 

Why this matters — risks & opportunities

On the plus side, generative AI can help streamline drafting, summarize voluminous records, or translate non-English materials provisionally — potentially saving time. However, the technology also poses real risks: fabricated case citations, misstatements of law, or invented factual details can mislead judges, harm clients, and trigger professional discipline or motions to strike. Consequently, attorneys must adopt strict verification workflows before submitting any AI-assisted work.

Practical checklist — what lawyers should do now

  1. Verify everything: Confirm that every case citation, quotation, and statutory quote from an AI draft actually exists and is accurate.
  2. Document human review: Keep contemporaneous notes showing the attorney reviewed and verified AI outputs (time-stamped drafts, research logs, or an attestation file).
  3. Avoid overreliance: Use generative AI for drafting or brainstorming only after independent legal research confirms the substance.
  4. Expect local orders: Check the immigration court’s standing orders before filing; be ready to comply with any AI-disclosure requirements. 
  5. Secure sensitive data: Do not upload confidential client information or court-sealed records to consumer AI services — protect privilege and PII. 
  6. If you find AI errors in opposing filings: Consider a motion to strike, a request for clarification, or, if the error is material and intentional, a referral to the Attorney Discipline Program. 

Suggested short attestation 

Some courts may accept a short attestation appended to filings to document human verification. Consider this template (adapt to local rules):
"Counsel for [Party] certifies that, to the best of counsel's knowledge after reasonable inquiry, all citations, quotations, and factual representations in this filing have been independently verified and are accurate. Portions of this filing were prepared with assistance from a generative AI drafting tool; counsel reviewed and verified the content prior to submission."

If a judge orders disclosure — how to respond

If an immigration judge issues a standing order requiring disclosure of AI use, comply promptly and include a short explanation of the tool used and the steps taken to verify content. Where disclosure would reveal privileged strategy or confidential material, confer with the court on a protective procedure (in-camera submission or redaction) rather than ignoring the order. 

Technology & court operations — EOIR’s own use

EOIR’s memorandum also contemplates agency use of AI and flags cybersecurity concerns for official systems: EOIR warns federal employees not to use generative AI on government equipment in ways that may expose sensitive data. The agency indicated it may adopt internal AI controls while concurrently seeking broader rulemaking on public use. 

Copy-pasteable preservation checklist

  • Retain original AI drafts and the human-review notes showing how errors (if any) were corrected.
  • If opposing counsel used AI and you suspect hallucinations, preserve the filing and any associated metadata and move to strike or request judicial notice as appropriate.
  • Log any communications with the court about AI disclosure and keep copies of standing orders for the client file.

FAQs — short answers you can copy to clients

Does EOIR ban AI in immigration court?

No — EOIR did not enact a blanket ban. It allows AI use but warns counsel to verify outputs and notes judges may set local rules. 

Must I disclose that I used AI to draft a brief?

Not under a nationwide EOIR rule; however, individual judges may require disclosure by standing order, so check court rules and be prepared to disclose if ordered. 

What if AI makes up a case or a quote?

That is called a “hallucination.” If submitted, it can lead to sanctions or discipline — correct the record immediately and consider appropriate motions or referrals. 
