Use AI to support your work—not to replace your professional judgment

Posted on February 3, 2026

As artificial intelligence tools such as ChatGPT, Harvey, Copilot, and Claude become more accessible, many attorneys have begun exploring how best to use them in their practices. While such tools can streamline workflows, recent examples make clear that AI is not a substitute for professional judgment and that its misuse carries real consequences.

The response from the Bench has been clear: the duty of candor, Rule 11 obligations, and responsibility for accuracy do not change simply because AI tools are used. A magistrate judge in the Eastern District of Oklahoma recently updated his AI guidelines to underscore that technology is not the issue; “truth” is. Magistrate Judge Jason A. Robertson emphasizes that filings must comply with Rule 11 and requires “human verification” of all cited authority. See https://www.oked.uscourts.gov/sites/oked/files/AI%20Guidelines%20JAR%201.06.26.pdf.

As attorneys, we must stay vigilant:

  • Verify every citation and quotation manually, especially if AI assisted in locating them. AI assistance does not lessen an attorney’s responsibility to ensure accuracy and truthfulness.
  • Remember that responsibility is non-delegable. Courts evaluate the product of your signature, not the tools behind it. Be sure to check the Court’s (and the specific Judge’s) rules on candor and AI use.
  • Implement internal safeguards, such as assigning someone to double-check all writing and citations and/or requiring AI training and certification for all attorneys and staff.

Generative AI is here to stay, and its benefits are undeniable. But these tools are only as reliable as the scrutiny applied to their output. Used wisely, AI can support our craft. Used carelessly, it can threaten our credibility, our clients’ interests, and the integrity of the judicial process.