TwinLadder Weekly

Issue #22 | December 2025


ABA Task Force Report: AI Is Now Infrastructure

Official verdict: AI adoption is no longer optional. What the 56-page assessment means for your 2026 planning.


On December 15, 2025, the American Bar Association's Task Force on Law and Artificial Intelligence released its final report. After two years of study, the profession's leading voice has delivered its assessment: we've reached a "pivotal moment."

This issue: what the report actually says, and what it means for practitioners.

The Core Finding

The report's title tells the story: "Addressing the Legal Challenges of AI: Year 2 Report on the Impact of AI on the Practice of Law."

Former ABA President William R. Bay's introduction sets the tone: "AI is no longer an abstract concept. AI has become key to reshaping the way we practice, serve our clients, and safeguard the rule of law."

That's not aspirational language. It's an assessment of current reality.

The "Pivotal Moment" Assessment

The Task Force identifies where we stand: AI adoption has surpassed understanding.

LawSites' analysis captured the central finding: "The majority of legal professionals now use AI but do not fully appreciate the practical and ethical challenges that arise when using AI."

This is the profession's core problem for 2026: widespread adoption without commensurate competence.

What the Report Covers

The 56-page assessment examines:

  1. The Rule of Law - How AI affects fundamental legal principles
  2. The Courts - Judicial AI use and procedural implications
  3. Legal Education - How law schools are adapting
  4. Access to Justice - AI's potential to expand legal help
  5. Risks and Challenges - Including hallucination and accountability
  6. Bar Ethics Rules - Guidance landscape across jurisdictions

What the Report Concludes

The profession has reached a crossroads. The Task Force warns against becoming "so focused on short-term implementation challenges that it neglects the longer-term implications of increasingly powerful AI systems."

Several contributors warn that sudden advances toward human-level or super-human AI capabilities could leave legal institutions unprepared.

The Five Key Takeaways

1. AI Is Infrastructure, Not Experiment

The framing shift matters. When AI was "experimental," adoption was optional. When AI is "infrastructure," non-adoption becomes a competency issue.

The report doesn't mandate AI use. But it establishes that understanding AI is now part of professional competence—whether you use it or not.

2. Access to Justice Shows Real Progress

The report's most optimistic section focuses on access to justice. Key findings:

  • 100+ documented AI use cases in legal aid settings
  • 74% of legal aid organizations now use AI (nearly double the profession average)
  • Generative AI demonstrates "real potential to expand access to legal help"

Self-represented litigants are receiving AI-assisted guidance. Legal aid productivity is improving. This isn't theoretical—it's measurable.

3. Legal Education Is Adapting

Law schools responded faster than many expected:

Metric                                 Finding
Schools offering AI courses            55%
Schools with hands-on AI experiences   83%
Schools requiring AI certification     Case Western (1L requirement)

The next generation of lawyers will enter practice with AI literacy as baseline competency. Current practitioners face a gap.

4. The Risks Are Real and Growing

The report doesn't minimize challenges:

  • Hallucination rates remain concerning
  • Accountability frameworks are still developing
  • Ethical guidance varies by jurisdiction
  • Competence requirements are evolving

The profession is using tools it doesn't fully understand. The report calls for closing this gap urgently.

5. Responsibility Transfers to the ABA Center for Innovation

This is the Task Force's final report. Ongoing work transfers to the ABA Center for Innovation, which will:

  • Carry forward findings and recommendations
  • Develop implementation guidance
  • Track evolving landscape
  • Support jurisdictional adaptation

The assessment is complete. Implementation begins.

What This Means for Mid-Market Practitioners

The Competence Imperative

If the ABA says AI is infrastructure, bar associations will follow. Expect:

  • CLE requirements expanding (New York already mandates AI credits)
  • Ethics opinions proliferating (30+ states have issued guidance)
  • Malpractice implications for AI-related failures

You don't have to use AI. You do have to understand it well enough to make informed decisions about when to use it and when not to.

The Governance Necessity

80% of Am Law 100 firms have AI governance boards. The report's findings will pressure mid-market firms to formalize their approaches.

If you don't have an AI policy, you need one. Not because it's trendy—because it's becoming the standard of care.

The Training Priority

The report's finding that "adoption has surpassed understanding" describes many mid-market firms precisely. Tools were adopted without systematic training. The gap must close.

Budget for AI training in 2026. It's not optional professional development—it's competence maintenance.

What the Report Doesn't Solve

The Pricing Problem

The report notes AI's potential but doesn't address affordability. Harvey at $1,200/seat/month. Enterprise tools requiring significant investment. The mid-market gap remains.

The Verification Burden

Hallucination risks are acknowledged. The solution—human verification—is labor-intensive. No breakthrough in autonomous reliable legal AI is projected.

The Regulatory Fragmentation

30+ state bar opinions means 30+ different frameworks. The ABA provides guidance; it can't mandate uniformity. Multi-jurisdictional practitioners face compliance complexity.


Tool Review: Report-Referenced AI Applications

The tools and applications discussed in the ABA Task Force Report

Legal Research AI

Report Assessment: Case law citation remains highest-risk area for hallucination. Commercial legal research tools outperform general-purpose AI but still require verification.

Current Leaders:

  • Harvey (highest VLAIR benchmark scores)
  • CoCounsel (Thomson Reuters ecosystem)
  • Lexis+ AI (improving rapidly)

Report Recommendation: Every AI-assisted research output requires human verification of citations before filing. No exceptions.


Legal Aid Applications

Report Assessment: Perhaps the most optimistic area. 100+ documented use cases. 74% adoption rate among legal aid organizations.

Key Applications:

  • Intake automation
  • Document generation for common matters
  • Legal information delivery to self-represented litigants
  • Translation and accessibility tools

Report Finding: AI is beginning to demonstrate "real potential to expand access to legal help."


Judicial AI

Report Assessment: Courts are grappling with AI use by litigants and exploring AI for judicial functions.

Current Status:

  • ABA Working Group published guidelines for judicial AI use
  • Courts developing disclosure requirements
  • Sanctions for AI misuse increasing

Report Caution: Judicial AI use requires particular care given due process implications.


The Honest Assessment

The report is descriptive more than prescriptive. It documents where the profession stands without mandating specific tools or approaches.

For practitioners, the takeaway is clear: understand the landscape, develop governance, build competence. Specific tool choices remain your decision.


What's Working: Report Success Stories

Success Story: Legal Aid Transformation

From the Report: Over 100 documented AI use cases in legal aid settings demonstrate tangible progress since the first-year assessment.

What's Working:

  • Intake automation reducing administrative burden
  • Document generation for routine matters
  • Information delivery to self-represented litigants
  • Productivity improvements allowing more clients served

Why It Matters: The organizations with the least resources moved fastest toward AI adoption. Access to justice benefits are real.


Success Story: Law School Adaptation

From the Report: 55% of law schools offer AI courses; 83% provide hands-on experiences; Case Western requires AI certification for all 1Ls.

What's Working:

  • Curriculum integration happening faster than expected
  • Hands-on experiences (clinics, labs) supplementing theory
  • Certification programs establishing competency standards

Why It Matters: The talent pipeline is adapting. New lawyers will enter practice AI-literate.


Success Story: Jurisdictional Guidance Proliferation

From the Report: Rapid increase in formal guidance across jurisdictions reflects growing consensus that AI can be used responsibly.

What's Working:

  • 30+ state bars have issued guidance
  • ABA Formal Opinion 512 provides ethical framework
  • Court rules developing for AI disclosure

Why It Matters: Regulatory uncertainty is decreasing. Practitioners have clearer guidance on responsible use.


Hard Cases: What the Report Acknowledges as Unresolved

Hard Case #1: The Competence Gap

The Problem: A majority of legal professionals use AI but don't fully appreciate the practical and ethical challenges.

Report Assessment: This gap creates risk—both for clients and for practitioners facing accountability.

The Challenge: Closing competence gaps across an entire profession takes time. Meanwhile, AI capabilities continue advancing.

What's Needed: Systematic training, not just tool procurement.


Hard Case #2: The Hallucination Persistence

The Problem: AI systems generate fabricated citations and incorrect information at concerning rates.

Report Assessment: Case law citation is highest-risk area. Human verification remains essential.

The Challenge: No breakthrough in fully reliable autonomous legal AI is imminent. The verification burden persists.

What's Needed: Robust verification workflows; acceptance that human judgment remains essential.


Hard Case #3: The Long-Term Implications

The Problem: Short-term implementation challenges dominate attention; long-term implications receive insufficient focus.

Report Warning: Several contributors warn that sudden advances toward human-level AI could leave legal institutions unprepared.

The Challenge: Planning for scenarios that haven't arrived yet.

What's Needed: Strategic thinking beyond immediate tool adoption.


Reliability Corner

ABA Task Force Key Findings Summary

Area                        Finding
Overall Status              "Pivotal moment" - AI is now infrastructure
Adoption vs. Understanding  Adoption has surpassed understanding
Legal Education             55% offer AI courses; 83% hands-on experiences
Legal Aid                   100+ documented use cases; 74% adoption
Risks                       Hallucination, accountability, competence gaps
Future Responsibility       Transfers to ABA Center for Innovation

Report Context

This is the Task Force's second and final report. The first-year report established a baseline; this report assesses progress and identifies ongoing challenges.

The shift in framing—from "emerging technology" to "infrastructure"—reflects genuine change in the profession over two years.


Workflow of the Month: AI Infrastructure Readiness Assessment

Use this framework to assess your firm's readiness against the ABA's "infrastructure" framing:

AI INFRASTRUCTURE READINESS ASSESSMENT
======================================

FIRM: _________________________________
ASSESSMENT DATE: ______________________
CONDUCTED BY: _________________________

PART 1: COMPETENCE BASELINE
---------------------------

For each attorney, assess AI competence (1-5 scale):

Name: _________________
[ ] Understanding of AI capabilities: ___/5
[ ] Understanding of AI limitations: ___/5
[ ] Familiarity with firm AI tools: ___/5
[ ] Familiarity with ethical guidance: ___/5
[ ] Training completed (past 12 mo): YES / NO

(Repeat for all attorneys)

Firm-wide average competence score: ___/5

PART 2: GOVERNANCE STATUS
-------------------------

Written AI Policy:
[ ] Exists and current (review date: _______)
[ ] Exists but outdated
[ ] In development
[ ] Does not exist

AI Governance Structure:
[ ] Formal committee/board with authority
[ ] Informal oversight arrangement
[ ] Individual attorney discretion
[ ] No governance structure

Policy Addresses:
[ ] Approved AI tools
[ ] Prohibited uses
[ ] Verification requirements
[ ] Client communication obligations
[ ] Confidentiality protections
[ ] Supervision requirements
[ ] Training requirements

Gap identified: ________________________

PART 3: ETHICAL COMPLIANCE
--------------------------

Jurisdiction-Specific Guidance Reviewed:
[ ] ABA Formal Opinion 512
[ ] State bar ethics opinion(s): _________
[ ] Court disclosure requirements: _______
[ ] CLE requirements identified

Compliance Status:
[ ] Fully compliant with all applicable guidance
[ ] Substantially compliant (gaps: _________)
[ ] Partial compliance
[ ] Non-compliant / assessment not completed

PART 4: OPERATIONAL READINESS
-----------------------------

Current AI Tools in Use:
Tool 1: _____________ Purpose: _____________
Tool 2: _____________ Purpose: _____________
Tool 3: _____________ Purpose: _____________

Verification Workflows:
[ ] Documented verification procedures exist
[ ] Verification is ad hoc/individual discretion
[ ] No verification procedures

Quality Control:
[ ] AI output review is systematic
[ ] AI output review is periodic/sampling
[ ] AI output review is minimal/none

PART 5: GAP ANALYSIS
--------------------

Competence Gaps:
1. _____________________________________
2. _____________________________________

Governance Gaps:
1. _____________________________________
2. _____________________________________

Operational Gaps:
1. _____________________________________
2. _____________________________________

PART 6: REMEDIATION PRIORITIES
------------------------------

High Priority (address within 30 days):
1. _____________________________________
2. _____________________________________

Medium Priority (address within 90 days):
1. _____________________________________
2. _____________________________________

Low Priority (address within 6 months):
1. _____________________________________
2. _____________________________________

Budget Required: $_______________________
Training Hours Needed: __________________
External Resources Needed: ______________

ASSESSMENT APPROVED BY: ________________
DATE: ________________________________

Time investment: 3-4 hours for a thorough assessment.

Why it matters: The ABA says AI is infrastructure. This assessment tells you whether your firm is ready for that reality.
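If your firm collects the Part 1 scores in a spreadsheet or intake form, the firm-wide averages can be computed rather than tallied by hand. A minimal sketch follows; the field names and sample roster are illustrative, not drawn from the report or any particular tool.

```python
# Hypothetical sketch: aggregate Part 1 competence scores (1-5 scale)
# per attorney into the firm-wide figures the assessment form asks for.
# All names and values below are illustrative.

from statistics import mean

attorneys = [
    {"name": "A. Example", "capabilities": 4, "limitations": 3,
     "firm_tools": 4, "ethics_guidance": 2, "trained_past_12mo": True},
    {"name": "B. Example", "capabilities": 2, "limitations": 2,
     "firm_tools": 3, "ethics_guidance": 3, "trained_past_12mo": False},
]

# The four scored dimensions from Part 1 of the form.
DIMENSIONS = ["capabilities", "limitations", "firm_tools", "ethics_guidance"]

def firm_averages(roster):
    """Return per-dimension averages, an overall firm-wide score,
    and the share of attorneys trained in the past 12 months."""
    per_dim = {d: mean(a[d] for a in roster) for d in DIMENSIONS}
    overall = mean(per_dim.values())
    trained_pct = 100 * sum(a["trained_past_12mo"] for a in roster) / len(roster)
    return per_dim, overall, trained_pct

per_dim, overall, trained_pct = firm_averages(attorneys)
print(f"Firm-wide average competence score: {overall:.1f}/5")
print(f"Attorneys trained in past 12 months: {trained_pct:.0f}%")
```

The same aggregation makes the Part 5 gap analysis easier to prioritize: any dimension averaging well below the others is a natural candidate for the 30-day remediation list.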


Quick Hits

Report Highlights:

Key Findings:

  • 55% of law schools offer AI courses
  • 100+ documented AI use cases in legal aid
  • Adoption has surpassed understanding
  • Long-term implications require more attention

Coming Next Issue:

  • 80% of Am Law 100 Have AI Governance Boards: Do You?

Ask the Community

The ABA's report raises questions for every firm:

  1. How does your firm's AI competence compare to the adoption level the report describes?
  2. Has your state bar issued AI ethics guidance? How has it affected your practice?
  3. What's your biggest gap—competence, governance, or operational readiness?
  4. How are you planning to address the report's findings in 2026?

Reply to share. Anonymized contributions welcome.


TwinLadder Weekly | Issue #22 | December 2025

Helping lawyers build AI capability through honest education.

