TwinLadder Weekly

Issue #23 | January 2026


80% of Am Law 100 Have AI Governance Boards: Do You?

Enterprise firms are governing AI seriously. Mid-market needs to catch up.


As 2026 begins, a striking statistic demands attention: 80% of Am Law 100 firms have established AI governance boards. These aren't informal committees—they're formal oversight structures with real authority.

Meanwhile, most mid-market firms operate without any formal AI governance.

This issue: why governance matters, how to build it at any firm size, and what happens when you don't.

The Governance Gap

What the Numbers Show

Firm Size           | AI Governance Structure     | Source
--------------------|-----------------------------|------------------
Am Law 100          | 80% have formal boards      | Industry reports
Am Law 200          | ~60% have formal structures | Estimated
Mid-Market (50-200) | ~25% have formal policies   | Survey data
Small Firms (10-50) | ~10% have written policies  | Estimated
Solo/Small (<10)    | <5% have formal governance  | Estimated

The pattern is clear: governance correlates with firm size. But the risks don't.

Why Enterprise Firms Moved First

Large firms adopted governance early for several reasons:

  1. Scale of investment - Enterprise AI tools cost millions annually; governance protects that investment
  2. Risk exposure - More matters means more opportunities for AI-related failures
  3. Client demands - Sophisticated clients ask about AI policies
  4. Regulatory pressure - Large firms receive more regulatory scrutiny
  5. Resources - They can afford dedicated governance staff

Why Mid-Market Firms Haven't

Mid-market firms face different constraints:

  1. Resource limitations - No budget for governance-specific roles
  2. Tool diversity - Individual attorneys choose their own tools
  3. "It's working fine" - No major incidents yet
  4. Competing priorities - Business development feels more urgent
  5. Uncertainty - What should a governance policy even say?

The problem: all firms face the same accountability standards. Courts don't ask about firm size before imposing sanctions.

The Accountability Reality

Why Governance Matters Now

The Butler Snow Case (July 2025): A large, sophisticated firm with resources for proper safeguards still filed hallucinated citations. The court's sanctions underscored that firm sophistication provides no protection without actual governance.

Johnson v. Dunn (July 2025): The court declared monetary sanctions are "proving ineffective" and something more is needed. Governance failures are becoming aggravating factors.

The Pattern: Courts now evaluate:

  • Did the firm have AI policies?
  • Were verification protocols followed?
  • Was training provided?
  • Who supervised the work?

Having no governance structure is becoming a liability, not just an oversight.

The State Bar Dimension

Over 30 states have issued AI ethics guidance. Common themes:

  • Competence requires understanding AI tools
  • Supervision duties extend to AI-assisted work
  • Confidentiality obligations apply to AI inputs
  • Communication with clients about AI use is expected

Governance frameworks help demonstrate compliance. Ad hoc approaches leave gaps.

Building Governance at Any Scale

The Core Principle

Governance doesn't require a dedicated board or expensive infrastructure. It requires:

  1. Documented policies - What's allowed, what's not, what's required
  2. Clear accountability - Who's responsible for AI decisions
  3. Verification procedures - How AI outputs are checked
  4. Training requirements - How competence is maintained
  5. Review mechanisms - How policies evolve with the technology

These elements can scale from solo practice to Am Law 100.

Enterprise Model (50+ Lawyers)

Structure:

  • Formal AI Governance Board (5-7 members)
  • Representatives from: Managing Partner, IT, Practice Groups, Ethics, Risk Management
  • Quarterly meetings minimum; emergency response authority
  • Budget authority for tools and training

Responsibilities:

  • Approve AI tool procurement
  • Set firm-wide AI policies
  • Oversee training programs
  • Review incidents and near-misses
  • Update policies as guidance evolves

Resource Requirement: 2-4 hours/week from designated leader; quarterly board time


Mid-Market Model (10-50 Lawyers)

Structure:

  • AI Committee (3-5 members)
  • Representatives from: Managing Partner or designee, Senior Associate, IT lead
  • Monthly meetings minimum
  • Reports to management committee

Responsibilities:

  • Maintain AI policy document
  • Coordinate training efforts
  • Track regulatory developments
  • Review significant incidents
  • Recommend tool decisions

Resource Requirement: 2-4 hours/month from committee members


Small Firm Model (2-10 Lawyers)

Structure:

  • Designated AI Coordinator (single attorney)
  • Annual policy review with all attorneys
  • Standing agenda item at firm meetings

Responsibilities:

  • Draft and maintain written AI policy
  • Track bar guidance for relevant jurisdictions
  • Coordinate training (can be informal)
  • Review any AI-related concerns
  • Recommend tool decisions to partners

Resource Requirement: 1-2 hours/month


Solo Practitioner Model

Structure:

  • Self-governance with documentation
  • Annual policy review
  • Peer consultation as needed

Responsibilities:

  • Written AI policy (even if simple)
  • Documented verification procedures
  • Annual competence self-assessment
  • Tracking of relevant bar guidance
  • Insurance coverage verification

Resource Requirement: 1-2 hours/quarter


The Essential Policy Elements

Regardless of firm size, your AI policy should address:

1. Tool Authorization

Specify:

  • Which AI tools are approved for use
  • Which are prohibited (and why)
  • Who can approve new tools
  • What evaluation criteria apply

Example Language: "Approved AI tools for firm use include: [List]. General-purpose AI (ChatGPT, Claude) may be used only for non-confidential work. Use of unapproved tools requires [approval process]."

2. Verification Requirements

Specify:

  • What outputs must be verified
  • How verification is documented
  • Who is responsible for verification
  • When verification can be delegated

Example Language: "All AI-generated legal research must be verified against primary sources before use. Every citation must be confirmed to exist, correctly state the holding, and retain precedential value. Verification must be documented in the work product file."

3. Confidentiality Protections

Specify:

  • What information cannot be input to AI systems
  • Which tools have acceptable data handling
  • How client consent is obtained when needed
  • Data retention and deletion requirements

Example Language: "Client-identifying information, privileged communications, and confidential business information shall not be input into AI systems unless the system has been approved for confidential data and the client has consented to such use."

4. Supervision Requirements

Specify:

  • Who supervises AI-assisted work
  • What level of review is required
  • How supervision is documented
  • When more senior review is triggered

Example Language: "AI-assisted work product is subject to the same supervision requirements as traditional work product. The supervising attorney must review and approve all AI-assisted deliverables before client delivery."

5. Training Requirements

Specify:

  • Initial training for new attorneys
  • Ongoing training requirements
  • How competence is assessed
  • Training documentation

Example Language: "All attorneys must complete AI competency training within 90 days of hire and annually thereafter. Training shall cover approved tools, firm policies, ethical obligations, and verification procedures."

6. Incident Response

Specify:

  • What constitutes an AI-related incident
  • Reporting requirements
  • Review procedures
  • Documentation requirements

Example Language: "Any AI-related error that affected or could have affected client matters must be reported to [governance body] within 24 hours. Near-misses should be reported for learning purposes without attribution."


Tool Review: Governance Support Resources

Resources for building and maintaining AI governance

ABA Resources

Formal Opinion 512 (July 2024): Provides the ethical framework for AI use—competence, confidentiality, communication, and candor requirements. An essential foundation for any policy.

Task Force Reports (2024-2025): Comprehensive assessments of AI in legal practice. Useful for understanding the landscape and building buy-in.

Best For: Foundational ethical framework
Cost: Free
Rating: Essential - required reading for governance


State Bar Guidance

PAXTON maintains a comprehensive tracker of state bar AI guidance. Key jurisdictions:

  • California: Multi-jurisdictional compliance requirements
  • Pennsylvania: Court disclosure mandates
  • New York: CLE requirements
  • Texas: Opinion 705 framework
  • Oregon: Opinion 2025-205

Best For: Jurisdiction-specific compliance
Cost: Free (most available online)
Rating: Essential - know your jurisdiction's requirements


Policy Templates

Several organizations offer AI policy templates:

  • ABA Center for Innovation - General framework
  • Legal Tech Hub - Customizable templates
  • Practice management vendors - Tool-specific guidance

Best For: Starting point for policy drafting
Cost: Free to moderate
Rating: Helpful - customize to your firm's needs


Training Resources

Options for firm-wide AI training:

  • CLE providers - Formal courses for credit
  • Vendor training - Tool-specific education
  • Internal development - Firm-specific programs

Best For: Building competence systematically
Cost: Variable ($0-500/person)
Rating: Important - essential for governance implementation


What's Working: Governance Success Stories

Success Story: The Mid-Size Firm Playbook

Firm: 45-lawyer regional firm
Challenge: Ad hoc AI use across practice groups; no coordination or standards

Approach:

  1. Formed 4-person AI Committee (Managing Partner + 3 practice group leads)
  2. Conducted firm-wide AI use survey
  3. Drafted policy addressing common concerns
  4. Rolled out with 2-hour all-attorney training
  5. Established monthly committee meetings

Result: Coordinated tool selection; consistent verification standards; two near-misses caught before client delivery.

Key Insight: Governance doesn't require major investment—it requires coordination and documentation.


Success Story: The Solo Practitioner Framework

Attorney: Solo family law practitioner
Challenge: Using AI for research and drafting but worried about liability

Approach:

  1. Drafted one-page AI policy
  2. Created verification checklist for AI research
  3. Added AI disclosure language to engagement letters
  4. Documented training (self-study + CLE)
  5. Confirmed malpractice coverage

Result: Clear framework for responsible use; confidence in ethical compliance; documentation ready if ever questioned.

Key Insight: Solo practitioners can have governance. It just needs to scale appropriately.


Hard Cases: Governance Challenges

Hard Case #1: The Rogue Partner

Scenario: Senior partner refuses to follow AI policy; uses unapproved tools for client work.

Problem: Governance is meaningless without enforcement. But confronting senior partners carries firm politics risks.

Solution Approaches:

  • Frame as liability protection, not restriction
  • Document concerns formally
  • Consider insurance implications
  • Escalate to management committee if necessary

Lesson: Governance requires authority. Define enforcement mechanisms upfront.


Hard Case #2: The Resource Constraint

Scenario: Small firm knows it needs governance but has no bandwidth to develop policies.

Problem: "We'll get to it later" becomes permanent deferral.

Solution Approaches:

  • Start with minimal viable policy (one page)
  • Use available templates; don't reinvent
  • Designate governance as specific attorney's responsibility
  • Schedule annual review as recurring calendar item

Lesson: Imperfect governance is better than no governance. Start simple; iterate.


Hard Case #3: The Evolving Landscape

Scenario: Firm developed AI policy in 2024. By 2026, the tools and guidance have changed significantly.

Problem: Static policies become obsolete. But constant revision is exhausting.

Solution Approaches:

  • Build in annual review requirement
  • Designate someone to track major developments
  • Create process for interim updates when needed
  • Accept that perfection is impossible; currency is the goal

Lesson: Governance is ongoing, not one-time. Build maintenance into the structure.


Reliability Corner

AI Governance Adoption by Firm Size

Segment             | Governance Adoption | Primary Gap
--------------------|---------------------|-------------------------
Am Law 100          | 80% formal boards   | Enforcement consistency
Am Law 101-200      | ~60% formal         | Resource allocation
Mid-Market (50-200) | ~25% written policy | Any governance
Small (10-50)       | ~10% written policy | Awareness of need
Solo/Small (<10)    | <5% written policy  | Time/priority

What Adequate Governance Includes

Element                   | Minimum Standard | Best Practice
--------------------------|------------------|-------------------------
Written policy            | Yes              | Regular updates
Tool authorization        | List exists      | Evaluation criteria
Verification requirements | Documented       | Audited compliance
Training                  | Annual           | Continuous + assessment
Supervision standards     | Defined          | Tiered by risk
Incident response         | Process exists   | Regular review

The Accountability Trend

Courts are increasingly evaluating governance as part of sanctions analysis. Documented policies, training, and verification procedures provide a defense; having none is becoming an aggravating factor.


Workflow of the Month: AI Governance Board Starter Kit

For firms ready to establish formal AI governance, use this implementation framework:

AI GOVERNANCE BOARD STARTER KIT
===============================

FIRM: _________________________________
IMPLEMENTATION LEAD: __________________
TARGET COMPLETION DATE: _______________

PHASE 1: FOUNDATION (Weeks 1-2)
-------------------------------

[ ] Secure management committee approval
    Approval date: ___________________
    Sponsor: _________________________

[ ] Define governance scope
    [ ] All AI tools
    [ ] Legal work AI only
    [ ] Client-facing only
    [ ] Other: _______________________

[ ] Allocate initial budget
    Amount: $_________________________
    For: _____________________________

PHASE 2: STRUCTURE (Weeks 2-4)
------------------------------

[ ] Appoint board/committee members
    Chair: ___________________________
    Members:
    1. ______________________________
    2. ______________________________
    3. ______________________________
    4. ______________________________
    5. ______________________________

[ ] Define authority
    [ ] Policy approval authority
    [ ] Tool procurement authority
    [ ] Training mandate authority
    [ ] Enforcement authority
    [ ] Budget authority

[ ] Set meeting cadence
    [ ] Weekly  [ ] Bi-weekly  [ ] Monthly  [ ] Quarterly
    First meeting date: ______________

PHASE 3: POLICY DEVELOPMENT (Weeks 4-8)
---------------------------------------

[ ] Current state assessment
    [ ] Survey attorneys on current AI use
    [ ] Inventory existing tools
    [ ] Review existing policies for AI gaps
    [ ] Identify high-risk practice areas

[ ] Draft core policy document
    Sections to include:
    [ ] Purpose and scope
    [ ] Definitions
    [ ] Tool authorization
    [ ] Verification requirements
    [ ] Confidentiality protections
    [ ] Supervision standards
    [ ] Training requirements
    [ ] Incident response
    [ ] Enforcement

[ ] Policy review process
    [ ] Practice group review
    [ ] Ethics review
    [ ] Management approval
    Approval date: ___________________

PHASE 4: IMPLEMENTATION (Weeks 8-12)
------------------------------------

[ ] Communicate policy
    [ ] All-firm meeting
    [ ] Written distribution
    [ ] Acknowledgment collection

[ ] Roll out training
    [ ] Training content developed
    [ ] Training schedule set
    [ ] Completion tracking established

[ ] Establish reporting mechanisms
    [ ] Incident reporting process
    [ ] Near-miss reporting process
    [ ] Question/clarification channel

[ ] Set up tools and systems
    [ ] Approved tool list published
    [ ] Access provisioned
    [ ] Verification templates created

PHASE 5: OPERATIONS (Ongoing)
-----------------------------

[ ] Regular meeting schedule active
    Next meeting: ____________________

[ ] Monitoring established
    [ ] Incident tracking
    [ ] Training compliance
    [ ] Tool usage patterns
    [ ] Regulatory updates

[ ] Review cadence set
    [ ] Quarterly operational review
    [ ] Annual policy review
    Next policy review: ______________

SUCCESS METRICS
---------------

After 6 months, evaluate:
[ ] Policy compliance rate: ___%
[ ] Training completion rate: ___%
[ ] Incidents reported: ___
[ ] Near-misses caught: ___
[ ] Attorney satisfaction: ___/5

IMPLEMENTATION CERTIFIED BY: ___________
DATE: _________________________________

Time investment: 40-80 hours over 12 weeks (firm size dependent)
Why it matters: Structured implementation prevents false starts and ensures governance actually functions.
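The success metrics in the starter kit reduce to simple rate arithmetic over attorney records. A minimal sketch in Python, using a hypothetical record structure (the field names are illustrative, not from any real system):

```python
def compliance_rates(attorneys):
    """Return (policy-compliance %, training-completion %) for a roster.

    Each record is a dict with two illustrative boolean fields:
    'policy_acknowledged' and 'training_complete'.
    """
    total = len(attorneys)
    if total == 0:
        return 0.0, 0.0
    compliant = sum(1 for a in attorneys if a["policy_acknowledged"])
    trained = sum(1 for a in attorneys if a["training_complete"])
    return 100 * compliant / total, 100 * trained / total

# Example roster: 3 of 4 acknowledged the policy; 2 of 4 completed training.
attorneys = [
    {"policy_acknowledged": True,  "training_complete": True},
    {"policy_acknowledged": True,  "training_complete": True},
    {"policy_acknowledged": True,  "training_complete": False},
    {"policy_acknowledged": False, "training_complete": False},
]

policy_rate, training_rate = compliance_rates(attorneys)
print(f"Policy compliance: {policy_rate:.0f}%")      # Policy compliance: 75%
print(f"Training completion: {training_rate:.0f}%")  # Training completion: 50%
```

However a firm actually tracks acknowledgments, the point is the same: the six-month evaluation should be computed from records, not recalled from memory.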


Quick Hits

Governance Trends:

  • 80% of Am Law 100 have AI governance boards
  • Governance is increasingly a factor in sanctions analysis
  • ABA responsibility transfers to Center for Innovation

Regulatory Developments:

  • 30+ states have issued AI ethics guidance
  • New York CLE requirements now include AI competency
  • EU AI Act (August 2026) driving governance planning

Coming Next Issue:

  • 2026 Predictions: What Mid-Market Lawyers Should Prepare For

Ask the Community

As governance becomes the standard of care, we're tracking:

  1. Does your firm have a written AI policy? If yes, how often is it updated?
  2. What's the biggest barrier to implementing governance at your firm?
  3. For firms with governance: What's worked? What hasn't?
  4. Would you use a peer network for sharing governance approaches?

Reply to share. Anonymized contributions welcome.


TwinLadder Weekly | Issue #23 | January 2026

Helping lawyers build AI capability through honest education.
