TwinLadder Weekly

Issue #14 | August 2025


Editor's Note

This is our fourth Harvey-centric issue. I am conscious of that, and I have considered whether we are giving one company disproportionate attention. My conclusion: the adoption data justifies it. Not because Harvey is necessarily the best tool — I remain agnostic on that — but because Harvey's adoption curve is the most detailed dataset we have on how large law firms actually use AI.

We have covered the funding, the valuation, and the multi-model strategy. I am not going to revisit those. What I want to focus on today is the data behind 42% AmLaw 100 adoption: what those firms are doing, what the remaining 58% are thinking, and what the adoption pattern tells us about how AI is actually changing — and not changing — legal practice.

From where I sit in Europe, the American adoption numbers are simultaneously instructive and misleading. They tell us what AI can do for firms with deep pockets and dedicated innovation teams. They tell us very little about what AI means for the 80-lawyer firm in Brussels, the boutique in Milan, or the regional practice in Riga. The numbers paint a more nuanced picture than either the evangelists or the sceptics suggest.


[HIGH CONFIDENCE]

Inside the 42%: What Firms Are Actually Doing With Harvey

The RSGI/Harvey adoption report provides the most granular data yet on real-world legal AI usage. Let me cut through the headline metrics to what actually matters.

The adoption numbers are real and accelerating. Harvey reached $100M ARR in August 2025, three years after founding, with 500+ customers across 54 countries, 5.5x year-over-year growth in monthly queries, and 36x growth in active files. By late 2025, that had grown to 700+ customers and more than half of the AmLaw 100. For legal technology, where deployment cycles typically run in years, this is genuinely unprecedented.

But the more interesting data is how firms are using it, because the pattern tells us where AI works and where it does not.

Metric | Headline Reading | What It Actually Means
$100M ARR in 3 years | Fastest legal tech revenue growth ever | Market willingness to pay is proven — at the top
42% → 50%+ AmLaw 100 adoption | Half of elite US firms are customers | The other half have reasons worth understanding
5-8 hours reclaimed per user per week | ~12% of billable week | Meaningful but not transformative
5.5x query growth YoY | Genuine usage, not shelfware | Per-lawyer engagement rate remains unpublished
25 mentions for transaction work | Dominant use case | Low-judgment, high-volume tasks lead adoption

Transaction work dominates with 25 mentions in the RSGI report — drafting, due diligence, deal management. Litigation follows at 22 mentions — research, case management, discovery review. IP and real estate trail significantly. The concentration makes sense: transactional and litigation practices have the highest volume of document-intensive, pattern-recognisable work.

The reported time savings are meaningful but bounded. Deutsche Telekom estimates five hours reclaimed per user per week. Adecco Group reports up to eight hours per week. These are significant but not transformative. Five hours is one fewer late night, not a restructuring of the practice model. For context, that is roughly 12% of a billable week — useful, but not the "10x productivity" that AI marketing suggests.
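The arithmetic behind that 12% figure is worth making explicit. A minimal sketch, assuming a roughly 40-hour billable week (the baseline is mine; neither company states one):

```python
# Back-of-envelope check on reported time savings as a share of billable time.
# Assumption: a ~40-hour billable week. This is illustrative, not from the reports.
BILLABLE_WEEK_HOURS = 40

reported_savings = [("Deutsche Telekom", 5), ("Adecco Group", 8)]

for company, hours_reclaimed in reported_savings:
    share = hours_reclaimed / BILLABLE_WEEK_HOURS
    print(f"{company}: {hours_reclaimed}h/week reclaimed = {share:.0%} of billable week")
```

Even the upper bound (eight hours, 20% of the week) is an efficiency gain within the existing practice model, not a replacement of it.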

The tasks AI handles are consistently the low-judgment, high-volume ones: first drafts, document triage, research starting points, contract clause extraction. The tasks humans retain are consistently the high-judgment ones: negotiation strategy, client relationships, materiality assessment, courtroom advocacy. This is not a temporary limitation. It is a structural one. LLMs are good at pattern-matching and text generation. They are a long way from reasoning well enough to replace lawyers on matters requiring judgment.

What the 58% non-adopters are thinking. I have spoken to managing partners and CIOs at a dozen AmLaw 200 firms that have not adopted Harvey. The reasons cluster into five categories:

Pricing. One firm was quoted over GBP 200 per lawyer for a major AI platform; after one email, the price dropped 60%. The market has not settled. Firms that wait may get better terms.

Build-versus-buy. Some firms are developing internal capabilities for data control, customisation, and competitive differentiation. Whether this works remains unproven, but the rationale is sound for firms with the technical capacity.

Change management. The technology is the easy part. Training hundreds of lawyers, changing established workflows, managing ethical compliance, updating billing practices, and convincing sceptical partners — that is the real barrier. One firm told me they are "three years into transformation with maybe 30% internal adoption."

Wait-and-see. The landscape changes monthly. Firms that adopted legal research platforms early sometimes regretted vendor choices. Waiting for consolidation is not irrational.

Genuine scepticism. Some senior lawyers remain unconvinced that current AI delivers sufficient accuracy for high-stakes work. They are not wrong about the accuracy limitations — the Stanford hallucination study confirms significant error rates even in purpose-built tools. They may be wrong about the trajectory.

The European dimension matters here. The EU AI Act adds a layer that American adoption data does not capture. European firms deploying Harvey or any AI tool must now consider Article 4 compliance — documented AI literacy for staff using these systems. That is not optional. It entered force in February 2025. A firm that deploys Harvey without an accompanying training and governance framework is not just underperforming on adoption. It is non-compliant.

The adoption story's real lesson is not about Harvey specifically. It is that the question has shifted from "whether" to "how". Every firm needs an AI strategy, even if that strategy is deliberate, reasoned delay. What no firm can afford is no strategy at all.


The Competence Question

A managing partner at an AmLaw 200 firm described his firm's Harvey adoption to me as follows: "We deployed Harvey across the litigation group. Usage dropped 70% after the first month. Associates tried it, got mixed results, went back to Westlaw."

This is the pattern nobody talks about at legal tech conferences. Adoption metrics measure accounts, not habits. A firm can be a Harvey customer with 500 seats and have 50 regular users. The RSGI report shows 5.5x growth in queries — but from what base, across how many licensed users? The per-lawyer engagement rate is the metric that matters, and it is the metric nobody publishes.
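The gap between accounts and habits is a ratio any firm can compute from its own platform logs. A minimal sketch with invented numbers — none of these figures are published, and the 500/50 split simply mirrors the hypothetical above:

```python
# Engagement metrics a firm can compute internally.
# All figures are illustrative assumptions, not from the RSGI report.
licensed_seats = 500     # what adoption headlines count
active_users = 50        # lawyers who ran at least one query this month
monthly_queries = 1200   # total queries firm-wide

adoption_rate = active_users / licensed_seats             # the unpublished metric
queries_per_active_user = monthly_queries / active_users  # depth of real use
queries_per_seat = monthly_queries / licensed_seats       # what vendor growth stats resemble

print(f"Active adoption rate: {adoption_rate:.0%}")
print(f"Queries per active user: {queries_per_active_user:.1f}")
print(f"Queries per licensed seat: {queries_per_seat:.1f}")
```

Note that total query volume can grow 5.5x while the active adoption rate stays flat — heavier use by the same small group — which is exactly why the per-lawyer rate, not aggregate growth, is the number to demand.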

The firms where AI adoption sticks are the ones that invest in training beyond a one-hour webinar. They assign power users in each practice group. They share workflows that work. They accept that AI competence, like legal competence, develops over months of practice, not from a deployment memo.

Can you, personally, craft a prompt that produces genuinely useful output for your specific practice area? Can you evaluate whether the AI's analysis missed something important? Can you explain to a client what the AI contributed to their matter and what you contributed?

If not, your firm may be an AI adopter. But you are not yet an AI-competent lawyer. The licence does not transfer.


What To Do

  1. Measure actual usage, not licensed seats. If your firm has deployed an AI platform, check your per-lawyer monthly query rate. If adoption has plateaued below 30%, you have a training problem, not a technology problem.

  2. Identify your practice-specific use cases. Do not try to "use AI for everything." Select two or three specific tasks where AI consistently adds value in your practice area — first-draft research memos, contract clause identification, regulatory summaries — and build proficiency there.

  3. Develop an AI strategy even if you do not adopt yet. Document why you are waiting, what criteria would trigger adoption, and what you are doing in the interim to maintain competitive position. "We have not decided" is not a strategy.

  4. Watch the ROI data, not the headline metrics. Time savings only matter if they translate to value — through more work, better work, or lower cost to clients. Ask firms that have adopted: has it improved profitability, or just changed where the hours go?

  5. Talk to your clients. Clients are increasingly expecting AI-level efficiency. Whether you adopt Harvey or a competitor or build internally, understand what your clients expect and have an answer ready. In Europe, where the AI Act creates mutual compliance obligations between firms and their clients, this conversation is not optional.


One Question

If Harvey's adoption data shows that AI excels at drafting, triage, and research — the tasks that train junior lawyers — what develops the next generation of senior lawyers who need those skills most?



Compliance is the floor. Competence is the mission.