AI at Work in Australia: Legal Services
An OCNUS Consulting Report – The second in our series on the impact of AI on the Australian workforce.
Executive Summary
Artificial intelligence has moved from novelty to necessity in Australian legal practice, with 71 per cent of lawyers expecting generative AI to be embedded in their daily work within the coming year. This OCNUS Consulting research report reveals that AI is fundamentally rewriting the rules of legal work, creating both unprecedented opportunities and immediate challenges for the profession.
The transformation is already measurable. Twenty-six per cent of Australian law firms run production-grade AI systems, with early adopters reporting time savings of 30–80 per cent on tasks such as contract analysis and discovery research. Platforms like Lexis+ AI and Relativity aiR are cutting review cycles by up to fifty per cent whilst spotting more compliance anomalies than human teams alone.
However, AI adoption brings significant risks. High-profile hallucination incidents have resulted in phantom citations reaching courts and fines of up to US$31,100 imposed globally. Australian regulation is responding rapidly, with three Supreme Court jurisdictions now mandating AI disclosure and the Legal Services Council consulting on technology-competence duties from July 2026.
The economic implications are profound. Fee compression is already underway, with clients reporting twenty per cent savings on routine work. Credible observers anticipate 20–40 per cent price erosion across commoditised legal tasks by 2027. This creates a bifurcated market where routine work faces downward pressure whilst complex strategic counsel commands higher premiums.
Early adopters are establishing compound competitive advantages through better margins that enable further technology investment. The traditional legal market—where relationships and reputation dominated—is becoming performance-driven, where technical capability increasingly determines market position.
OCNUS Consulting identifies three critical success factors: governance frameworks that exceed current regulatory standards, operational transformation that eliminates shadow use, and strategic repositioning toward value-based pricing. Firms must address the "missing middle" problem where junior development pathways risk being hollowed out by automation.
The report concludes that lawyers will not vanish but must change. Success requires immediate action across governance, capability building and strategic repositioning. Those that embed responsible AI now will gain margin buffers, brand trust and top talent. Those that delay risk being priced down to process-shop status in an AI-enabled market where clients increasingly expect productivity benefits.
AI at Work in Australia – Legal Services Full Report
In Henry VI, Part Two, when Dick the Butcher says, “The first thing we do, let's kill all the lawyers,” Shakespeare is warning that overturning society begins by removing those who defend order. AI’s challenge to the legal profession is less violent than in Shakespeare’s play, but no less significant, with implications for the whole of society.
The Institute for the Advancement of the American Legal System observes, “Generative AI may be the most impactful paradigm shift that we will experience in our lifetime.” Artificial intelligence is rewriting the rules of legal work and the profession’s guardians face a real and immediate test.
Why this moment matters
Artificial intelligence has moved from novelty to necessity. Twenty-six per cent of Australian law firms already run production-grade AI systems, and a further 45 per cent plan to do so within twelve months, according to Thomson Reuters’ 2025 Generative AI in Professional Services survey. Survey data indicate that informal “shadow use” is common; taken together, 71 per cent of lawyers expect generative AI to be embedded in their daily work in the coming year. Early adopters report time savings of 30–80 per cent on tasks such as discovery research and contract analysis. Even allowing for exaggeration, the impact of AI is significant.
What AI is already doing
Contract risk mapping – platforms such as Lexis+ AI and Luminance cut review cycles by up to fifty per cent and spot more compliance anomalies than human teams alone
Discovery and data extraction – Relativity aiR routinely delivers up to eighty per cent reductions in document-review hours
Legal research and first-draft advice – tools such as Westlaw Precision AI and Harvey can produce usable material significantly faster than manual methods; pilots at major firms report time savings of up to forty per cent
These gains translate into happier clients and stronger retention. In Ashurst’s firm-wide Vox PopulAI trial, eighty-eight per cent of lawyers said AI tools helped them feel better prepared for future practice.
Early adopters are not just implementing efficiency tools—they are restructuring competitive dynamics. Maddocks’ adoption of Thomson Reuters’ CoCounsel Core positions the firm ahead of peers still debating governance frameworks. Top-tier firms with whole-of-practice AI rollouts are establishing service benchmarks that later entrants must meet while benefiting from reduced operational costs.
This creates a compound advantage: better margins enable further technology investment, which drives additional competitive separation. The traditional legal market—where reputation and relationships trumped operational efficiency—is becoming a performance-driven market where technical capability determines market position.
Regulation
Australian regulation of AI in legal practice is progressing faster than many firms realise. Three Supreme Court jurisdictions (New South Wales, Victoria and Queensland) now mandate disclosure of generative AI or restrict its use in evidence, with NSW’s Practice Note SC Gen 23 already the de facto national benchmark. The Federal Court has signalled that a harmonised practice note is under development, but the timing is not yet confirmed. The Legal Services Council is consulting on a “technology-competence” duty that would make inadequate AI supervision a disciplinary offence from 1 July 2026. Universities are next: revised accreditation standards require every LLB to teach responsible AI by 2027, closing the skills gap identified in this report.
The statutory layer is building in two waves. Privacy Act amendments before the Senate will force firms, by mid-2026, to explain any automated reasoning that influences client rights. A broader “high-risk AI” bill, flagged for 2026 and targeting legal advice among other sectors, would impose guardrails such as documented risk assessments and human oversight from 2027-28. Together, these measures turn voluntary standards into enforceable duties.
OCNUS Consulting believes it is prudent for firms to adopt NSW-level verification protocols nationwide, start logging AI use matter by matter, and engage early with the 2025-26 consultations to shape rules rather than chase them.
Furthermore, Australia would do well to follow the UK, whose LawtechUK programme provides a formal regulatory sandbox: a safe space for businesses to test new legal services with temporary regulatory flexibility and direct guidance. Singapore operates a comparable scheme. A similar sandbox would help Australian lawyers implement AI tools with greater confidence.
Democratisation of the law
AI has the potential to break down long-standing barriers to access. By reducing cost and complexity, AI could empower more people to engage with the law, particularly those currently excluded from it.
The transformation is already visible. DoNotPay has overturned more than 200,000 parking fines in London and New York and helped refugees with immigration applications. Justice Connect in Australia has developed AI models that strengthen access to legal help and connect people to appropriate support. Where brief consultations once cost hundreds of dollars, AI-powered tools can provide initial guidance for nominal fees or even for free.
However, democratisation brings risks and responsibilities. As more people gain direct access to legal guidance through AI, the legal profession must ensure these tools maintain accuracy and ethical standards.
What could possibly go wrong?
In the 2024 Blackburn Lecture, former President of the Law Council of Australia, Anne Trimmer, emphasised that “lawyers remain responsible for verifying the accuracy of any AI-assisted work product; without proper oversight, the risks are multiplied.” Artificial intelligence can deliver speed and insight, yet its errors can derail a case and a career, and the financial risks are potentially enormous. High-profile mistakes have already been made.
2024 – A Melbourne solicitor used Leap’s drafting tool and filed phantom citations. The Family Court adjourned the matter and referred the lawyer for investigation.
2025 – The UK High Court reprimanded counsel after dummy authorities slipped into judicial-review submissions.
United States – Judges have imposed fines of up to US$31,100 on firms that relied on fabricated precedents generated by chatbots.
Global trend – Since 2023, courts have recorded at least ninety-five hallucination incidents and lawyers are directly linked to more than half of them.
These failures follow similar patterns: time pressure encouraging shortcuts, naïve faith in AI output, leakage of confidential material through public tools, and public embarrassment when errors surface before unsympathetic judges.
What AI means for legal jobs
Legal-services employment has grown five to six per cent per year since 2020, outpacing the general workforce and rival knowledge sectors. Average salaries also beat national inflation, with some firms offering double-digit rises to attract AI-literate talent. The market is rewarding lawyers who combine black-letter mastery with technological fluency.
This growth masks a fundamental shift in what lawyers actually do. AI excels at the pattern-recognition work that traditionally employed junior lawyers: document review, due diligence, contract analysis, etc. These tasks—the apprentice work that builds legal careers—are being systematically automated.
If this trend is not properly planned for, the result will be a “missing middle” problem. Senior partners retain strategic value through judgement, creativity and client relationships. Junior tasks become automated. But the middle-tier development pathway—where technical competence transforms into professional wisdom—could be hollowed out.
The law shares the same AI-driven challenge faced by all areas of knowledge work: How does a profession develop senior-level thinking without junior-level practice? The traditional apprenticeship model that built legal expertise through exposure to routine work will transform just as the profession needs more sophisticated practitioners. The profession must solve this skills pipeline problem before the current generation of senior lawyers retires.
Legal fee compression
Mercutio jibes that Queen Mab “driveth o’er a lawyer’s nose, And then dreams he of fees…” Law has long remained the gold standard for profitability among mainstream professional services in Australia, but AI is changing the economics.
Fee compression is already underway. Clients report a twenty per cent saving on routine work as they build internal AI capabilities. Clio’s 2025 Legal Trends data reveal that three-quarters of billable hours recorded by conventional firms comprise tasks that generative AI can already execute more efficiently.
This will create a bifurcated market. Commoditised work—contracts, document review, standard research—faces downward price pressure as clients demand the AI productivity dividend. Meanwhile, complex strategic counsel that requires creativity, ethical reasoning and nuanced judgment will maintain, or even command, higher premiums.
Credible observers anticipate twenty to forty per cent price erosion across routine work by 2027. Firms that redeploy the liberated hours into higher-value services and embrace value-based pricing will sustain profitability. Those that cling to time-based billing for automatable work will watch both margins and talent depart.
OCNUS Consulting’s strategy for survival and success
The regulatory trajectory is clear and the competitive dynamics are shifting rapidly. Success requires immediate action across three fronts: governance, capability building and strategic repositioning.
For firms
Governance and compliance
Establish an AI steering committee reporting directly to the board with authority to set firm-wide policy.
Implement verification protocols that exceed NSW Practice Note SC Gen 23 standards nationwide, anticipating federal harmonisation.
Begin logging AI use matter by matter now, creating the audit trails that the 2026 Privacy Act amendments will mandate.
Engage proactively with 2025-26 regulatory consultations to influence rules rather than react to them.
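The matter-by-matter logging recommended above can begin as a lightweight, append-only record. The sketch below, in Python, is one illustrative way to structure such an audit trail; the field names and JSONL format are assumptions for illustration, not a regulatory standard.

```python
import json
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone
from pathlib import Path

@dataclass
class AIUseRecord:
    """One matter-level record of generative-AI use (illustrative fields only)."""
    matter_id: str            # the firm's matter reference
    tool: str                 # e.g. "Lexis+ AI"
    task: str                 # e.g. "contract review"
    verified_by: str          # the lawyer who checked the output
    timestamp: str = field(default="")

    def __post_init__(self):
        # Stamp each record with a UTC time if none was supplied.
        if not self.timestamp:
            self.timestamp = datetime.now(timezone.utc).isoformat()

def log_ai_use(record: AIUseRecord, log_path: Path) -> None:
    """Append the record to a JSONL audit trail, one line per AI use."""
    with log_path.open("a", encoding="utf-8") as fh:
        fh.write(json.dumps(asdict(record)) + "\n")

# Example: record a single AI-assisted task against a matter.
log_file = Path("ai_use_log.jsonl")
log_ai_use(
    AIUseRecord("M-2025-0417", "Lexis+ AI", "contract review", "J. Citizen"),
    log_file,
)
```

Even a minimal record like this captures who used which tool, on which matter, and who verified the output—the substance of the audit trail that forthcoming disclosure obligations are likely to require.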
Operational transformation
Eliminate “shadow use” by providing secure, private AI tools with proper audit capabilities.
Run measured pilots on discrete tasks, establish baselines and scale only where productivity gains are demonstrable.
Redesign workflows to capture AI’s productivity dividend rather than simply maintaining existing processes.
Make AI literacy mandatory across all staff levels, from support teams to senior partners.
Strategic positioning
Shift fee structures from time-based billing to value-based pricing for AI-enhanced services.
Redeploy hours liberated by AI automation into higher-value strategic counsel.
Address the “missing middle” problem by creating new pathways for junior lawyers to develop expertise beyond routine tasks.
Move decisively up the value chain or accept commodity pricing—the middle ground is disappearing.
For regulators and courts
Harmonise AI disclosure requirements across all jurisdictions to reduce compliance friction.
Create regulatory sandboxes similar to LawtechUK, allowing controlled experimentation without jeopardising proceedings.
Invest in judicial AI literacy training so bench and bar share common risk frameworks.
Balance AI’s democratising potential with quality safeguards, particularly for vulnerable users.
For individual practitioners
Treat AI as a research assistant and drafter, never as a decision-maker or final authority.
Verify every citation, precedent and factual assertion—hallucination risks remain significant.
Develop the distinctly human capabilities that AI cannot replicate: ethical reasoning, creative problem solving, empathetic client relations and nuanced judgement.
Embrace AI literacy as a competitive advantage—firms are already paying premiums for technologically fluent lawyers.
The compound advantage
Early adopters are creating self-reinforcing competitive cycles. Better margins from AI efficiency enable further technology investment, which drives additional separation from slower competitors. The traditional legal market—where relationships and reputation dominated—is becoming performance-driven, where technical capability increasingly determines market position.
Firms implementing comprehensive AI strategies now will establish the service benchmarks and cost structures that later entrants must match. Those that delay risk becoming price-takers in an AI-enabled market where clients increasingly expect the productivity benefits and will source them elsewhere if necessary.
Looking ahead
Anne Trimmer concluded her lecture by rejecting both utopian and dystopian narratives. Lawyers will not vanish, but they will have to change. That means new skills, new roles and new forms of accountability. She said, “The future of legal practice won’t be human or machine—it will be both, working side by side.” She calls on the profession to embrace the challenge, shape the tools and keep the law human—by choice, not by default.
Between now and 2028, multimodal models will combine text, audio and video to transform evidence handling and courtroom presentation. Fee pressure on commoditised work may reach twenty to forty per cent as corporate clients build their own copilots. Firms that embed responsible AI now will gain a margin buffer, brand trust and top talent. Those that wait risk being priced down to process-shop status.
The first thing we do is not kill all the lawyers. The first thing we do is equip them with the AI skills to survive and thrive. Shakespeare may have mocked lawyers for dreaming of fees, yet he knew society would collapse without them.