
When AI Fails the Bar: The Growing Professional Risks of AI-Generated Legal Work



Introduction


In a stark warning to the legal profession, Australia recently witnessed a watershed moment: a solicitor was sanctioned for submitting AI-generated, fictitious legal citations to court. The case sends a clear message: AI can enhance and streamline legal workflows, but left unchecked it poses serious ethical, reputational, and regulatory risks.


The Landmark Australian Case


In mid-August 2025, the Victorian Legal Services Board issued a rare disciplinary action: it varied the practising certificate of a solicitor found to have submitted AI-generated fake cases to the Federal Circuit and Family Court. The lawyer is no longer permitted to act as a principal, operate a firm, or manage trust funds; instead, he must practise under supervision for two years and submit quarterly reports.


A Broader Trend and Judicial Alarm


This isn’t an isolated event. In Western Australia, a lawyer was referred to the state’s regulator and fined over A$8,000 after submitting four non-existent case citations produced by AI tools such as Claude and Copilot.


In Victoria, another lawyer admitted that AI-generated submissions contained both fabricated case law and inaccurate quotes from parliament, with the presiding judge warning that the use of AI is unacceptable unless its outputs are “independently and thoroughly verified”.


Regulatory Responses and Ethical Imperatives


Australia’s judiciary is responding decisively. In New South Wales, the Supreme Court has banned the use of AI in producing key evidentiary documents such as affidavits, witness statements, and character references, and requires lawyers to certify that AI was not used in their preparation. Victoria and Queensland have likewise moved to restrict AI’s role in critical legal documents.


The Ethical and Professional Bottom Line


These developments underscore that AI is no substitute for legal judgment:


  • The Legal Profession Uniform Law Australian Solicitors’ Conduct Rules 2015 require lawyers to avoid misleading the court and to supervise legal work adequately—duties that AI-assisted shortcuts flout.

  • Generative AI tools, while promising efficiency, are prone to “hallucinations”—creating authoritative-sounding but entirely fabricated information.

  • Firms and regulators must now build robust AI governance, invest in training, and ensure that supervision frameworks include AI oversight.


AI holds transformative potential for the legal profession, but its careless use can undermine justice itself. As Australia’s recent disciplinary actions exemplify, the message is clear: justice demands more than convenience. It demands accountability.
