AI Disclosure Requirements in Australian Courts & Tribunals
A comprehensive, regularly updated guide to what every Australian court and tribunal requires when AI has been used to prepare legal documents, applications, and submissions.
Fair Work Commission
Mandatory AI Disclosure — Incoming
The Fair Work Commission will soon mandate AI disclosure across all its forms and documents. FWC President Justice Adam Hatcher announced the change in a presentation to the Victorian Bar Association on 18 February 2026.
The scale of the problem
The FWC's workload has surged dramatically due to AI-assisted applications:
- Total FWC matters projected at 50,000–55,000 for 2025–26, up from approximately 30,000 pre-2023
- Total workload increase of over 70% in three years
- Unfair dismissal applications (s.394) up 41% compared to the three-year average
- General protections dismissal claims (s.365) up 62%
- Other general protections disputes (s.372) up 135%
Source: FWC President's Presentation to Victorian Bar Association, 18 February 2026
The proposed disclosure form
The FWC intends to modify all its forms to include an AI disclosure requirement. The proposed form wording, presented by Justice Hatcher, is shown below.
Use of GenAI
Generative artificial intelligence (GenAI) includes ChatGPT, Claude, CoPilot and Gemini.
If you used GenAI in preparing this application, you must tell us you used GenAI and you must check all details in this application and ensure they are correct and relevant. This includes ensuring that:
- all references to facts or evidence are correct and the facts or evidence exist
- all cases, legislation, textbooks and articles referred to exist and stand for the legal positions attributed to them, and
- all extracts or quotes are correct and are attributed to the right source.
You must also include hyperlinks to all case law referred to in this application.
If you do not do these things, the Commission might dismiss your application or order you to pay costs.
Did you use GenAI in preparing this application?
☐ Yes
☐ No
If you answered Yes — you must check all details in this application and ensure they are correct and relevant, and you must include hyperlinks to all case law. You must also confirm you have done this:
☐ I have checked all details in this application and I confirm they are correct and relevant.
☐ I have included hyperlinks to all case law referred to in this application.
Witness statement templates will also include a disclosure requirement, with a warning that under section 678 of the Fair Work Act, a witness commits an offence if they give sworn or affirmed evidence that is false or misleading.
New South Wales
Mandatory Since February 2025
New South Wales has the most comprehensive AI disclosure requirements of any Australian jurisdiction, with mandatory rules across the Supreme Court, NCAT, Land and Environment Court, and District Court.
Practice Note SC Gen 23 — NSW Supreme Court
Issued 21 November 2024, commenced 3 February 2025. Applies to all proceedings.
Mandatory disclosure
Every affidavit, witness statement, and character reference must contain a disclosure statement about whether generative AI was used in generating its content.
Prohibited uses
Generative AI must not be used to generate the content of affidavits, witness statements, character references, or other material intended to reflect the witness's evidence. Expert reports must not use AI without prior leave of the Court.
Verification obligation
Legal practitioners must verify that every legal citation, case law reference, and legislative reference actually exists and is accurate. Verification cannot be completed by using a generative AI program.
Permitted uses
- Generating chronologies, indexes, and witness lists
- Preparing briefs or drafting Crown Case Statements
- Summarising or reviewing documents and transcripts
- Preparing written submissions, where authenticated by the author
Read Practice Note SC Gen 23 (PDF) →
NCAT — Procedural Direction 7
Effective 7 April 2025. Mirrors the Supreme Court requirements for all tribunal proceedings. NCAT handles over 80,000 applications per year.
Experts seeking to use generative AI must apply for leave of the Tribunal. If leave is granted, experts must maintain records including: prompts used, which parts of the report were AI-generated, software settings, and default values. These records must be included as an annexure to the expert report.
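The record-keeping obligation above lends itself to a simple structured record kept alongside the report. The sketch below is illustrative only: the class and field names are this example's own assumptions, not terminology prescribed by Procedural Direction 7.

```python
from dataclasses import dataclass, field

@dataclass
class ExpertAIUseRecord:
    # Illustrative structure for the records NCAT Procedural Direction 7
    # requires experts to keep. Field names are assumptions of this sketch,
    # not prescribed wording.
    tool: str
    prompts: list = field(default_factory=list)                # prompts used
    ai_generated_sections: list = field(default_factory=list)  # AI-generated parts
    software_settings: dict = field(default_factory=dict)      # settings applied
    default_values: dict = field(default_factory=dict)         # defaults left in place

    def as_annexure(self) -> str:
        """Render the record as plain text for annexure to the expert report."""
        lines = [f"Generative AI tool: {self.tool}", "Prompts used:"]
        lines += [f"  - {p}" for p in self.prompts]
        lines.append("AI-generated sections:")
        lines += [f"  - {s}" for s in self.ai_generated_sections]
        lines.append("Software settings: " +
                     ", ".join(f"{k}={v}" for k, v in self.software_settings.items()))
        lines.append("Default values: " +
                     ", ".join(f"{k}={v}" for k, v in self.default_values.items()))
        return "\n".join(lines)
```

Keeping the four required items in one record from the outset makes the annexure a mechanical export rather than an after-the-fact reconstruction.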
Read NCAT Procedural Direction 7 (PDF) →
Other NSW courts
The Land and Environment Court issued an identical practice note, effective 12 February 2025. The District Court issued General Practice Note 2, effective 18 December 2024. Amendments to the Uniform Civil Procedure Rules 2005 (Rules 31.4 and 35.3B) now formally prohibit AI-generated witness statements and affidavits.
Key case
In Valu v Minister for Immigration and Multicultural Affairs (No 2) [2025] FedCFamC2G 95, a lawyer submitted court documents containing AI-generated, non-existent case citations. The lawyer admitted the material was drafted with ChatGPT and filed without checking accuracy. The lawyer was referred to the NSW Legal Services Commissioner.
South Australia
Mandatory From January 2026
The Judges of the SA Supreme Court approved guidelines for the use of generative AI, adopted from 1 January 2026. The guidelines will be given force under Rules of the Court that apply to practitioners and litigants.
The guidelines apply to proceedings in the Supreme Court, District Court, Magistrates Court, Youth Court, Environment Resources and Development Court, and the Court of Disputed Returns.
The guidelines are comprehensive, with examples of both responsible and improper uses of AI, and encourage practitioners and litigants to approach AI ethically and responsibly.
Queensland
Practice Directions in Force
Supreme Court Practice Direction 5 of 2025, District Court Practice Direction 12 of 2025, and Planning and Environment Court Practice Direction 7 of 2025 are all in identical terms.
These apply to civil and criminal proceedings in all Queensland Courts and Tribunals, including the Supreme Court, District Court, Planning and Environment Court, Magistrates Court, Land Court, Childrens Court, Industrial Court, Queensland Industrial Relations Commission, and QCAT.
A key requirement: a responsible person must be identified by name at the end of all written submissions. It is not sufficient for a firm of solicitors to be named — an individual legal practitioner must be identified.
Key cases
In LJY v Occupational Therapy Board of Australia [2025] QCAT 96, Deputy President Judge Dann warned that litigants who include non-existent, AI-generated information in submissions or other material filed in the Tribunal may face serious consequences.
Victoria
Guidelines Issued
Victoria has issued guidelines rather than mandatory practice notes with the force of court rules. However, non-compliance with guidelines can still have consequences for practitioners.
VCAT issued Practice Note PNVCAT11, which describes acceptable and appropriate use of generative AI in all tribunal proceedings, including when disclosure is required.
The Supreme Court of Victoria issued guidelines for litigants on responsible use of AI in litigation in May 2024. The County Court issued matching guidelines in July 2024.
Key case
In a Victorian Supreme Court proceeding, a solicitor used AI to prepare opening submissions contrary to the Court's published guidelines. Justice Moore identified four authorities that did not exist. The solicitor subsequently lost their principal practising certificate.
Federal Court of Australia
Consultation Underway
The Federal Court has not yet issued formal practice notes on AI use, but has been actively consulting with the profession since early 2025.
In March 2025, the Chief Justice released a statement expecting responsible AI use and disclosure when required by a Judge or Registrar. In April 2025, the Judges announced they would consider developing formal practice notes or rules via an AI Project Group. Consultation submissions closed 13 June 2025.
The Law Council of Australia submitted a comprehensive response recommending a Practice Note rather than mere guidelines, a receptive attitude toward AI that still addresses its risks, and separate guidance for different court users, including practitioners, experts, self-represented litigants, and the judiciary.
A formal practice note or rule is expected to be issued in 2026.
Key case
In Luck v Secretary, Services Australia [2025] FCAFC 26, the Full Court redacted a false case citation from reasons for judgment, noting it 'may be a product of hallucination by a large language model' and taking steps to prevent the false information from being propagated further by AI systems.
Privacy Act Reforms
Automated Decision-Making Transparency
The Privacy and Other Legislation Amendment Act 2024, passed in November 2024, introduces new transparency obligations for automated decision-making.
Organisations must disclose in their privacy policies when automated decision-making is used in decisions that could significantly affect individual rights or interests. This includes AI systems used in employment, insurance, credit, and service delivery decisions.
These obligations take effect on 10 December 2026.
This is separate from the court disclosure requirements described above, but creates a parallel compliance obligation for any organisation using AI in decisions about individuals. The Office of the Australian Information Commissioner (OAIC) has been proactive in interpreting the Privacy Act in AI contexts and is actively regulating AI through enforcement rather than waiting for dedicated legislation.
How the EDS Standard helps you comply
The Evidence Disclosure Standard provides the practical process for meeting these requirements — documenting which AI tools were used, what content was AI-generated, what was verified, and creating an audit trail that courts and tribunals can review.
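As a rough illustration of that process (this sketch is not part of the EDS Standard itself, and every field name here is an assumption), an audit trail can be as simple as an append-only log of what was AI-generated and whether a named person has verified it:

```python
import datetime
from typing import Optional

def log_ai_use(trail: list, tool: str, artefact: str,
               verified_by: Optional[str] = None) -> dict:
    # Append one audit-trail entry recording an AI-assisted artefact and,
    # where applicable, the named individual who verified it. Field names
    # are illustrative, not mandated by any court or standard.
    entry = {
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "tool": tool,                 # which AI tool was used
        "artefact": artefact,         # what content was AI-generated
        "verified": verified_by is not None,
        "verified_by": verified_by,   # who checked citations and facts
    }
    trail.append(entry)
    return entry

def unverified(trail: list) -> list:
    # Entries still awaiting human verification before filing.
    return [e for e in trail if not e["verified"]]
```

Filtering for unverified entries before filing gives a quick checklist of outstanding verification work, which is the kind of reviewable trail courts and tribunals are increasingly asking for.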