In its practice note on the use of generative artificial intelligence (GenAI), the Federal Court of Australia said it has embraced the “beneficial use of technology in proceedings and its wider operations”.
“The court recognises that Generative AI has the potential to facilitate the just resolution of disputes by increasing efficiency in the conduct of litigation, reducing legal costs, enhancing access to justice and the quality of the administration of justice,” Chief Justice Mortimer wrote.
Last March, Chief Justice Mortimer announced the Federal Court was considering the development of guidelines and practice notes and began the process of consulting with the community, the legal profession, academics, legal service providers and technology companies.
In developing its approach, the Chief Justice said the Federal Court focused on balancing the administration of justice with the responsible adoption of emerging technologies, “while maintaining parties’ accountability for all materials that are filed or put before the court”.
The court made it clear that those using GenAI must have a “basic understanding” of its capabilities and limitations, ensure it does not adversely affect the administration of justice, and make disclosures.
Legal and non-legal users should also be aware that GenAI can produce results that are “not accurate, entirely fictitious or plainly wrong”.
“The presentation of false or inaccurate information to the court is unacceptable. It is inconsistent with the responsibility on all persons to not mislead the court or other parties,” Chief Justice Mortimer stressed.
“It is also likely to frustrate the just resolution of proceedings according to law and as quickly, inexpensively and efficiently as possible.”
Documents filed in proceedings must name the person or lawyer responsible for preparing them. That person will be expected to confirm that the cited legal authorities exist, that the evidence is admissible, and that statements about what the evidence proves are findings reasonably open to the Federal Court.
Witness statements, affidavits and expert reports must also disclose where GenAI has been used. On the latter, Chief Justice Mortimer said experts must only offer their own opinion and process of reasoning.
When it comes to confidential, suppressed or private information, users should carefully consider restrictions before inputting it into any tool.
Chief Justice Mortimer said there may be “serious consequences” for entering this information into GenAI tools, even if sharing that information with others was not intended.
“Where generative AI is used in a way that is inconsistent with this practice note or the court’s orders or directions, all persons should expect that there could be consequences including adverse costs orders and issues as to compliance with legal and professional obligations,” Chief Justice Mortimer added.
The Federal Court has proposed to hold a symposium in the coming months to discuss the challenges and benefits that are likely to arise in proceedings in the context of the new practice note.
This article was originally published on Lawyers Weekly.