A recent Queensland body corporate dispute — involving a client of Hartley’s — is believed to have set a legal precedent after an adjudicator publicly questioned the apparent use of artificial intelligence in the preparation of submitted material.
While the specifics of the application have already been widely reported, one aspect of the ruling has drawn particular attention: an adjudicator’s criticism of what is thought to be the first instance of suspected AI-generated content influencing a strata decision.
In her judgment, Adjudicator Ingrid Rosemann noted that the applicant repeatedly relied on incorrect or irrelevant legislative provisions and misstated laws pertinent to the case. “The applicant cites quotes and reports that do not appear to exist or, if they do, the applicant has not produced them despite my requests,” she wrote. “The applicant cited cases as precedent for various propositions, but those cases either do not exist or have little or no relevance to this dispute. If the applicant used AI or other sources in preparing the material they submitted, I am not satisfied they checked the accuracy of the information obtained.”
The ruling underscores a growing concern within the strata sector, and likely well beyond it, about the rising volume of AI-generated correspondence, complaints, and submissions. Such material is often lengthy, produced in seconds, and riddled with errors, misunderstandings of legislation, or arguments built on fabricated or misinterpreted information.
According to Hartley’s, AI-generated documents are appearing with increasing regularity, and they are usually easy to identify. While the firm acknowledges the significant benefits of AI in day-to-day operations, it emphasises that responsibility for accuracy remains entirely with the person submitting the material. Documents may appear polished and full of “legalese”, yet the underlying reasoning can still be deeply flawed, with potentially costly consequences. As the recent Sky Gardens matter demonstrated, such missteps can result in financial penalties, reportedly up to $2,000 in this case.
The practical burden on strata managers is also a point of concern. Time-pressed professionals are being forced to spend hours reviewing extensive submissions generated at the click of a button, diverting attention from essential duties and diminishing their capacity to serve clients effectively. In an industry already facing tight timeframes and rising service expectations, this strain is becoming increasingly unsustainable.
Although the decision is unlikely to bring about immediate systemic change, it may serve as an important catalyst. By drawing attention to the challenges posed by unchecked or excessive AI use, it could pave the way for clearer guidelines, expectations, or precedents on how the strata sector — and potentially other industries — should handle large volumes of AI-generated material moving forward.