An Australian tribunal has provided clarification of when, and in what circumstances, Australia’s Privacy Act 1988 (Cth) (Privacy Act) applies to facial recognition technology (FRT) used by private organisations in their premises in Australia.
Importantly for businesses operating in Australia, the Tribunal accepted that biometric collection without consent may be lawful where an organisation can rely on the “permitted general situation” exception (Permitted Exception). The test is whether the organisation reasonably believes the collection is necessary to prevent serious unlawful activity or to address safety risks. This does not require proof that FRT was the only option, but the belief must be objectively supportable and evidence-based.
Bunnings, a major Australian retailer, succeeded on this point because it could point to serious repeat offending, limits of alternative controls, and a targeted, proportionate system design, including rapid deletion of non-matches and restricted access to watchlists. The Tribunal stressed this was a fact-specific assessment and not a general endorsement of retail facial recognition.
However, the case is a warning that justifying collection is not enough. The Tribunal upheld breaches relating to notice, privacy governance and privacy policy disclosures. Generic “video surveillance” signage was inadequate, biometric use was not properly disclosed, and privacy compliance systems — including a formal privacy impact assessment — were not in place at rollout.
The Tribunal also confirmed that facial images used for automated matching — and the resulting biometric templates — are biometric information, and therefore sensitive information under the Privacy Act. Even very short-lived processing counts as “collection”, meaning FRT will almost always trigger Australian Privacy Principle obligations.
What clients should take from this decision
On a practical note, this decision tells organisations that high-risk security technologies like facial recognition can be used lawfully — but only if they can show their working. That means having a documented necessity case, a recorded proportionality assessment, a completed privacy impact assessment (PIA), clear safeguards (such as deletion and access controls), and upfront notice and policy disclosures in place before rollout — not patched in later. If those governance and documentation pieces are not properly resourced and evidenced, organisations should assume the program will fail Privacy Act scrutiny even if the security objective is valid.