Facial recognition and the Privacy Act: a clearer (but stricter) line for businesses

Contacts

Jonathon Ellis

Partner
Australia

I am an experienced litigation and investigations lawyer based in Sydney, leading Bird & Bird's Australian disputes and investigations practice and co-leading our global Defence and Security practice.

Kate Morton

Special Counsel
Australia

As a Special Counsel in our Tech & Comms, Commercial and Corporate Groups in Sydney, I advise our leading clients on a wide variety of technology, communications and IP law issues, where my experience as a software engineer and consultant gives me additional practical insight.

Mia Herrman

Associate
Australia

I am an associate in our Tech Transactions team in Sydney, specialising in technology, cybersecurity and privacy advisory work.

An Australian tribunal has clarified the circumstances in which Australia’s Privacy Act 1988 (Cth) (Privacy Act) applies to facial recognition technology (FRT) used by private organisations on their premises in Australia.

Importantly for businesses operating in Australia, the Tribunal accepted that biometric collection without consent may be lawful where an organisation can rely on the “permitted general situation” exception (Permitted Exception). The test is whether the organisation reasonably believes the collection is necessary to prevent serious unlawful activity or to address safety risks. This does not require proof that FRT was the only option, but the belief must be objectively supportable and evidence-based.

Bunnings, a major Australian retailer, succeeded on this point because it could point to serious repeat offending, the limits of alternative controls, and a targeted, proportionate system design, including rapid deletion of non-matches and restricted access to watchlists. The Tribunal stressed that this was a fact-specific assessment and not a general endorsement of retail facial recognition.

However, the case is a warning that justifying collection is not enough. The Tribunal upheld breaches relating to notice, privacy governance and privacy policy disclosures. Generic “video surveillance” signage was inadequate, biometric use was not properly disclosed, and privacy compliance systems — including a formal privacy impact assessment — were not in place at rollout.

The Tribunal also confirmed that facial images used for automated matching, and the resulting biometric templates, are biometric information and therefore sensitive information under the Privacy Act. Even very short-lived processing counts as “collection”, meaning FRT will almost always trigger Australian Privacy Principle obligations.

What clients should take from this decision

  • Assume facial recognition involves sensitive biometric collection, even if data is held briefly.
  • If relying on a Permitted Exception, document the necessity analysis and why less intrusive measures are insufficient. This should be done by way of a privacy impact assessment performed as part of the decision-making process.
  • Build proportionality and safeguards into system design from day one.
  • Do not under-invest in transparency, governance and documentation: failures here will still lead to privacy breaches. In Australia, the OAIC found the FRT use of another retailer, Kmart, unlawful, largely because notice, consent and proportionality assessments were not properly built and documented, even though the security objective was legitimate. The lesson is that under-resourcing these foundations, especially for biometrics, materially increases breach and enforcement risk. Similarly, 7-Eleven’s FRT was inadvertently reactivated for months due to governance and control weaknesses at the vendor/device level, reinforcing that under-resourcing oversight and documented controls can directly lead to biometric privacy breaches.

On a practical note, this decision tells organisations that high-risk security technologies like facial recognition can be used lawfully, but only if you can show your working. That means having a documented necessity case, a recorded proportionality assessment, a completed privacy impact assessment, clear safeguards (such as deletion and access controls), and upfront notice and policy disclosures in place before rollout, not patched in later. If those governance and documentation pieces are not properly resourced and evidenced, you should assume the program will fail Privacy Act scrutiny even if the security objective is valid.
