
Face Off: Lessons from the Bunnings privacy determination

26 November 2024

5 min read

#Data & Privacy


CCTV footage released by Bunnings Group Ltd (Bunnings), showing customers committing violent acts against staff, set the scene for a key privacy determination made this week by the Privacy Commissioner (Commissioner).

In 2019, partly in response to such incidents, Bunnings introduced facial recognition technology (FRT) in 62 of its stores. The system captured images of store visitors to match them against an internally created database of ‘individuals of concern’.

However, the Commissioner found that Bunnings’ use of the FRT system breached the Privacy Act because Bunnings failed to obtain adequate consent from store visitors and was not sufficiently transparent about how the system was implemented. Bunnings has announced that it will appeal (or seek review of) the decision.

This case not only emphasises the obligations of entities collecting biometric information, but also provides broader lessons for employers and retailers looking to use similar technology in workplaces or retail premises.

How Bunnings’ FRT system functioned

Bunnings’ FRT system captured biometric data of all individuals entering its stores to address theft, safety concerns and threatening behaviour. When an individual’s facial image did not match any entries in Bunnings’ database of ‘individuals of concern’, the image was automatically deleted within an average of 4.17 milliseconds. If a match was found, an alert was generated within the store and subsequent action could be taken.
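
For illustration only, the sketch below shows the kind of match-or-delete flow described above. The function names, similarity threshold and alert mechanism are hypothetical placeholders, not details drawn from the determination.

```python
# Illustrative sketch only: hypothetical names and thresholds, not Bunnings' actual system.

MATCH_THRESHOLD = 0.9  # assumed similarity cut-off for flagging a possible match


def match_score(probe_embedding: list[float], enrolled_embedding: list[float]) -> float:
    """Hypothetical similarity measure between two face embeddings (placeholder)."""
    return sum(a * b for a, b in zip(probe_embedding, enrolled_embedding))


def alert_store_team(person_id: str) -> None:
    """Placeholder for the in-store alert described in the determination."""
    print(f"Alert: possible match with individual of concern {person_id}")


def process_capture(probe_embedding: list[float],
                    individuals_of_concern: dict[str, list[float]]) -> None:
    """Match-or-delete flow: alert on a match, otherwise discard the capture immediately."""
    for person_id, enrolled_embedding in individuals_of_concern.items():
        if match_score(probe_embedding, enrolled_embedding) >= MATCH_THRESHOLD:
            alert_store_team(person_id)  # a match generates an alert within the store
            return
    # No match: the captured image/template is deleted straight away
    # (the determination records an average deletion time of 4.17 milliseconds).
    del probe_embedding
```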

Key findings in the investigation

The Commissioner found that Bunnings breached several Australian Privacy Principles (APPs) under the Privacy Act 1988 (Cth) (Privacy Act) through its implementation of the FRT system. We outline these below.

Collection without consent

Bunnings breached APP 3.3 by collecting sensitive biometric information without obtaining valid consent from individuals, and no valid exceptions applied to this collection. Sensitive information under the Privacy Act includes biometric data used for automated recognition. The wholesale collection of this data, without specific notice or consent, was found to be excessive and unnecessary.

Failure to notify

Bunnings was also found to have breached APP 5.1 by failing to adequately notify individuals about the collection of their biometric data. While signage and privacy policies were in place, they did not sufficiently communicate Bunnings’ use of FRT, the purposes for collection or the consequences of refusing to provide biometric data. The decision referenced APP 5.2(b), (d) and (e), which highlight the need for clear, accessible and transparent communication.

Lack of compliance systems

The Commissioner identified systemic failures to implement practices, procedures and systems to ensure compliance with the APPs, breaching APP 1.2. For example, Bunnings did not conduct a privacy impact assessment (PIA) before implementing the FRT system, nor did it adequately document its privacy governance framework.

Failure to include relevant information in its privacy policy

Although Bunnings updated its privacy policies following the introduction of FRT, the Commissioner considered the updates insufficient, as they did not mention biometric information or the use of FRT, as required by APP 1.4(a) and APP 1.4(b).

Implications for organisations

Agencies and organisations using biometric systems on their premises or for employment-related purposes, such as timekeeping, access control or safety monitoring, should be well across the following lessons from this investigation.

Valid consent

Obtaining valid and informed consent before collecting biometric data from individuals is of key importance. Under the Privacy Act, consent is required to be voluntary, informed, current, specific and given with capacity.

In the employment context, the power imbalance between employers and employees raises questions about whether consent can truly be voluntary. The Fair Work Commission (FWC) case Jeremy Lee v Superior Wood Pty Ltd [2019] FWCFB 2946 highlighted this issue, with the FWC finding that the dismissal of an employee for refusing to provide biometric data was unfair under the Fair Work Act 2009 (Cth).

Necessity and proportionality

The Privacy Act requires entities to only collect personal information that is reasonably necessary for their functions or activities (APP 3.1). There is a need to critically assess whether biometric systems are necessary and proportionate to achieve legitimate business objectives.

Transparency through notification

The Commissioner’s findings in the Bunnings case demonstrate that vague or generic privacy policies and notices are insufficient. Notices must be tailored and specific to the biometric practices in question.

Conduct privacy impact assessments

The lack of a documented PIA appeared to be a factor in the Commissioner’s findings against Bunnings. A PIA helps identify and mitigate privacy risks by assessing the flow of personal information and ensuring compliance with privacy obligations. A PIA is not mandatory for organisations such as Bunnings, but in this case the Commissioner considered it ‘a reasonable step’ (within the meaning of APP 1.2) to ensure compliance with the APPs. At a minimum, she said, it was reasonable for Bunnings to have conducted a privacy threshold assessment and to document the reasons it believed a PIA was not necessary in the circumstances.

Proactive governance and monitoring

The Bunnings determination also highlights the importance of embedding privacy protections into governance frameworks.

Balancing privacy and safety

The Commissioner’s determination in this case is an important decision, emphasising the need for transparency, consent and robust governance when collecting sensitive biometric information. As the Commissioner said in her press statement, “just because a technology may be helpful or convenient, does not mean its use is justifiable.”

If you have any questions about your privacy policies or whether your security practices comply with the Privacy Act, please get in touch with our team below.

Disclaimer
The information in this article is of a general nature and is not intended to address the circumstances of any particular individual or entity. Although we endeavour to provide accurate and timely information, we do not guarantee that the information in this article is accurate at the date it is received or that it will continue to be accurate in the future.
