Tell me in a minute
The Government has (at last) introduced the first tranche of long-anticipated privacy reforms. While the substantive elements of this tranche of reforms are somewhat underwhelming, there are still some important aspects to be aware of, including new provisions that will:
- clarify the operation of information security obligations;
- streamline overseas data flows;
- require enhanced transparency where personal information is used for automated decision-making that may significantly affect the rights or interests of an individual; and
- implement new ‘doxxing’ offences for the distribution of personal data in a way that would be regarded as menacing or harassing an individual or group.
In this first alert in our series of deep dives into the privacy reforms, we explore these provisions to draw out the key points that businesses should be across. You can also read our general overview of the privacy reforms here.
A modest first tranche of privacy reforms
On 12 September 2024, the Government introduced hotly-anticipated privacy reforms in the shape of the Privacy and Other Legislation Amendment Bill 2024. The Bill has been widely characterised as anticlimactic, as the scope of reforms that it will enact is relatively limited compared to what many had anticipated based on the extended consultation process that preceded it. The Government has promised that the Bill only contains the first tranche of reforms and that additional more substantive reforms are to follow in a second tranche, but there will likely be a Federal election to navigate before that can happen. In the meantime, we have been busy dissecting the impact of the Bill as introduced.
Much of the commentary on the Bill to date has rightly focussed on the broader enforcement powers that it will confer on the Australian Information Commissioner and the new statutory tort for serious invasions of privacy that it will introduce. KWM’s expert litigators are separately preparing a detailed analysis of those aspects of the Bill, which will be available shortly. A number of other important features of the Bill – such as the requirement for the Australian Information Commissioner to develop a new Children’s Online Privacy Code and the power for Government to issue declarations to facilitate data sharing where required to effectively respond to a data breach – are contingent on further action being taken.
However, there are other substantive changes in the Bill that will have an immediate impact on businesses from a compliance perspective, and deserve detailed consideration. We have set out our thoughts on these areas for businesses to be aware of now in this alert.
Security, retention and destruction of information – Changes to APP 11
Under APP 11, entities are currently obliged to take ‘such steps as are reasonable in the circumstances’ to protect the personal information they hold from misuse, interference and loss and from unauthorised access, modification or disclosure.
The Bill will clarify that the reasonable steps required under APP 11 include both technical and organisational measures. The Explanatory Memorandum explains that:
- technical measures may include ‘physical measures, and software and hardware – for example through securing access to premises, encrypting data, anti-virus software and strong passwords’; and
- organisational measures may include ‘training employees on data protection, and developing standard operating procedures and policies for securing personal information’.
This aspect of the Bill is, in reality, more in the nature of a clarification than an amendment. It provides no greater specificity as to the reasonable steps that may be required to protect personal information in any given circumstance, but does make absolutely clear that both technical and organisational steps should be considered. The OAIC guidance on APP 11 already identifies that reasonable steps may include both those of a technical nature (examples given include ‘ICT security’, ‘access security’ and ‘physical security’) as well as those of an organisational nature (examples given include ‘governance, culture and training’ and ‘internal practices, procedures and systems’). Nonetheless, the clarification reflected in the Bill serves as a good reminder that businesses should take a holistic view of information security and should not adopt a narrow view that it is a technical problem to be solved by boffins alone. Indeed, notifiable data breach reports published by the OAIC consistently show that a material proportion of notifiable data breaches are attributed to human errors, rather than system faults or malicious attacks.
We think that this is a timely reminder for organisations to revisit their privacy and cyber security governance and reporting frameworks, as a stronger organisational focus on privacy and cyber security, rather than blind faith in technical controls, will provide the strongest protection for Boards and organisations against the risk of breaching APP 11.
Overseas data flows – Changes to APP 8
The Bill will establish a new regulation-making power under which the Government may prescribe the laws of a country or binding scheme that protects personal information in a way that, overall, is at least substantially similar to the Australian Privacy Principles. Companies bound by the Privacy Act will then be able to disclose personal information to an overseas recipient who is subject to the prescribed law or binding scheme without having to take any further steps under APP 8.1 to ensure that the recipient will manage the information in a manner consistent with the APPs.
As explained in the Explanatory Memorandum for the Bill, the purpose of this amendment is to reduce the burden on regulated entities (so that they do not have to make their own assessments as to the level of protection offered by overseas laws compared to the APPs – a complex task that, in our experience, most Australian businesses have been reluctant to undertake) and to ‘help establish Australia as a trusted trading partner and support Australian businesses to compete more effectively in international markets’.
There are a couple of points worth making about this welcome development:
- While it will help to support the flow of data from Australia, it will not necessarily facilitate the flow of data to Australia. Much has previously been written about the fact that the Australian Privacy Act has not been recognised as providing an ‘adequate’ level of protection by the standards of European data protection laws (because of, amongst other things, exemptions allowed for small businesses and employee records). The changes contemplated by the Bill will not change that position. While certainly helpful, the Bill will not on its own ‘solve’ the challenges surrounding the flows of data to and from Australia that are required to power our increasingly borderless digital economy. Other mechanisms to enable data flows will still be required.
- In practice, the existing Privacy Act does not present a great barrier to Australian entities disclosing personal information overseas.
- Under APP 8.1, such disclosures are permitted as long as the discloser takes reasonable steps to ensure that the overseas recipient handles the information in a way that is consistent with the APPs. The usual way of meeting this requirement is to enter a binding contract with the recipient under which they make undertakings as to how they will treat the information they receive. These contractual terms are typically quite straightforward, not difficult to negotiate, and provide commercial value from a risk allocation perspective even where not strictly required by law.
- However, there is a drawback to relying on APP 8.1 - when dealing with an overseas recipient who is not directly bound by the APPs (i.e. because they are not carrying on business in Australia), the discloser will under section 16C of the Privacy Act remain liable in Australia for any act or practice of the recipient that would breach the APPs. In other words, while the information may be out of their hands, the discloser will remain accountable in Australia for the recipient’s conduct.
- If the Bill is enacted, an entity in Australia will be able to disclose personal information to an overseas recipient who is subject to a prescribed law or binding scheme without having to worry about APP 8.1 and section 16C and the associated threat of being liable for the recipient’s actions hanging over their head.
- A related benefit of not having to rely upon APP 8.1 to support an overseas disclosure is that the discloser will not be obliged to comply with the data breach reporting regime under Part IIIC of the Privacy Act where the overseas recipient experiences an eligible data breach. Under section 26WC(1) of the Privacy Act, if an entity has disclosed information to an overseas recipient under APP 8.1, the discloser will remain directly accountable for data breach reporting obligations. If the Bill is enacted, disclosing entities will be freed of this responsibility when disclosing information to an overseas recipient who is subject to a prescribed law or binding scheme. Having said that, depending on the nature of the disclosure, in many circumstances, disclosing entities may prefer to control the notification of breaches to affected individuals and regulators, rather than leaving it up to an overseas entity to do so.
Of course, none of this will mean much unless the Government actually exercises its power to prescribe laws and/or schemes that are to be taken as providing protection that is substantially similar to the APPs. It is to be hoped that the Government will be proactive and prompt in exercising this power. Given the wide recognition of the GDPR as an international benchmark for data protection, we would hope that Europe and the UK would be close to the front of the queue (possibly along with other jurisdictions recognised by the European Commission as providing an adequate level of data protection by EU standards – Japan, NZ and South Korea, amongst others, fall into that category), and that other major trading partners of Australia with well-established privacy laws will also be considered.
Enhanced transparency for automated decision-making – Changes to APP 1
The Bill will require additional transparency to be provided by organisations who use personal information for making decisions by an automated process that could reasonably be expected to significantly affect the rights or interests of the individual in question. Specifically, the Bill will require that privacy policies disclose:
- the kinds of information used for that purpose;
- the kinds of decisions made ‘solely’ by automated means; and
- the kinds of decisions for which a thing that is ‘substantially and directly’ related to making the decision is done by automated means.
To comply with this requirement, an organisation will need to review all decision-making processes that rely on personal information and assess the extent to which they relate to decisions that both could reasonably be expected to significantly affect the rights or interests of an individual and depend solely or substantially on automated means. The organisation will then need to ensure that adequate information about these decision-making processes is included in its privacy policy. The fact that the Bill contemplates a 2-year implementation period for this change reflects the substantial amount of work that this may involve. These are some of the complexities that will need to be worked through:
As to whether a decision could reasonably be expected to significantly affect the rights or interests of an individual
The Explanatory Memorandum for the Bill explains that whether or not a decision can reasonably be expected to significantly affect the rights or interests of an individual will depend on the relevant circumstances. The assessment could, amongst other things, be affected by the nature of the individual (e.g. if they are a child or have another vulnerability) as well as the nature of the decision and the rights it affects.
Examples given in the Bill include decisions to grant or refuse a benefit (such as a right to enter the country or to receive a housing benefit), decisions affecting contractual rights (such as under an insurance policy), and decisions that affect access to a significant service or support (such as access to healthcare services). Notably, the Explanatory Memorandum suggests that in some circumstances decisions about targeted content and advertising may also be considered to have a significant effect on individuals, such as where it may result in them receiving differential pricing for the provision of, or access to, significant goods or services or it may limit their access to job opportunities (say as a result of filters applied by recruiters).
It is clear that the requirements will apply quite broadly and many decisions may be captured.
As to whether a decision-making process depends solely or substantially on automated means
Despite advances in technology, it may still be relatively rare for important decisions to be made solely by automated means – a human will often be in the loop at some point in the decision-making process, as a check and balance if nothing else. However, the enhanced transparency requirements will also apply where automation is used ‘substantially and directly’ to make the decision, so these requirements cannot be avoided simply by building in some incidental human involvement.
The Explanatory Memorandum makes clear that the reference to ‘substantially and directly’ sets two different threshold requirements:
- ‘substantially’ means that the automated aspect must be a key factor in facilitating the human’s decision making; and
- ‘directly’ means that the automated aspect must have a direct connection with making the decision.
As an example, the Explanatory Memorandum states that if a spreadsheet program is used to calculate a total sum that is used as one input in a decision-making process, then that may satisfy the ‘directly’ element but not necessarily the ‘substantially’ element. On the other hand, if the spreadsheet program is used to generate a ‘score’ about an individual that is relied upon as a key factor in the decision-making process then the ‘substantially’ element would also be satisfied.
The Explanatory Memorandum also makes clear that incidental use of software, such as a word processing program to record a decision, will not satisfy either requirement. Nonetheless, this requirement is obviously designed to operate broadly and it may capture many decision-making processes even where the ultimate decision-maker is a human, simply because in making the decision the human has relied upon guidance or reference points generated by automatic means.
Proposal not implemented in the Bill – An individual’s right to request meaningful information about how significant automated decisions are made
Interestingly, while this part of the Bill does give effect to two reform proposals that were agreed by the Government in its response to the Privacy Act Review Report, there is a third related proposal that was also agreed but does not feature in the Bill. This is the proposal to introduce a right for individuals to request ‘meaningful information’ about how significant automated decisions are made. The Privacy Act Review Report indicated that this could enable individuals to obtain more tailored information than may be available in privacy policies and similar public documents, including potentially an explanation of how a specific decision was reached. The underlying policy rationale behind this was to ‘ensure individuals have sufficient understanding about the rationale for automated decisions to enable them to exercise other rights, either under privacy law, such as the right to object, or other frameworks such as administrative or discrimination law.’ However, the Report went on to indicate that this objective could be pursued as part of broader work to regulate AI and similar automation technologies, which may explain why it does not feature in the Bill.
Notably, in the week before the Bill was introduced, the Government made two AI-related announcements:
- one launching a Voluntary AI Safety Standard based on 10 ‘AI guardrails’ to help organisations develop and deploy AI systems in a safe and reliable way now; and
- another commencing a consultation on a framework to introduce mandatory guardrails for AI in high-risk settings largely aligned with the Voluntary AI Safety Standard.
We have prepared a deep dive into the mandatory guardrails here.
In each case, there are guardrails focusing on transparency and contestability, so that end-users are informed about AI-enabled decisions and can contest decision-making outcomes. It is not yet clear what level of detail would need to be provided about individual decisions to comply with the guardrails and allow an adequate level of contestability, or whether this would rise to the level of transparency contemplated in the Privacy Act Review Report. This is an area that businesses using AI will need to keep a close eye on. As we have previously written, explainability is a key issue when deploying an AI system, and the ‘black box’ problem (where an AI engine may be considered to operate as a ‘black box’ because of the difficulty in explaining exactly how it arrived at a given prediction or decision) can be a difficult one to solve depending on the nature of the AI technology being used.
Visit the KWM map of AI regulation in Australia to stay across the different policy positions of the key players in the AI landscape.
Doxxing - New criminal offences under the Commonwealth Criminal Code
The Bill will introduce into the Commonwealth Criminal Code two new offences for ‘doxxing’ activities.
These offences will apply where a person uses a carriage service to make available, publish or otherwise distribute personal data (e.g. a name, photograph, contact information, or a work, education or worship location) about a person or a member of a group and in doing so engages in conduct that a reasonable person would regard as being menacing or harassing. The base elements of the two offences are closely aligned, with the second offence distinguished by an additional aggravating factor in that the conduct must be carried out in whole or in part because of a belief that a group is distinguished by race, religion, sex, sexual orientation, gender identity, intersex status, disability, nationality or national or ethnic origin.
These are serious offences. The first offence is punishable by imprisonment for up to 6 years while the second offence, given the additional aggravating factors, is punishable by imprisonment for up to 7 years.
Notes in the Bill provide the following examples of the type of conduct that is being targeted:
- for the first offence: publishing the name, image and telephone number of an individual on a website and encouraging others to repeatedly contact the individual with violent or threatening messages; and
- for the second offence: publishing the names, images and residential addresses of members of a private online religious discussion group across multiple websites and encouraging others to attend those addresses and block entryways, or otherwise harass the members of that group.
These new offences were introduced in part in response to a recent incident where private details from a WhatsApp discussion group (including names, photographs and contact details) of Jewish community members were released online without permission in a way that encouraged targeting of those individuals.
This development may have little relevance to most businesses, given they will naturally never be directly involved in doxxing activities. However, organisations that play an intermediary role in enabling the distribution of information online may be incidentally impacted. The Explanatory Memorandum for the Bill makes clear that the concepts of ‘make available, publish or otherwise distribute’ are intended to be applied broadly. As such, the operator of a website on which personal data is posted so that it can be accessed by other users may well be considered to be involved in making available, publishing or distributing the data in question, even if they were not involved in deciding the content of the post.
While section 473.5 of the Commonwealth Criminal Code makes clear that a person will be taken not to be ‘using a carriage service’ by engaging in conduct solely in their capacity as a carrier, carriage service provider, ISP or hosting service provider, there is no similar protection for website operators. Of course, the new doxxing offences will only apply if a reasonable person would regard the offending conduct as being menacing or harassing towards an impacted individual or group. In the ordinary course, this may not extend to a website operator who plays no active role in deciding what content a user posts on the website. However, that may change if the website operator is notified of doxxing material on the website and refuses to act. In that case, it is not beyond the realms of possibility that the website operator could be considered to have endorsed the content or otherwise played a role in distributing content that could be considered menacing or harassing for the victims.
Even if that was not the case, accessorial liability may apply under the Commonwealth Criminal Code where a person’s conduct actually aids or abets an offence that has been committed and they intended or were reckless as to whether that would be the case. While the natural targets of any prosecution would be the users responsible for posting the doxxing material, website operators and other intermediaries should be wary of how they may be caught up in doxxing attacks, and should be prepared to swiftly remove offending material if necessary.
What’s next?
The Government has indicated that it is working on the next tranche of privacy reforms, and that there will be further targeted consultation with business groups and other stakeholders in due course. Indicators are that this time the consultation may involve sharing of exposure draft legislation before reforms are introduced to Parliament. Naturally, we will be following along closely to keep you up to date. In the meantime, you can always stay on top of the latest regulatory developments through the KWM Tech Regulation Tracker.
Be prepared and get your data house in order with our insights on Australia's privacy reforms.