Hong Kong reforms and tips for technology platforms
Data users and platforms are under increasing pressure to moderate content, or at least understand it, to be able to size and manage their risk. In Hong Kong, developments in relation to "doxxing" demonstrate the ongoing evolution in data privacy law in the face of technological change.
In this article, we examine:
- the backdrop to the reforms;
- the Hong Kong changes proposed; and
- how this is likely to impact large technology platforms.
Backdrop – the rise of malicious personal data disclosures
Doxxing – exposing personal information online with malicious intent – has been a strong focus of the Office of the Privacy Commissioner for Personal Data (PCPD) in recent years. Typical examples include social media posts containing photographs and contact details and, in some cases, additional information such as employment details and the identities of family members.
The PCPD received and handled more than 5,700 doxxing-related complaints between June 2019 and April 2021. On nearly 300 occasions it urged the operators of websites and other online forums to remove around 6,000 hyperlinks – with a success rate of some 70%. During the same period, the PCPD referred more than 1,460 cases to the Police for criminal investigation of suspected contraventions of Section 64(2) of the Personal Data (Privacy) Ordinance (PDPO).
Government lawmakers understandably wish to respond with an appropriate piece of criminal legislation. Care is also needed in designing the right legislative and enforcement structure to engender trust, effectiveness and systemic integrity.
In the meantime, various injunctions have been sought (and granted) to help limit the spread of the personal data of judicial and police officers.
How Hong Kong's privacy laws have evolved… but need to keep pace
Section 64 of the PDPO started life before doxxing was on anyone's radar. Even before the summer 2010 furore over the selling of personal data from a ubiquitous Reward Points programme – which led to the 2012 revision of the PDPO being somewhat diverted to accommodate new rules concerning direct marketing – the Consultation Documents on the Review of the PDPO published the previous year were grappling with specific behaviours, including:
- an employee of a company (a data user) obtains personal data of the company's customers without the consent of the company for sale to third parties such as direct marketing companies, debt collection agents, etc. for profit; and
- an employee of a hospital (a data user) obtains sensitive health records of patients without the consent of the hospital for sale to third parties.
The concern was that people with access to personal data by reason of their position (especially at work) might misuse it for personal gain or to cause loss to the employer. This led to Section 64(1), which has no utility against doxxing. To catch cases where the element of gain could not be shown, the 2012 revisions added Section 64(2) – the same idea, but with psychological harm as the operative element.
The section reads:
"64(2) A person commits an offence if —
- the person discloses any personal data of a data subject which was obtained from a data user without the data user's consent; and
- the disclosure causes psychological harm to the data subject."
Although tantalisingly close to addressing doxxing, the existing Section 64(2) is written to preserve the integrity of data held by "data users" (or data controllers, in international parlance). Those involved in doxxing using whatever Internet resources come to hand usually do not leave a trail showing which data user they obtained the relevant data from, and the question of that data user's consent (or lack of it) is both hard to establish and somewhat beside the point.
Furthermore, multiple exemptions muddy the waters in practical scenarios and make prosecution challenging – for example, a discloser can escape liability where they "reasonably believe[d] the disclosure was necessary for the purposes of preventing or detecting crime" or "had reasonable grounds to believe that the publishing or broadcasting of the personal data was in the public interest".
On top of this, harm continues where publicity continues – making "takedown" powers especially important.
In this regard, the PCPD complains that it is left to beg favours from internet platforms to combat doxxers and – as it has noted in the past – that, having no formal investigative or prosecutorial powers, it is hampered in providing the most effective redress for victims of doxxing. Of course, some might argue that the PCPD's success rate with voluntary takedown requests demonstrates the willingness of industry to limit harm, as well as the role that contractual "terms of use" imposed by large platforms can play in regulating behaviour in these privately operated (albeit "public") fora.
Proposals for reform
And so it came to pass that the Secretary for Constitutional and Mainland Affairs recently proposed changes to Section 64 to the effect that a person will be guilty of doxxing when:
- they disclose any personal data of a data subject without consent;
- the act was committed:
- with the intent to threaten, intimidate or harass the data subject or any immediate family member; or
- with the purpose to cause psychological harm to the data subject or any immediate family member; and
- the disclosure causes psychological harm to the data subject or any immediate family member.
Some commentators have noted that this could set too high a threshold, requiring proof of both intention AND effect. They note that the offence of criminal intimidation does not require prosecutors to prove the effect of the intimidation. The proposed penalties for the crime of doxxing would be a fine of HK$1 million and five years' imprisonment (the same as under the existing Section 64). However, sentencing for other breaches of the PDPO has rarely involved penalties greater than fines of HK$10,000 to HK$20,000.
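The cumulative structure of the proposed offence – conduct, a qualifying mental element, and actual effect must all be made out – can be sketched as a simple conjunctive test. This is purely an illustrative reading of the proposal as summarised above; the function and parameter names are our own labels, not statutory language, and nothing here is legal advice:

```python
def doxxing_offence_made_out(
    disclosed_without_consent: bool,      # conduct element
    intent_to_threaten_or_harass: bool,   # mental element, first limb
    purpose_to_cause_psych_harm: bool,    # mental element, second limb
    psychological_harm_caused: bool,      # effect element
) -> bool:
    """Every element must be satisfied: disclosure without consent,
    AND either limb of the mental element, AND harm actually caused."""
    mental_element = intent_to_threaten_or_harass or purpose_to_cause_psych_harm
    return disclosed_without_consent and mental_element and psychological_harm_caused

# The commentators' point: intent alone does not complete the offence,
# because the effect (actual psychological harm) must also be proved.
assert doxxing_offence_made_out(True, True, False, False) is False
assert doxxing_offence_made_out(True, False, True, True) is True
```

By contrast, an offence framed like criminal intimidation would drop the final parameter altogether, which is the lower threshold the commentators contrast it with.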
Multilateralism – could it help?
The complexities of data law are such that different jurisdictions are prioritising different behaviours and have differing policy priorities, such as consumer rights to transfer data, the treatment of sensitive data and the right to be forgotten. In Hong Kong, there has been less debate about these and other data law topics that have previously been put forward by the PCPD (see our February 2020 alert about those proposals, available here: https://www.kwm.com/en/hk/knowledge/insights/legco-reviews-pdpo-20200203). Over the past decade many more jurisdictions have enacted data laws. Whilst having a data protection law is a sine qua non for a jurisdiction to achieve a joined-up approach to data law with other nations, the prospect of a multilateral approach seems remote.
In addition, material philosophical differences regarding data use, data regulation and content moderation are emerging that make convergence of policy approaches challenging.
At the same time, alignment on certain broad areas would be beneficial and, in principle, achievable. There is also an opportunity to re-conceptualise data regulation, drawing on lessons learned from regulating other sectors concerned with holding and transferring things of value (namely, financial services) – for example, will we see higher expectations of privacy officers, akin to the "senior manager" and "manager-in-charge" regimes in the banking and securities sectors? Will the collection, storage, control and use of data better align with regimes that only permit "fit and proper" licensed persons with sufficient capitalisation to handle something of value? After all, there is a rapidly expanding gulf in regulatory treatment between someone who holds cash on deposit and someone who holds another person's biometrics. Already some regimes (such as Australia's Consumer Data Right) are approaching the topic far more innovatively. More of this will come.
Key considerations for technology platforms
The reforms in Hong Kong are part of a continuum of legal and regulatory frameworks grappling with new technologies, relationships and risks.
For technology providers, key considerations and practical measures include:
- Understanding the jurisdictions that require proactive versus reactive content moderation, and the circumstances in which such requirements arise.
- Assessing the degree to which moderation is technically achievable.
- Ensuring contractual terms are sufficiently flexible, and adequately specific, to accommodate requests by regulatory authorities to remove content.
- Designing appropriate internal review mechanisms and related policies and procedures.
- Considering engagement in transnational and local policy discussions on further developments.
- Ensuring appropriate reviews of novel features – for example, blockchain adds interesting opportunities such as non-fungible tokens (NFTs) that represent content, but requires a sophisticated approach to compliance when decentralisation and immutability are in play.
- Considering the impact of changes in indirect regulation – that is, standards that apply to your clients, particularly if they are financial institutions.
We are working closely with multiple technology platforms on their day-to-day compliance, dealing with regulatory requests and launching new products and services. Please contact us if we can support you.