You’re visiting the supermarket to buy milk. You observe the freshness of the flowers, fruit and vegetables upon entering. You pick up on the aroma of the bakery and treat yourself to a croissant, as well as a soft drink from the fridges. You pass through the clothing and electronics aisles and pick up some sale items. You finally reach the milk at the back and head to the checkouts, where you see some chocolate, crisps and magazines. You decide to treat yourself to those as well.
It’s only when you leave that you realise you’ve spent far more than you intended. Why? You’ve fallen victim to the so-called ‘dark pattern’ of the supermarket layout, which leads you past enticing non-essential items to reach what you actually need.
In the context of data protection, it is equally tempting for organisations to implement dark patterns, except in this context there are far-reaching legal, as well as ethical, consequences.
What are dark patterns?
Dark patterns are not defined in the UK/EU General Data Protection Regulation (GDPR). However, the California Privacy Rights Act (CPRA) defines a dark pattern as: “… a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.”
In other words, dark patterns are design elements that nudge users into making uninformed choices about their personal data that they did not intend to make, typically to their own detriment and to the organisation’s benefit.
Dark patterns are predominantly discussed in respect of electronic interfaces, but they can also occur within the hard copy equivalent, for example paper data collection forms.
Why are dark patterns a data protection concern?
Where organisations rely upon the data subject’s consent as a lawful basis for processing personal data under Article 6(1)(a) of the GDPR, they must ensure that such consent is: “… any freely given, specific, informed and unambiguous indication of the data subject’s wishes by which he or she, by a statement or by a clear affirmative action, signifies agreement to the processing of personal data relating to him or her …”
Consequently, as per the 2019 report “Shaping Choices in the Digital World” issued by the French data protection authority (CNIL): “… the fact of using and abusing a strategy to divert attention or dark patterns can lead to invalidating consent” under the GDPR. The CPRA, which is heavily influenced by the GDPR, explicitly provides that: “… agreement obtained through use of dark patterns does not constitute consent.”
Article 25 of the GDPR also requires organisations to integrate data protection into every aspect of their processing activities by implementing “data protection by design and by default.” On 27 April 2019, Giovanni Buttarelli, then the European Data Protection Supervisor, commented that: “The principle of privacy by design and privacy by default are fundamental components of the GDPR… dark patterns are a way for companies to circumvent these principles by ruthlessly nudging consumers to disregard their privacy and to provide more personal data than necessary. That is a problem we have to address.”
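In practice, “by default” means the most privacy-protective configuration applies unless the user actively changes it. As a minimal sketch only, using hypothetical field and function names rather than any prescribed implementation, a sign-up flow designed along these lines might record consent as follows:

```typescript
// Hypothetical illustration of "data protection by default":
// every optional processing purpose starts switched off, and
// nothing is inferred from the user's silence or inactivity.
interface ConsentChoices {
  marketingEmails: boolean;
  analyticsTracking: boolean;
  thirdPartySharing: boolean;
}

// Privacy-protective defaults: the user must actively opt in.
const defaultChoices: ConsentChoices = {
  marketingEmails: false,
  analyticsTracking: false,
  thirdPartySharing: false,
};

// Consent is recorded only for choices the user explicitly made,
// timestamped so it can later be evidenced or withdrawn.
function recordConsent(
  userId: string,
  explicitChoices: Partial<ConsentChoices>
): ConsentChoices {
  const recorded = { ...defaultChoices, ...explicitChoices };
  console.log(`Consent for ${userId} at ${new Date().toISOString()}:`, recorded);
  return recorded;
}

// Usage: only the marketing box was ticked; everything else stays off.
recordConsent("user-123", { marketingEmails: true });
```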
What are examples of dark patterns?
Dark patterns include, but are not limited to:
- ‘address book leeching’ which coerces the user into sharing their contacts’ personal data in order to use the service;
- ‘bad default’ options which manipulate the user to share unnecessary personal data;
- ‘bait and switch’ which deceives the user into thinking that their action will lead to a desirable outcome;
- ‘bundled consent’ which manipulates the user into hastily providing uninformed consent, for example presenting a single option to simultaneously agree to both the privacy policy and the terms and conditions (sketched in the example after this list);
- ‘confirm shaming’ which manipulates the user into accepting or not objecting, for example presenting options to click either: “Sign up to our newsletters and be fully informed!” or: “No, I prefer to remain in the dark”;
- ‘disguised content’ which deceives the user into selecting content or navigation by making it appear as something else;
- ‘false urgency’ which coerces the user into making a decision, for example displaying a countdown timer to the effect that the decision must be made within 5 minutes or it will be taken out of the user’s hands;
- ‘forced registration’ which coerces the user into registering for an account to use a service when it is not strictly necessary;
- ‘hidden stipulations’ within complex privacy notices that are deliberately hard to understand as they may otherwise discourage the user from disclosing their personal data;
- ‘immortal accounts’ which hinder or prevent users from deleting their account;
- ‘information milking’ which manipulates the user into providing more personal data than necessary to use the service;
- ‘intentional misdirection’ which deceives the user into making a particular choice, for example using a dull grey colour to discourage selection of the standard option and a bright red colour to encourage selection of the non-standard option;
- ‘privacy zuckering’ which deceives the user into disclosing more personal data than they intend by making privacy settings complex and hard to interpret;
- ‘roach motel’ which makes it simple for the user to consent to the processing of their personal data, but difficult to withdraw consent; and
- ‘shadow profiles’ which are hidden profiles of users with additional personal data of which they are unaware.
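To make the ‘bundled consent’ and ‘bad default’ patterns concrete, the sketch below contrasts the two designs. It is illustrative only; the form fields are hypothetical and not drawn from any particular service:

```typescript
// Dark-pattern design (hypothetical): a single pre-ticked box bundles
// the terms and conditions, the privacy notice and marketing consent,
// so simply proceeding appears to "consent" to everything at once.
const bundledForm = {
  acceptTermsPrivacyAndMarketing: true, // pre-ticked 'bad default'
};

// Unbundled alternative: each purpose is a separate, unticked choice,
// and agreement to the terms is not conflated with consent to processing.
interface SignUpForm {
  acceptTerms: boolean;        // required to use the service
  consentToMarketing: boolean; // optional, off by default
  consentToAnalytics: boolean; // optional, off by default
}

const unbundledForm: SignUpForm = {
  acceptTerms: false,
  consentToMarketing: false,
  consentToAnalytics: false,
};

// Only the terms gate access; the optional consents never do, which
// helps keep each consent "freely given" in the Article 6(1)(a) sense.
function canProceed(form: SignUpForm): boolean {
  return form.acceptTerms;
}
```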
Countering dark patterns
In light of this, it is important that organisations ensure that:
- they integrate expectations of basic data protection knowledge into recruitment and promotion processes for their IT designers and marketing staff;
- their data protection teams routinely collaborate with their IT designers to ensure data protection by design and by default; and
- their data protection teams and IT designers receive continual support from senior management in respect of data protection by design and by default, particularly where disagreements arise with other teams that may stand to benefit from dark patterns, for example marketing and sales.
Trilateral’s Data Protection and Cyber-Risk Team has significant experience supporting organisations with privacy by design and by default. We offer a range of data governance services, including audits and assessments, compliance support and robust training programmes to help employees and senior management foster a data protection culture within your organisation. For more information, please feel free to contact our advisers, who would be more than happy to help.