What is data protection by design exactly? The basic idea is that the privacy impact of any processing activities should be considered when developing a new product, technology or service, from the outset and throughout the product's lifecycle. Security and privacy measures should be integrated into the project as it is built, rather than bolted on afterwards in a "checkbox" exercise. Companies and organisations who act quickly to understand the new regulatory requirement, and even embrace it with a proactive attitude, are in pole position to ensure their products and services are compliant in the brave new GDPR world.
The origins of data protection by design and the seven principles
The concept of data protection by design is far from new, with some of the initial discussion and consideration of the topic stretching back as far as the 1970s. What is new is that Regulation (EU) 2016/679 (General Data Protection Regulation – GDPR) now formally requires organisations to take privacy by design into account from the conception of a new product, technology or service (Article 25), rather than on a voluntary basis as under the previous regime of Directive 95/46/EC (recital 46). The shift from a recital to a fully-fledged article imposing a legal obligation is a positive step forward for data protection as a whole.
The modern version of data protection by design (and default) can be traced back to seven principles of privacy by design, as developed by the former Information and Privacy Commissioner of Ontario, Ann Cavoukian:
- Proactive not reactive, preventative not remedial
Being proactive means that data privacy pitfalls should be anticipated, planned for and prevented before they can materialise, rather than remedied on a reactive basis. An ancillary benefit of this approach is potential protection from the public exposure of data privacy issues that could cause reputational harm (e.g., the 2017 Equifax breach).
From the very outset of developing a new product, technology or service, organisations should begin planning the implementation of data-protection-enhancing measures.
- Privacy as the default
The maximum degree of privacy settings should be enabled by default when a user uses any system or accesses any service. This means that if the user does nothing to change the standard settings, their data remains protected to the greatest possible extent: no action is required on the part of the user to protect their privacy.
Privacy by default also extends to retention periods: personal data should only be kept and stored for as long as it is necessary for the operation of the product or service, which in practice often means creating processes to define and operate retention periods. Products, technologies and services should by default protect individuals' data to the maximum, even if organisations may still want to offer options for the data subject to relax these measures. Empowering data subjects with choice over what happens with their data is a keystone of the new data protection regime (a short configuration sketch illustrating these defaults follows this list of principles).
- Data protection embedded into design
Privacy measures should form an integral part of the whole system or service, rather than being tacked on at the end of the development cycle. The advantage of "baking in" these measures is that data protection becomes an integral part of the product, technology or service, affording the highest degree of protection from the very start.
- Full functionality, positive-sum, not zero-sum
The functionality of a product or service should not be diminished by trade-offs based on "false dichotomies" such as privacy versus security; rather, an approach should be adopted in which both can be accommodated in a "win-win" situation.
- End-to-end security for the lifecycle of the product
Privacy by design must consider security from "cradle to grave": information should always be afforded appropriate security throughout the lifecycle of the product, from collection through processing to final destruction, with no gaps where security measures are not applied to the data being processed. Selecting and applying the appropriate data security measures from the beginning of the project is essential to meeting this requirement. This principle interfaces with the first principle, whereby security measures should be in place to stop privacy incidents from happening in the first place, rather than patching them up once they have occurred.
- Visibility and transparency
Data subjects whose information is processed are entitled to be fully aware of what is actually happening with their personal data, from the point it is collected to the point it is expunged.
The GDPR takes an active role in enhancing visibility and transparency for data subjects by greatly expanding the rights over their personal data in Chapter III. Having robust processes for Chapter III rights, such as Data Subject Access Requests or Right to Erasure requests, is an important step in the privacy by design approach.
- Respect for user privacy
Privacy for the user should be the number one concern for the product, technology or service. The goal is to provide a user-centric experience, rather than one which harbours illicit data processing practices such as mass collection of data or invasive profiling.
Making the data subject feel like they are in charge of the product, technology or service, rather than just a number, is also a good way to build consumer confidence. Big data is coming under increasing attack for treating individuals like cattle, milking them for personal data which is then commoditised.
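To make "privacy as the default" and the retention point above more concrete, the sketch below shows one way a development team might encode privacy-protective defaults and a retention period directly in code. It is a minimal sketch only: the class, field names and 90-day period are illustrative assumptions, not values prescribed by the GDPR.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone


@dataclass
class PrivacySettings:
    """Illustrative per-user settings: the most privacy-protective values are the defaults."""
    share_profile_publicly: bool = False        # nothing is exposed unless the user opts in
    allow_analytics_tracking: bool = False
    allow_marketing_emails: bool = False
    retention_period: timedelta = timedelta(days=90)   # hypothetical retention period


def is_due_for_deletion(collected_at: datetime, settings: PrivacySettings) -> bool:
    """Return True once a record has outlived the retention period that applies to it."""
    return datetime.now(timezone.utc) - collected_at > settings.retention_period


# Usage: a user who never touches the settings keeps the maximum protection.
defaults = PrivacySettings()
print(defaults.share_profile_publicly)  # False
print(is_due_for_deletion(datetime(2018, 1, 1, tzinfo=timezone.utc), defaults))  # True (older than 90 days)
```

The design point is simply that the protective value is what the user gets when they do nothing; opting into wider sharing is an explicit action on their part.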
Translating the old principles into the new regulation
Whilst Cavoukian’s 7 principles form much of the backbone of modern data protection by design, Article 25 GDPR is the formal legal requirement for data controllers processing the data of EU data subjects. It provides:
- Taking into account the state of the art, the cost of implementation and the nature, scope, context and purposes of processing as well as the risks of varying likelihood and severity for rights and freedoms of natural persons posed by the processing, the controller shall, both at the time of the determination of the means for processing and at the time of the processing itself, implement appropriate technical and organisational measures, such as pseudonymisation, which are designed to implement data-protection principles, such as data minimisation, in an effective manner and to integrate the necessary safeguards into the processing in order to meet the requirements of this Regulation and protect the rights of data subjects.
- The controller shall implement appropriate technical and organisational measures for ensuring that, by default, only personal data which are necessary for each specific purpose of the processing are processed. That obligation applies to the amount of personal data collected, the extent of their processing, the period of their storage and their accessibility. In particular, such measures shall ensure that by default personal data are not made accessible without the individual's intervention to an indefinite number of natural persons.
- An approved certification mechanism pursuant to Article 42 may be used as an element to demonstrate compliance with the requirements set out in paragraphs 1 and 2 of this Article.
Whilst the seven principles exist as an abstract statement of what data controllers should do, Article 25 is the realisation of those principles, taking the concept further and obliging new technology to incorporate the fundamentals of data protection.
Key fundamentals of these provisions include the phrases "… the cost of implementation" and "… implement appropriate technical and organisational measures …". These phrases provide scope for companies and organisations of differing sizes and complexity to select the data protection practices best suited to their undertaking. For example, a small start-up with a relatively low turnover of personal data is not expected to apply the same level of technical and organisational measures as a large multinational dealing in big data. Conversely, companies dealing in big data should be paying close attention to the requirements of Article 25 when developing and deploying new products, technologies or services.
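As a purely illustrative example of what an "appropriate technical measure" might look like in practice, the sketch below applies pseudonymisation (a measure Article 25(1) names explicitly) together with data minimisation to a record before it is passed to an analytics process. The key handling and field names are simplified assumptions made for the example; note that pseudonymised data plus the key remains personal data under the GDPR.

```python
import hashlib
import hmac

# Hypothetical secret kept separately from the pseudonymised data set.
PSEUDONYMISATION_KEY = b"replace-with-a-securely-stored-secret"


def pseudonymise(identifier: str) -> str:
    """Derive a stable pseudonym from a direct identifier using a keyed hash (HMAC-SHA256)."""
    return hmac.new(PSEUDONYMISATION_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()


def minimise(record: dict) -> dict:
    """Data minimisation: keep only the fields needed for the stated purpose (illustrative field names)."""
    return {
        "user_pseudonym": pseudonymise(record["email"]),
        "country": record["country"],               # coarse location only
        "signup_month": record["signup_date"][:7],  # truncate the full date to YYYY-MM
    }


# Usage: the analytics team never sees the raw email address, phone number or full signup date.
raw = {"email": "jane@example.com", "country": "IE", "signup_date": "2018-05-25", "phone": "+353..."}
print(minimise(raw))
```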
Article 25(3) also contains a mechanism for the emergence of certification schemes for privacy by design. It will be interesting to watch how these certification schemes develop over the coming years and the impact they have on data privacy and protection, not only in the EU but on the international stage. In some cases, we may see a situation where organisations cannot win tenders or proposals without a seal or certification for data protection by design.
Finally, it is worth mentioning that data protection by design has also been incorporated into both the UK and Irish Data Protection Acts 2018, in section 57 (UK) and section 76 (Ireland). In Ireland's case this is simply an exercise in transcribing the GDPR into national law; for the UK, however, it means that whatever the legal status of the Regulation after Brexit, the commitment to data protection by design and default is assured regardless of the final shape of the UK's relationship with the EU.
Effective operation of data protection by design and Data Protection Impact Assessments
A Data Protection Impact Assessment (DPIA) is a tool to operationalise data protection by design. Conducting a DPIA at the beginning of a project can give a sense of how the overall product, technology or service will affect privacy. In some cases, a DPIA is mandated by the GDPR when a particular type of processing is proposed (see Article 35 GDPR).
A DPIA is designed to accomplish three goals:
- Ensure conformance with legal and regulatory requirements (the GDPR in this case)
- Provide an assessment of risks and lasting effects
- Evaluate measures and protections which can be taken to mitigate identified risks.
A DPIA is typically conducted by answering a specific set of questions based on privacy requirements and, in some cases, will run in parallel with a risk assessment. As with a risk assessment, a DPIA can be used to communicate and evidence to the highest levels of the organisation how a new product, technology or service will affect privacy, enabling them to account for privacy by design in the decision-making process, as well as keeping other members of the organisation aware of the overall impact.
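The GDPR does not prescribe a DPIA format, but to illustrate the question-driven approach described above, here is a minimal sketch of a screening step that decides whether a full DPIA is needed. The questions are loosely inspired by Article 35(3) and the decision rule is a hypothetical example, not regulator guidance.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ScreeningQuestion:
    """One screening question; a 'yes' answer indicates a higher-risk processing activity."""
    text: str
    answer: bool = False


def dpia_recommended(questions: List[ScreeningQuestion]) -> bool:
    """Illustrative rule of thumb: recommend a full DPIA if any high-risk indicator is present."""
    return any(q.answer for q in questions)


# Hypothetical screening questions loosely based on Article 35(3) GDPR.
screening = [
    ScreeningQuestion("Does the project involve systematic and extensive profiling with significant effects?", True),
    ScreeningQuestion("Does it process special categories of data on a large scale?", False),
    ScreeningQuestion("Does it systematically monitor a publicly accessible area on a large scale?", False),
]
print(dpia_recommended(screening))  # True -> carry out and document a full DPIA
```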
To conclude, the evolution of the original privacy by design concept has culminated in the GDPR now formally requiring data protection by design to be considered. Many companies and organisations may see this as just more red tape in the development and delivery of new products to market. However, those who embrace the changes and view data protection by design as an opportunity are in the best position to gain consumer confidence in a market where people are becoming more aware of data protection and the rights over their data.
For more information, visit the Trilateral Data Governance page and contact our team: dpo@trilateralresearch.com