Privacy by Design: designing data protection from the beginning rather than the end

In the 1990s, Ann Cavoukian, who later became the Information and Privacy Commissioner of Ontario (Canada), proposed the concept of Privacy by Design, which has since been applied in academic and professional work.(1)

In October 2010, the organisers of the International Conference of Data Protection and Privacy Commissioners unanimously adopted a resolution recognising Privacy by Design as an essential component of privacy protection. The European legislator has since integrated this notion(2) into the European legal framework as an essential tool for ensuring the confidence of the citizens of the Union.

Specifically, Privacy by Design consists of assessing the impact on privacy from the outset of a personal data processing project.

This preventive approach also aims to guarantee respect for privacy in the operation of technical systems, throughout the entire period of use of the data.

Privacy by Design also provides a default protection, Privacy by Default: the strictest privacy configuration should be applied without any action on the part of the end user. These two concepts were introduced into the GDPR in its Article 25 (3), which places on every controller the responsibility to implement appropriate technical and organisational measures to guarantee privacy.
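The Privacy by Default requirement can be sketched in code: every setting ships with its most privacy-protective value, so a user who never touches the settings screen is still protected. This is an illustrative sketch only; the field names and settings model are hypothetical, not a prescribed implementation.

```python
from dataclasses import dataclass

# Hypothetical user-settings model: every field defaults to the most
# privacy-protective value, so a user who takes no action at all is
# protected by default (the Privacy by Default idea of GDPR Article 25).
@dataclass
class PrivacySettings:
    profile_public: bool = False      # profile hidden by default
    share_location: bool = False      # no geolocation by default
    analytics_opt_in: bool = False    # no tracking without an opt-in
    marketing_emails: bool = False    # no prospecting without an opt-in

# A new account gets the strictest configuration with no action on the
# part of the end user; loosening any setting would require an explicit,
# recorded choice by that user.
settings = PrivacySettings()
print(settings.share_location)  # False
```

The point of the pattern is that the burden of action is inverted: the user acts to *weaken* protection, never to obtain it.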

These aspects constitute a real innovation that can facilitate the articulation with one of the key principles of the law, the principle of transparency, which requires that any information addressed to the public or to the data subject be concise, easily accessible and easy to understand, expressed in clear and simple terms and, where appropriate, illustrated with visuals. For the European legislator, transparency in the information delivered to the data subject must be understood both in content and in form. The challenge of this requirement for professionals is to overhaul the way these elements are made available to all users of digital products or services.

Legally, Privacy by Design is based on applying the principles of data protection from the beginning: integrating transparency about personal data and the collection of consent from the very design of the interfaces. This principle must therefore be an integral part of the product and is intended to enrich its features.

Unclear and/or inaccessible information will not be considered compliant; the mere presence of information is not enough: the way in which it is presented determines whether it complies with the regulations.

In its IP report of January 2019 (page 10)(4), the CNIL laboratory (LINC) put it in these terms: “Although Article 25 does not seem to explicitly address designers, it nevertheless points to the ‘design of privacy’, the way in which the different design techniques are used in the production of services for the protection of data, in particular with regard to the great principles of data transparency, consent and the rights of individuals.” The GDPR is therefore a gateway which places the user’s control over their data (above all, respect for informational self-determination) at the very heart of the regulation.

Design applied to the transparency of personal data and the exercise of rights must be a lever for implementing the concepts of Privacy by Design and by Default. A new approach to the management of personal data should therefore be adopted in order to make it more intelligible to end users.

For this reason, the design of infrastructures that involve the processing of a user’s personal data must integrate a duty of loyalty and transparency (5). As an example, Google was recently fined 50 million euros by the CNIL (6) for not having obtained valid consent in accordance with the GDPR when registering users on Android, and for failing to fulfil its obligation of transparency in terms of information.

In this case, the CNIL notes that the general design of the information chosen by the company does not make it possible to comply with the obligations of the Regulation. In fact, the information that must be disclosed to individuals under Article 13 is excessively dispersed across several documents: the Privacy Policy and Terms of Use displayed during the creation of the account, then the Terms of Use and Privacy Policy accessible a second time through clickable links appearing in the first document. These documents contain buttons and links that must be activated to obtain additional information, resulting in a fragmentation of information that forces the user to multiply the clicks needed to access the different documents. The user must then carefully review a large amount of information before being able to identify the relevant paragraph(s). The work does not stop there, since the user still has to cross-check and compare the collected information in order to understand which data is collected according to the different settings chosen.

In summary, GOOGLE is criticised as a controller for implementing an overly general information infrastructure that fails to meet the obligations of the Regulation in terms of information and transparency with regard to its users. The information provided by GOOGLE on these platforms was neither accessible nor understandable to a person without prior knowledge of privacy issues on the web.

This message sent by the CNIL was intended to highlight the link between regulation and design:

o The inadequacy of the design of the “CMP” (Consent Management Platform) interfaces intended for obtaining consent to comply with the information requirement.

o Design becomes part of the triangle of compliance (legal, technical and design), according to the statements made by the President of the CNIL on January 17, 2019 at the launch of the IP report.

o The supervisory authority also mentions that “the investigations correspond to the scenario chosen to carry out the online verification, namely the steps to be followed by the user and the documents to which the user has access during the initial configuration”. The reference to the design of the user interface built by GOOGLE was decisive and related to the requirements of Privacy by Design.

o Legal design and Privacy Tech (7) solutions, through Privacy Icons, can be the first response elements that allow an organisation to display, in a clear and graphically readable way, its commitment to the protection of its users’ personal data.

As a result, some tools are beginning to emerge to put design at the service of transparency, for example:

1. Different sets of icons exist to simplify understanding, such as those of the Association Privacy Tech;

2. The presentation of different layers of information: simple and fast information about the most important elements, with the possibility of going into more detail for those who wish to;

3. Contextual information and consent requests: not asking everything at once, but asking for consent when it is relevant and necessary;

Illustration of example 3: consider an application for streaming live videos online. The application may ask, at the moment the live video starts, whether the user wishes to activate and display geolocation, rather than asking for it when the app is installed. This limits the actions required at registration and increases the chances of obtaining consent, which is ultimately useful for the application.
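The live-video illustration above can be sketched as a small consent flow in which the geolocation prompt is deferred until the moment it becomes relevant. The class and function names are hypothetical; in a real mobile app the answer would come from a UI dialog rather than a function argument.

```python
from datetime import datetime, timezone

# Illustrative sketch of "contextual consent": nothing is asked at
# install time; the geolocation question appears only when the user
# actually starts a live video.
class ConsentManager:
    def __init__(self):
        self._consents = {}  # purpose -> (granted, timestamp)

    def has_consent(self, purpose: str) -> bool:
        return self._consents.get(purpose, (False, None))[0]

    def ask(self, purpose: str, answer: bool) -> bool:
        # In a real app `answer` would come from a consent dialog; the
        # choice is recorded with a timestamp to keep proof of consent.
        self._consents[purpose] = (answer, datetime.now(timezone.utc))
        return answer

def start_live_video(consents: ConsentManager, user_says_yes: bool) -> dict:
    # The consent request happens here, in context, not at registration.
    share_location = (consents.has_consent("geolocation")
                      or consents.ask("geolocation", user_says_yes))
    return {"streaming": True, "location_shown": share_location}

consents = ConsentManager()          # nothing was asked at install time
print(start_live_video(consents, user_says_yes=False))
# {'streaming': True, 'location_shown': False}
```

A refusal does not block the main feature: the stream still starts, only the location overlay is withheld, which is what makes the consent freely given.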

4. When obtaining consent, not only must the purpose of the consent be specified, but also the ways in which the data is particularly useful and the value the data subject will derive from it.

5. Clear legal language should be used. This implies that a 12-year-old should be able to understand what is stated, both during the consent phase and in the privacy policy.

6. As in any good design, these approaches must always be tested with the public, and one must be ready to adapt if necessary.
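The layered-information idea in point 2 can also be sketched as a simple data structure: a short first layer with the essentials, plus optional detail layers opened on demand. The notice text, structure and field names here are illustrative assumptions, not a prescribed format.

```python
# Illustrative layered privacy notice: a one-sentence first layer, and
# detail layers ("purposes", "retention", "rights") shown only when the
# reader asks for them. All content below is a made-up example.
NOTICE = {
    "summary": "We use your email to create your account. "
               "We never sell your data.",
    "layers": {
        "purposes": "Account creation, service e-mails, security alerts.",
        "retention": "Account data is kept until you delete your account.",
        "rights": "You can access, rectify or erase your data at any time.",
    },
}

def render_notice(expand=None) -> str:
    """Return the short first layer, plus only the layers the user opened."""
    lines = [NOTICE["summary"]]
    for key in (expand or []):
        lines.append(f"{key}: {NOTICE['layers'][key]}")
    return "\n".join(lines)

print(render_notice())                      # first layer only
print(render_notice(expand=["retention"]))  # one detail layer opened
```

The first layer stays short enough to be read in full, while nothing is hidden: every detail remains one interaction away.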

As such, the transparency process (8) implemented by the young French social network Jollyclick is innovative in the way it addresses its users about the control of their data.

Various companies, including Visions (9), offer methods and tools to address this issue and increase transparency. It is undeniable that any product based on personal data must put the design of its interfaces at the service of transparency, not only to be compliant but also to earn its users’ trust to process and share their personal data.

There is still a lot of progress to be made, with questions that are currently still to be answered, such as: 

  1. Icons: are they understandable? 
  2. Consent: what level of information should be provided? Should the data retention period be indicated or not? 
  3. Transparency: Should a common terminology on data processing be adopted? 

On the other hand, presenting information transparently must not become “privacy laundering” that masks poor organisational practices and mismanagement of consent and rights in general. The VisionsTrust tool checks the information system’s permissions to ensure that the application uses only authorised data at all times. Similarly, companies using VisionsTrust are subject to at least one biannual independent audit to ensure that all rights are properly respected by the company. More generally, the tool provides the means to guarantee the integrity of the system, compliance with the permissions granted by the user, and the circuit and operating conditions of the data.
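The permission-checking idea described above can be sketched as a gate that every read of personal data must pass through. This is a hypothetical illustration in the spirit of what the text describes, not the actual VisionsTrust API; all identifiers and data are invented.

```python
# Hypothetical permission gate: each field of personal data can be read
# only if the user actually granted permission for it; anything else is
# refused, regardless of what sits in the database.
PERMISSIONS = {"user-42": {"email", "first_name"}}   # granted by the user

USER_DATA = {"user-42": {"email": "a@example.org",
                         "first_name": "Ada",
                         "phone": "+33 6 00 00 00 00"}}

class PermissionDenied(Exception):
    pass

def read_field(user_id: str, field: str) -> str:
    """Return a field only if this user authorised its use."""
    if field not in PERMISSIONS.get(user_id, set()):
        raise PermissionDenied(f"{field!r} was not authorised by {user_id}")
    return USER_DATA[user_id][field]

print(read_field("user-42", "email"))   # allowed: permission was granted
try:
    read_field("user-42", "phone")      # never granted -> refused
except PermissionDenied as err:
    print(err)
```

Enforcing the check at the single point of access, rather than trusting each caller, is what makes an independent audit of "authorised data only" feasible.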

It is therefore advisable to promote active Privacy by Design standards. Standardisation and good practices should go beyond the presentation of consent. For example, guidelines are needed so that, when developing a search engine, a matching algorithm or a content recommendation algorithm, one can make sure that it works with a granularity of information that does not require access to all of the data. Organisational, technical and design measures must be fully aligned to effectively respect the principle of Privacy by Design; to remove one of these three parts is to contradict the very principle.
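The "granularity" point can be made concrete with a sketch of a matching algorithm that operates on coarse, derived attributes (an age bracket, a region) instead of the full raw profile (exact birth year, full postcode). The fields, thresholds and scoring weights are invented for illustration.

```python
# Illustrative data minimisation: derive only the coarse attributes the
# matcher needs, then compute the score from those, never from the raw
# profile. Brackets and weights are arbitrary example values.
def minimise(profile: dict) -> dict:
    """Keep only coarse attributes; drop exact birth year and address."""
    return {
        "age_bracket": "18-25" if profile["birth_year"] >= 2000 else "26+",
        "region": profile["postcode"][:2],   # coarse area, not an address
        "interests": sorted(profile["interests"])[:3],
    }

def match_score(a: dict, b: dict) -> int:
    """Score two users from their minimised profiles only."""
    score = 0
    score += 2 if a["age_bracket"] == b["age_bracket"] else 0
    score += 1 if a["region"] == b["region"] else 0
    score += len(set(a["interests"]) & set(b["interests"]))
    return score

alice = minimise({"birth_year": 2001, "postcode": "75011",
                  "interests": ["music", "sport"]})
bob = minimise({"birth_year": 2002, "postcode": "75020",
                "interests": ["music", "cinema"]})
print(match_score(alice, bob))  # 4: same bracket (2) + region (1) + "music" (1)
```

The raw fields never reach the matching step, so the algorithm's quality can be audited without the auditor, or the algorithm, ever seeing the complete data.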

Authors: Pape Drame (Hub For Health) & Matthias De Bièvre (Visions)


1 Ann Cavoukian, Privacy by Design: The 7 Foundational Principles, Information and Privacy Commissioner of Ontario, 2009. Resources/7foundationalprinciples.pdf (accessed 07/12/2018)

2 Privacy by Design is based on seven principles: ● Proactive not reactive – preventative not remedial; ● Privacy as the default setting; ● Privacy embedded into design; ● Full functionality – positive-sum, not zero-sum; ● End-to-end security – full lifecycle protection; ● Visibility and transparency – keep it open; ● Respect for user privacy – keep it user-centric.

3 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data

4 IP report of January 2019 of the CNIL laboratory, the LINC

5 “MEMENTOSAFE”, legal data protection blog, article by Marie-Claire Peroux entitled “First GDPR conviction: GOOGLE, January 21, 2019, SMEs concerned”

6 Deliberation n° SAN-2019-001 of January 21st, 2019 pronouncing a financial penalty against the company GOOGLE LLC

7 Non–profit organisation that develops collaborative innovation projects to identify, promote and jointly develop legal-technical solutions for the protection of privacy on the Internet. As such, it offers its ecosystem icons to graphically display its commitment to privacy.

8 Article by Matthias De Bièvre entitled “Jollyclick, a young French social network: 1st implementation of ethic data” èreimplémentation-de-la-data-ethic-511afed86457

10 Banck and D. Rahmouni, “The new GDPR: a tool for soft digital legislation in Europe”, Revue Lamy droit de l’immatériel n°151, published in September 2018.