Thus far, the European regulatory corpus applicable to facial recognition is composed of:
- 2012: Working Party 29, Opinion 3/2012 on developments in biometric technologies
- 2016: Directive (EU) 2016/680, the Law Enforcement Directive
- 2016: Regulation (EU) 2016/679, the General Data Protection Regulation (GDPR)
- 2019: EU Agency for Fundamental Rights (FRA), Facial Recognition Technology: Fundamental Rights Considerations in the Context of Law Enforcement
- 2020: European Commission (EC), White Paper on AI
This article focuses on the European legal regimes framing the use of such data, particularly in the context of facial recognition technologies.
Current legal framework applied to facial recognition
Member States of the EU operate under a unified definition of biometric data, regardless of which law applies. Biometric data are “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”.
Two different instruments are relevant to the processing of biometric data at the EU level. Under both, the lawfulness of a processing operation is legally framed and must be documented by the controller; it is not limited to providing adequate information to data subjects or to respecting their rights.
Under Article 9 of the GDPR, the processing of biometric data for the purpose of uniquely identifying a natural person, of data concerning health, of data concerning a natural person’s sex life or sexual orientation, and of data revealing racial or ethnic origin is in principle prohibited. However, the same article provides a limited number of cases where the prohibition may be lifted, for instance where the data subject has consented to the processing, where processing is necessary to comply with specific obligations or to exercise specific rights, to protect the vital interests of a natural person, or where only data manifestly made public by the data subject are used.
Under Article 10 of the Law Enforcement Directive, the logic is reversed. The processing of biometric data for the purpose of uniquely identifying a natural person, of data concerning health, of data concerning a natural person’s sex life or sexual orientation, and of data revealing racial or ethnic origin is allowed only where strictly necessary, subject to appropriate safeguards for the rights and freedoms of the data subject, and only: (a) where authorised by Union or Member State law; (b) to protect the vital interests of the data subject or of another natural person; or (c) where such processing relates to data which are manifestly made public by the data subject.
Prospective stakes and regulatory interventions
In 2012, the Working Party 29 pointed out that some biometric technologies, such as facial recognition techniques, could be considered mature, and identified several applications in law enforcement, e-government and commercial systems using biometric data for profiling, remote surveillance or even ambient intelligence. Developments in biometric technologies are closely tied to the continuous improvement of sensors, which allows the collection of new physiological characteristics and new ways to process such data.
Last September, the President of the European Commission, Mrs Ursula von der Leyen, presented her political guidelines and announced a coordinated European approach to the human and ethical implications of AI for innovation. At the beginning of this year, the release of the European Commission’s White Paper on AI detailed more precisely the scope of this future regulatory framework.
Although existing data protection laws will continue to apply to AI applications, certain updates will be necessary. Following a risk-based approach, these regulatory interventions should be confined to specific AI applications only. Their goal should be to address possible societal concerns about the use of AI, particularly in public spaces, and to avoid legal fragmentation in the internal market. The Commission announced a broad European debate on the specific circumstances, if any, which might justify such uses and on common safeguards to be adopted in Europe.
Among AI applications, remote biometric identification, which includes the deployment of facial recognition in public places, was clearly identified by the White Paper. In the Commission’s work, remote biometric identification is clearly distinguished from authentication. While authentication relies on the unique biological characteristics of an individual to confirm that individual’s identity, remote biometric identification is a means of establishing the identities of multiple persons at a distance, in a public space and in a continuous manner, by checking their biometric identifiers against data stored in a database.
The Commission’s paper highlights the stakes of remote biometric identification for the rights and freedoms of individuals, most notably people’s dignity and their rights to respect for private life and to the protection of personal data. There are also serious concerns regarding this technology’s impact on non-discrimination and on the rights of special groups, such as minors, the elderly and persons with disabilities. The White Paper does not merely address commercial uses: it also anticipates impacts on the rights and freedoms of individuals in the context of law enforcement, highlighting the technology’s impact on freedom of expression, association and assembly.
European Member States’ national adaptations
The announced regulatory interventions aim to prevent any serious legal fragmentation of the European data protection regime and to impose some oversight. These biometric technologies are now being trialled by some Member States, which has generated increased public scrutiny of these issues.
Nonetheless, until such legislation is effectively adopted and implemented, the European data protection regime does leave some margin for Member States’ interpretation and national specificities. This regulatory approach has been praised for giving Member States leeway to adapt certain instruments or provisions nationally; yet this flexibility could result in a loss of harmonisation or coherence.
First, concerning the Law Enforcement Directive, not only do Member States have leeway in the manner in which they choose to transpose the directive’s provisions, but Article 10 provides that the processing of special categories of personal data must be authorised by Union or Member State law. Consequently, Member States have a large margin of appreciation when designing laws enabling such processing, including the possibility to frame them strictly or to exclude certain practices or technologies.
Second, despite the appearance of a straightforward and EU-wide harmonised regime promised by the GDPR, Article 9 expressly provides an opening clause to the benefit of Member States, which has been thoroughly analysed by scholars and lawyers. This clause allows them to “maintain or introduce further conditions, including limitations, with regard to the processing of genetic data, biometric data or data concerning health”.
For instance, under this clause, the Italian implementation of the GDPR expressly incorporates safeguard measures specific to biometric data processing, including, among other things, the enumeration of relevant categories of documents produced by the Italian Supervisory Authority and the European Data Protection Board, and the criteria which must be used to determine and adapt such measures. The French Supervisory Authority also has the power to prescribe additional technical and organisational measures for the processing of specific data, including biometric data. Further conditions and limitations defined by national laws should be closely scrutinised in the coming years.
Furthermore, Article 9 of the GDPR also allows Union or Member State law to provide that consent is not a valid legal basis for lifting the prohibition on processing special categories of personal data, including biometric data. For instance, Spanish law already provides that, in order to avoid discriminatory situations, consent is not a valid basis for, among others, the processing of data revealing racial or ethnic origin or of data concerning a natural person’s sex life or sexual orientation. It remains to be seen whether Spanish judges will consider that biometric data can reveal such information, and whether other Member States will decide to introduce similar provisions concerning biometric data in their own national law.
Mathias Avocats will keep you informed of the evolution of this debate at the European Union level.