The Swedish Data Protection Authority fined a school approximately EUR 20,000. The decision is currently available only in Swedish; we therefore report the news with a brief comment and without further details.
What happened?
A Swedish school used a facial recognition system on its students to verify their attendance. During the preliminary investigation by the Swedish supervisory authority, the school argued that the students had given their consent. The supervisory authority nevertheless closed the investigation by sanctioning the school.
What are the relevant aspects?
The starting point is the use of a facial recognition system. This expression usually refers to one or more algorithms based on Artificial Intelligence (AI) and Machine Learning (ML) through which it is possible to identify a person by acquiring an image of their face from a photo or a video. The algorithm processes the image using mathematical models and stores in a database data that refer to the person.
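As an illustration only (the details of the system used by the school are unknown, and all names below are invented for the sketch), facial recognition can be summarised as reducing a face image to a numeric "embedding" and comparing it against embeddings stored in a database; a sufficiently close match is what the GDPR calls "unique identification":

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(embedding, database, threshold=0.9):
    """Return the enrolled person whose stored embedding best matches, or None.

    `database` maps person -> stored embedding; note that each stored
    vector is itself biometric (hence personal) data under the GDPR.
    """
    best_name, best_score = None, threshold
    for name, stored in database.items():
        score = cosine_similarity(embedding, stored)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy 3-dimensional embeddings; real systems use hundreds of dimensions.
db = {"student_a": [0.9, 0.1, 0.2], "student_b": [0.1, 0.8, 0.5]}
probe = [0.88, 0.12, 0.21]  # embedding computed from a new camera frame
print(identify(probe, db))  # prints "student_a"
```

The `threshold` parameter is the design point where "a photograph" becomes "unique identification": below it the system reports no match, above it the person is identified.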
As is known, Regulation (EU) 2016/679 (GDPR) states in recital (51), among other things: “The processing of photographs should not systematically be considered to be processing of special categories of personal data as they are covered by the definition of biometric data only when processed through a specific technical means allowing the unique identification or authentication of a natural person”.
Therefore, when images portraying people’s faces are processed through specific technical means allowing their unique identification, the data processed qualify as “biometric”.
Article 4(14) of the GDPR defines biometric data as “personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images or dactyloscopic data”. In essence, what matters is how the image of a person’s face is processed.
Is it possible to process biometric data and in particular those deriving from facial recognition?
Generally speaking, the processing of personal data, including biometric data, is permitted only in compliance with the GDPR rules.
A coordinated interpretation of several GDPR provisions is therefore required. In particular, the processing of biometric data derived from a person’s facial image for specific purposes, after the data subject has been informed, requires one of the legal bases envisaged by article 6 of the GDPR as a condition of lawfulness of the processing.
Article 9(1) of the GDPR lays down a general prohibition on the processing of biometric data. However, article 9(2) specifies ten cases in which the processing is lawful; among them, the only one applicable to the present case is the first, namely the data subject’s “explicit consent to the processing of those personal data for one or more specified purposes”. Article 9 must be read in coordination with article 6, which indicates the conditions of lawfulness of processing, including the data subject’s consent. We should emphasise that the consent must be “explicit” (i.e., clearly and freely expressed by the data subject) and given “for one or more specified purposes” (the data controller therefore has to inform the data subject about the purposes of the biometric data processing).
Further coordination is required with article 7, which sets out the conditions for consent: it must be free, that is, expressed by the data subject without any conditioning, and the data subject also has the right to withdraw it at any time. The burden of proving the data subject’s consent to the processing of his or her personal data rests with the data controller.
It is worth noting that in Italy, after Legislative Decree 101/2018 amended the “Privacy Code” (Legislative Decree 196/2003) to align it with the GDPR, there are some provisions concerning biometric data with express reference to health. In the case of facial recognition, the processing of biometric data is possible if the data subject has given free and explicit consent for specific purposes. These are the prerequisites and conditions under the legislation on the protection of personal data.
Furthermore, facial recognition algorithms must comply with the principle established by article 25 of the GDPR, “Data protection by design and by default”, universally known as Privacy by Design (PbD). The controller shall therefore implement appropriate technical and organisational measures both when designing the algorithm and during the processing itself (and when determining the means of processing). Indeed, article 25 requires the controller to intervene “both at the time of the determination of the means for processing and at the time of the processing itself”, “in an effective manner”, and “to integrate the necessary safeguards into the processing in order to meet the requirements” of the GDPR and protect the rights of data subjects.
Why was the Swedish school sanctioned?
Schools generally carry out several processing activities for specific purposes in compliance with the conditions of lawfulness established by article 6 of the GDPR, adopting a suitable legal basis. Concerning students, one of those purposes may be verifying school attendance. It is necessary to examine how, and with which technical systems, the processing of personal data is carried out.
We have already said that facial recognition requires the data subject’s consent; if the students are minors, consent is given or authorised by the holder of parental responsibility over the child (consent given by a child who is at least 16 years old – or 14 in Italy – applies only in relation to information society services). Students must be able to attend school, follow the lecturers’ lessons and undergo appropriate checks to achieve the educational objectives. In this context, the student must not be influenced by the teachers or the school, either concerning the assessment of the objectives achieved or the educational path to be completed. If, absurdly, that were not the case, the consent expressed by the student could be invalid because it would not be freely expressed, that is, given without any conditioning, and therefore not validly given. Likewise, if the student gives his or her consent – apparently explicit and free – but conditioned by a reverential fear towards the teachers and the school, that consent would be vitiated.
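The age rule described above can be summarised in a small sketch (a toy illustration, not legal advice; the country codes and the boolean flag are assumptions introduced here for the example):

```python
def child_can_consent_alone(age: int, country: str,
                            information_society_service: bool) -> bool:
    """Toy version of the article 8 GDPR rule as described above.

    A child's own consent counts only in relation to information society
    services; otherwise the holder of parental responsibility must give
    or authorise it. The GDPR default age is 16; Italy lowered it to 14.
    """
    if not information_society_service:
        # e.g. facial recognition at school: parental consent is needed
        return False
    threshold = 14 if country == "IT" else 16
    return age >= threshold

# A 15-year-old can consent alone to an online service in Italy,
# but not in a member state applying the default age of 16.
print(child_can_consent_alone(15, "IT", True))   # prints "True"
print(child_can_consent_alone(15, "SE", True))   # prints "False"
```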
We presume that vitiated consent (given that the text of the commented decision is available only in Swedish) is the reason that led the Swedish supervisory authority to consider the processing through a facial recognition system unlawful. However, further aspects must be considered that could significantly “limit” the validity of the consent and the data subject’s rights. Since the student is also free not to give consent in the terms set out above, the concrete way in which face images are acquired by the facial recognition system must be evaluated. In essence, if a school positions a camera that acquires people’s face images in a common area of the building (entrance, hall or corridors) – that is, a place people frequently pass through to use the school’s services – a student’s refusal of consent would be rendered meaningless.
How could the school avoid acquiring the face of a student who had never given consent?
This aspect becomes even more important when the right to withdraw consent is taken into consideration: how could the school avoid acquiring the face of a student who has decided to withdraw consent if the camera is positioned so that it films everyone indiscriminately?
One of the aspects to take into account in this complex context, together with what is indicated above, is the security of processing under article 32 of the GDPR, which says: “the controller and the processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk”. In essence, the controller and the processor must take appropriate measures to reduce the risk “of varying likelihood and severity for the rights and freedoms of natural persons”. Security is one of the most relevant aspects with regard to the processing of personal data.
Can Italian schools use facial recognition systems?
As for Italy, we know that the Garante (the Italian DPA) already in 2016 (before the GDPR became applicable on 25 May 2018) published a vademecum entitled “La scuola a prova di privacy” (“The privacy-proof school”).
Unless it is possible to rely on one of the other cases provided by article 9(2) of the GDPR, in addition to consent, the processing of biometric data is not permitted. As mentioned, the processing is lawful only if one of the conditions indicated by article 6(1) of the GDPR applies, including (those that appear compatible with the present case):
6(1)(c) “processing is necessary for compliance with a legal obligation to which the controller is subject“; there must be a legal obligation.
6(1)(e) “processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller”; schools carry out a task in the public interest. This condition must be coordinated with article 9(2)(g), namely “processing is necessary for reasons of substantial public interest”; the Italian Privacy Code, as amended by Legislative Decree 101/2018, mentions in article 2-sexies (Processing of special categories of personal data necessary for reasons of substantial public interest) the following hypothesis as a relevant public interest: “bb) education and training in the scholastic, professional, higher or university context”.
It does not seem easy, instead, to place the processing of personal data for the purpose of verifying school attendance within the case envisaged by article 6(1)(f), according to which “processing is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data, in particular where the data subject is a child”. This lawfulness condition – the legitimate interest of the controller – deserves particular attention, both with regard to recital (47) and to the work of the European Data Protection Board (EDPB, formerly the Article 29 Working Party). Article 6, as mentioned above, must be coordinated with article 9 and, among the cases foreseen, those that appear compatible with the activities of a school seem to be the following:
9(2)(e) “processing relates to personal data which are manifestly made public by the data subject”; provided this can be proven.
9(2)(f) “processing is necessary for the establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity”; but in this case the purpose of the processing cannot be verifying school attendance, but more probably the protection of the school’s assets (e.g. by installing video surveillance systems).
9(2)(g) “processing is necessary for reasons of substantial public interest“; we refer to what we have previously highlighted.
In Italy, at the moment, no legislation allows schools to implement facial recognition validly and lawfully; the context and the techniques used should be carefully assessed case by case. Certainly, the decision issued by the Swedish Data Protection Authority, besides being the first on the point, constitutes an authoritative indication for pursuing awareness-raising work, especially in schools, on the risks connected to facial recognition systems, both for children and for adults.