Open Access 2024 | Original Paper | Book Chapter

AI and Sensitive Personal Data Under the Law Enforcement Directive: Between Operational Efficiency and Legal Necessity

Author: Markus Naarttijärvi

Published in: YSEC Yearbook of Socio-Economic Constitutions 2023

Publisher: Springer Nature Switzerland


Abstract

In constitutional theory, the requirement of necessity is an integral part of a wider proportionality assessment in the limitation of constitutional rights. It fulfils a function of sorting out measures that restrict rights beyond what is required to fulfil the intended purpose. Within data protection, the requirement varies in strictness and interpretation—from ‘ordinary’ necessity to ‘strict necessity’. Recently, the European Court of Justice (ECJ) has introduced what appears to be an even stricter requirement of ‘absolute necessity’ relating to the processing of biometric information under the EU Law Enforcement Directive (LED). In practice, however, the implications of those respective levels of strictness tend to vary, from a strict ‘least restrictive means’ test, to an analysis of whether a measure is necessary for a more effective or a more efficient fulfilment of the intended purpose. In this contribution the principle of necessity as applied by the ECJ is analysed as it pertains to the LED and the Charter, more specifically in the context of implementing AI-supported analysis of biometric data. The gradual development of the interpretation of necessity is traced in the data protection case law of the ECJ. The study shows the increased emphasis placed on proportionality over time, highlighting both strengths and potential weaknesses of the requirement in relation to the use of AI-supported decision-making in the law enforcement context.
Notes
The original version of this chapter was revised: this chapter was previously published non-open access. The correction to this chapter is available at https://doi.org/10.1007/16495_2023_66

1 Introduction

The possibilities and pitfalls offered by the implementation of AI in government decision-making are currently at the forefront of political, legal, and academic interest. The expanding implementation of AI-assisted decision-making in Europe has been evident in state functions ranging from the administration of public benefits,1 to the prevention of terrorism and other serious crimes.2 Several factors contribute to making full automation challenging in the policing context, and the continued role of human decision-makers exercising judgment and discretion has consequently been highlighted in previous research.3 Even so, the stakes involved in police decision-making and the potential consequences for individuals affected by mistakes caused by algorithmic policing tools have brought to the forefront the need for functional safeguards for individual rights as well as the rule of law in this context.4 In line with this, the use of AI tools for law enforcement purposes has been highlighted in the proposed new EU AI Act (‘AIA’) as high-risk, with the proposal outlining several limits to its use.5 However, there are already existing rules and principles relevant to the use of such tools in the data protection framework of the EU, more specifically in Directive (EU) 2016/680 (‘the Law Enforcement Directive’, or ‘LED’).6 Recently, these rules have been subject to significant developments through ECJ case law, which, independently of the final form of the AIA, will place certain limits on the use of AI for policing, particularly those pertaining to the processing of sensitive personal data under the LED.
To better illustrate these developments and put them into perspective, we can turn to the specific example of automated facial recognition. This technology is of particular interest as it pinpoints a core challenge of law enforcement—the identification of unique individuals—and thus, in the words of Kotsoglou and Oswald, ‘has the potential to revolutionise the identification process, facilitate crime detection, and eliminate misidentification of suspects’.7 Facial recognition technology thus represents a clear example of shifting technological affordances,8 as technology now affords a type of rapid mass-identification of individuals which previously would have been impossible. This in turn shifts the implications of the wide availability of images and video.9 The technology is therefore simultaneously associated with significant risks, due to the potential consequences of a misidentification.10 For the individual, some of the most obvious risks relate to false positives: being wrongly identified as a suspect, which may trigger further, potentially far-reaching and intrusive investigative measures. For law enforcement agencies, false negatives imply a risk of failing to identify a suspect, while false positives will divert investigative resources towards unfruitful avenues or, worse, towards suspects who may wrongfully be charged with a crime.11 The risk of false positives has been said to be higher for individuals belonging to overpoliced communities, adding a discriminatory potential to the technology.12 There is also the risk that, through automation bias, the outcome of the application of facial recognition systems may be favoured in the assessment of evidence, even when faced with contradictory evidence.13
However, the potential harms of facial recognition technology go beyond false positives. The European Data Protection Board (‘EDPB’) has highlighted that the processing of biometric information implied by such systems constitutes a serious interference with fundamental rights regardless of the outcome of the processing, as do legislative measures allowing their use under Articles 7 and 8 of the Charter.14 The EDPB goes even further, pointing to how biometric data and facial recognition technology may impact the right to human dignity under Article 1 of the Charter, which requires that human beings are not treated as mere objects. This observation is based on how facial recognition technology ‘calculates existential and highly personal characteristics, the facial features, into a machine-readable form with the purpose of using it as a human license plate or ID card, thereby objectifying the face’.15 However, the EDPB does not mention the potential implications of this—the right to human dignity is inviolable, so should facial recognition technology be found to interfere with it, it cannot be justified under any circumstances.16 Still, the risks involved have so far not been held to generally preclude facial recognition systems being put to use by law enforcement agencies, as long as their use can be justified as a legitimate interference under the relevant fundamental rights, with the relevant data processed in line with the requirements of the LED, and authorised by law.17 However, certain applications of facial recognition technology have been highlighted as particularly invasive by the EDPB. These include remote automated biometric identification of individuals in public spaces, biometric profiling, emotion recognition, and the populating of facial recognition databases through scraping of social media accounts. While the specifics are at this point still under negotiation, many of these applications are likely to be specifically regulated by the upcoming AIA.18 However, in the shadow of these more contentious applications of biometric identification exists a range of applications that could be considered more mainstream. This includes the forensic matching of suspects appearing in videos or photos from a criminal investigation against existing police databases. Such applications may not raise the specter of mass surveillance to the same extent as those highlighted by the EDPB, but they still involve the processing of sensitive biometric data of individuals, the risk of false positives (and negatives), and what is generally perceived as AI technology deployed to reach law enforcement objectives.
One such automated facial recognition system has indeed recently been implemented by the Swedish police authority and will serve as our example of some of the specific issues involved under EU law. This system was introduced following a review in 2019 by the Swedish Authority for Privacy Protection (at the time named ‘Datainspektionen’, henceforth ‘the DPA’). The review came as a result of a request from the Swedish national police authority within the framework of ‘prior consultations’ required under the Swedish transposition of the LED before conducting processing operations of high sensitivity.19 The request concerned the use of facial recognition software to forensically match suspects in crime scene surveillance footage or other forensic imagery against the national database of criminal offender photographs (‘Nationella signalementsregistret’). This would, according to the police authority, be more effective than traditional and time-consuming manual analysis by human investigators.20 Having considered the request, the DPA eventually found in October of 2019 that the use of facial recognition was allowed. It based this conclusion primarily on the judgment of the ECJ in Case C-524/06, Heinz Huber,21 finding that the requirement of the processing being ‘necessary’ in the Swedish act transposing the LED should be interpreted as ‘something that is needed to effectively carry out a task’. The DPA consequently found that it was ‘clear that the planned processing using facial recognition technology to identify perpetrators is significantly more effective than individual officers making this selection manually’ and approved the use of the software in this context.22 Following this decision, the use of biometric matching was implemented within the Swedish police as a new forensic standard procedure in May 2021. Photos of unknown persons are now processed through facial recognition and matched against existing photos in the newly developed police database ABIS (Automatic Biometric Identification System).23
The finding by the Swedish DPA that enabled the implementation of this system raises, however, a few issues of both practical and theoretical importance, and highlights what appears to be a tension in the interpretation of the requirement of necessity under European data protection law, especially in relation to the processing of sensitive personal data in the law enforcement context.
First, there is the question of what level of necessity should be applied in considering the processing of biometric data under the LED. Article 10 of the LED states that such processing should be allowed only when it is ‘strictly necessary’. The Swedish transposition of these rules in the context of forensic investigations states that biometric data can be processed when it is ‘absolutely necessary’ for the purpose of the processing,24 which includes forensic matching.25 But the necessity level the Swedish DPA applied in its decision was that of ordinary necessity,26 as was the interpretation of the requirement by the ECJ in Heinz Huber, on which the Swedish DPA relied.27 Meanwhile, the ECJ has in numerous cases held that all derogations from the right to data protection must be limited to what is strictly necessary,28 and has recently established what seems to be an even higher level of necessity in relation to the processing of biometric data under the LED.29
Second, the decision by the Swedish DPA highlights the rather opaque relationship that seems to exist between the necessity requirement and effectiveness. Given that manual analysis of surveillance footage is possible, and indeed represents the way this has been done for years, one could argue that the use of facial recognition software is not so much necessary as it is more efficient. But is comparable efficiency enough to satisfy the requirement of necessity in this context?
Third, the analysis by the Swedish DPA contained no further assessment of proportionality stricto sensu, where the interest of the Swedish police in effectively analysing crime-scene footage would be weighed against the damage done to the right to data protection. Indeed, no such requirement seems to follow from the wording of the LED, where Article 10 only mentions that processing of sensitive personal data should be ‘allowed only where strictly necessary’. However, the ECJ has increasingly made use of proportionality balancing in relation to derogations from the right to data protection.30 How could these developments impact the legal requirements surrounding the use of biometric data and AI technologies in this context?
Analysing these questions allows for a fourth and final question: what impact will this have on the use of emerging AI technologies as decision-support or analysis tools in the law enforcement context, and on the intended function of the LED to protect personal data in this setting? This final question is salient in light of the proposed AIA, as it will determine what role the act will play as a further necessary safeguard in relation to the processing of sensitive categories of data through AI technologies.
In this paper, these questions will be analysed in light of case law from the ECJ interpreting the LED and the GDPR; guidelines from the European Data Protection Board; decisions from national data protection agencies; as well as literature on proportionality as a principle of constitutional and data protection law. The discussion will be focused on semi-automated decision-making, rather than fully automated decisions, as the former is (still) more prevalent in the context of both criminal investigations and police intelligence operations, where few decisions are taken without first having been reviewed by a human at some stage of the decision-making process. The focus on automation and the implications of AI in this context is thus on their role as potential decision-support and automated analysis tools, rather than on making final decisions without human input.
Before proceeding to the analysis of necessity, we need to take a quick look at the specific data protection context that surrounds processing of personal data within law enforcement agencies.

2 Data Protection in the Law Enforcement Context

2.1 The Law Enforcement Directive: A Very Short Introduction

The LED is part of the overall EU data protection framework along with its more famous cousin, the General Data Protection Regulation (GDPR).31 Given the special context of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, the directive served as a way of ensuring a more specifically tailored legal framework in that setting.32 Implicitly this gave member states further autonomy in choosing how to implement the necessary level of data protection. The LED has been described as ‘a major step forward’ in data protection showing ‘that it now will be possible to achieve high privacy and data protection standards while processing personal data for law enforcement purposes in a more flexible manner’.33 In comparison to the framework decision it replaced, the directive has been seen as increasing harmonisation as well as data protection standards.34 Yet some characteristics of the LED have also been described as challenging those positive outcomes, particularly ‘the lack of specific guidelines on how certain general concepts such as necessity, proportionality and appropriateness are to be implemented and applied by member states to balance privacy with security and other civil rights’.35

2.2 The Relative (Ir)relevance of the Rules on Automated Decision-Making

Before delving into the specifics of necessity in relation to the processing of sensitive personal data in the LED, it is worth pointing out the specific rules on automated individual decision-making (including profiling) in Article 11 of the LED. This rule contains what at first glance may appear to be a prohibition against the use of such decision-making, but one which is subject to several exceptions. First, Article 11 only refers to decisions based solely on automated processing, including profiling. The operative terms used in this rule are the same as in Article 22 of the GDPR, where the consensus has been that automated processes fall outside of the field of application of the article when they remain decisional support tools, ‘provided the human decision-maker considers the merits of the result rather than being blindly or automatically steered by the process’.36 Second, automated decisions as defined in Article 11 LED are still allowed if authorised by union or member state law which provides appropriate safeguards for the rights and freedoms of the data subject. Third, although a specific mention is made in Article 11(2) LED of how wholly automated decisions should not be made using the special categories of personal data referred to in Article 10, this prohibition contains a similar exception if ‘suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are in place’. While this type of decision-making is not the focus of this contribution, as the decision-support and analysis tools discussed here only support human decision-making, it is worth noting that the protections in Article 11 are circumscribed to the point where automated decisions, even ones based on sensitive personal data like biometrics, are fully possible to implement under the LED, although fundamental rights as interpreted by the ECJ may form an additional layer of protection.37

2.3 The Likely Limits of the Proposed EU AI Act

Before moving on to necessity under the EU data protection framework, it is worth briefly mentioning the intended relationship between the LED and the proposed AIA in this context. In relation to the processing of biometric data, the latest draft38 establishes new rules limiting the use of ‘real-time’ remote biometric identification of natural persons for the purposes of law enforcement in publicly accessible spaces. The rules do not include a complete ban against the use of AI technologies for this purpose, but limit its use through draft Article 5d to specific situations, such as the targeted search for specific potential victims of crime or the prevention of a specific and substantial threat to critical infrastructure, the life, health or physical safety of natural persons, or the prevention of a terrorist attack. It would also, according to draft Article 5d(3), be subject to prior judicial authorisation. In this context, the AIA would, according to draft recital 23, become lex specialis, and the use of such real-time remote identification could therefore not be based on Article 10 LED. However, all other processing of biometric data and other personal data involved in the use of AI systems for biometric identification, other than in connection with the use of ‘real-time’ remote biometric identification systems in publicly accessible spaces for the purpose of law enforcement, would, according to draft recital 24, fall under Article 10 LED. In other words, the LED will still form the main legal safeguard, with the AIA providing additional limits in certain high-risk contexts. As such, the forensic matching we are using as our example here would most likely still be primarily governed by the LED, rather than the AIA.

3 Necessity and Data Protection

3.1 Necessity, Strict Necessity, or Absolute Necessity?: The LED Meets the ECJ

3.1.1 The Early Discussion Surrounding Necessity in the LED

One of the previously mentioned flexibilities implied by the LED is that, in contrast to the GDPR, the processing of sensitive categories39 of personal data is not prohibited as a rule; rather it is restricted through Article 10 of the LED to situations where it is strictly necessary and subject to appropriate safeguards for the rights and freedoms of the data subjects. This differentiates sensitive data from the less stringent requirement of ‘ordinary’ necessity for the processing of non-sensitive data established in Article 8 of the LED. This reflects the legitimate need for law enforcement agencies to process sensitive categories of data in some of their tasks, from the use of fingerprints (i.e. biometric data) to identify suspects at crime scenes to the processing of data on the religious or ethnic background of victims in cases of hate crime. Given the sensitivity of this data, the requirement of strict necessity makes it clear that processing of such data should not be a routine measure.
This two-pronged approach to necessity stemming from the text of the LED has, however, become increasingly muddled in light of the case law of the ECJ in the data protection context. The ECJ has over time established strict necessity as a basic requirement for derogations from and limitations of the protection of personal data, without limiting this analysis to the protection of special categories of sensitive data.40 While the cases establishing this level of necessity have concerned primarily the previous data protection directive and the GDPR, the European Data Protection Supervisor (‘EDPS’) has concluded that strict necessity applies as an overall requirement within the context of the LED as well.41 But if ‘plain’ necessity should be construed strictly, what is then to be made of the specific reference to strict necessity in Article 10 LED? The conclusion of the Article 29 Working Party (the precursor to the current EDPB) was that ‘the term “strictly necessary” in Article 10 must be understood as a call to pay particular attention to the necessity principle in the context of processing special categories of data, as well as to foresee precise and particularly solid justifications for the processing of such data’.42 Recently we received some clarity in this matter, as the ECJ interpreted the requirement of necessity under Article 10 LED for the first time in a ruling that partly confirmed these views, but which arguably went even further—establishing what must be regarded as a new threshold of necessity above strictly necessary.

3.1.2 Absolute Necessity in Criminal Proceedings Against V.S.: Establishing a New Threshold

In Case C-205/21, Criminal proceedings against V.S., delivered in January of 2023, the ECJ considered a request for a preliminary ruling from a Bulgarian court on the interpretation of the LED.43 With regard to necessity, the ECJ interpreted the question posed by the Bulgarian court as whether Article 10 LED (read in conjunction with the general principles of processing personal data in Article 4 LED and the requirements of lawful processing in Article 8 LED) precluded national legislation which provided for the systematic collection of biometric and genetic data of any person accused of an intentional offence subject to public prosecution, specifically where national law established no obligation for the competent authority to determine and demonstrate that the collection was necessary for the specific objectives pursued and that those objectives could not be achieved by collecting only a part of the data concerned.44
The ECJ began answering this question by acknowledging the special sensitivity of both the data at issue and the context of their processing. These implied, the court held, significant risks to the Charter rights to respect for privacy and for data protection.45
Proceeding to the specific interpretation of ‘strictly necessary’ in Article 10 LED, the ECJ focused on the difference between strict necessity in that article and the requirement of necessity in the directive in general. Here, the court specifically highlighted certain linguistic factors. The court began by taking note of the French-language version of the LED, where the term used in Article 10 was ‘nécessité absolue’, which the court found established ‘strengthened conditions for lawful processing of sensitive data’.46 This led to two interesting conclusions:
Thus, first, the use of the adverb ‘only’ before the words ‘where strictly necessary’ underlines that the processing of special categories of data, within the meaning of Article 10 [LED], will be capable of being regarded as necessary solely in a limited number of cases. Second, the fact that the necessity for processing of such data is an ‘absolute’ one [(‘absolue’)] signifies that that necessity is to be assessed with particular rigour.47
The court then proceeded to tone down the significance of how the term ‘strictly necessary’ was used in certain other language versions, as those words also implied a strengthened condition. In this context the court also took note of how that requirement had been added late in the legislative process, implying an ambition to give greater protection to persons subject to such processing.48
It is interesting to note how the court in Criminal proceedings against V.S. has essentially established a brand-new standard of necessity, above strict necessity, thus resolving the terminological paradox of ordinary vs. strict necessity in the LED. In doing so, the court also made it clear that this level of necessity will only be reached in a limited number of cases, thereby signaling the exceptional nature of the processing of this type of sensitive data. The court went on to observe that this level of necessity entails particularly strict checking that the requirement of data minimisation under the LED has been met. This interpretation also confirms one made by the EDPS in its proposed guidelines on the use of facial recognition technology in the area of law enforcement,49 which were published for public consultation in mid-2022, prior to the ruling in V.S. Here, the implication of strict necessity under the LED was interpreted by the EDPS as the measure needing to be indispensable: ‘It restricts the margin of appreciation permitted to the law enforcement authority in the necessity test to an absolute minimum’.50 The EDPS also tied this to the requirement of objective criteria defining the ‘circumstances and conditions under which processing can be undertaken, thus excluding any processing of a general or systematic nature’.51
There is one more significant thing to note in the ECJ ruling in V.S. The court held that given the strict review that is warranted, it is also necessary to consider the nature of the objective pursued, to ensure that the processing is connected to ‘the prevention of criminal offences or threats to public security displaying a certain degree of seriousness, the punishment of such offences or protection against such threats’.52 Implicitly, the court here moves beyond necessity and into proportionality balancing, but without making any explicit reference to the latter principle. To understand the significance of this, we need to look at how the ECJ case law on necessity has developed up to the present ruling. As we will see, proportionality has played an increasingly important role in this case law, going from largely absent to explicitly used to limit restrictions of the right to data protection.

3.2 The Emergence of Strict Necessity and Proportionality in the ECJ Case Law on Data Protection

3.2.1 From Strict Necessity as an Interpretive to an Overall Requirement

In most of the cases leading up to Criminal proceedings against V.S. where the ECJ has expounded on the necessity requirement, it has done so in relation to national or union legislative measures derogating from the right to privacy and data protection. In the first case where the court mentions this application of strict necessity, Case C-73/07, Satakunnan, it did so in explaining the specific derogations allowed by specific chapters of the Data Protection Directive (DPD) as they were to be applied in the balancing against freedom of expression with regard to the publication of tax records by a newspaper.53
In order to take account of the importance of the right to freedom of expression in every democratic society, it is necessary, first, to interpret notions relating to that freedom, such as journalism, broadly. Secondly, and in order to achieve a balance between the two fundamental rights, the protection of the fundamental right to privacy requires that the derogations and limitations in relation to the protection of data provided for in the chapters of the directive referred to above must apply only in so far as is strictly necessary.54
This case can arguably be said to establish mainly that the specific derogations allowed by the DPD should be interpreted narrowly to ensure that they are limited to what is strictly necessary. The ruling in Satakunnan was however later referred to in Cases C-92/09 and C-93/09, Volker und Markus Schecke, which concerned the validity of regulations requiring the publication of beneficiaries of agricultural funds.55 This time the ECJ did not limit its statement to the DPD, but rather held that derogations and limitations in relation to the protection of personal data must apply only in so far as is strictly necessary.56 While widening the scope of the statement from the specific interpretation of the DPD to a more general statement, the ECJ also moved its analysis to the legislative level, i.e. an analysis of a requirement to publish information established by the EU in a regulation, which was found not to be strictly necessary to achieve the public interest aim pursued.57
In the subsequent case C-473/12, IPI, the ECJ referred to the requirement of strict necessity as ‘settled case law’ and related it to ‘the protection of the fundamental right to privacy’.58 Again, this case related to a question on the national legislative framework of data protection, specifically the possible exceptions established by that framework in relation to the activities of private detectives.59
Through these three cases, we have moved from the specific interpretation of the DPD to more general considerations about the fundamental rights to privacy and data protection, but in doing so, the court has also moved towards an application of the requirement in the context of legislative measures rather than specific processing operations within a specific legislative framework. This continued through Cases C-293/12 and C-594/12, the Digital Rights Ireland judgment, which also relates to the legislative level, this time the EU data retention directive.60
In 2014 the ECJ produced one of the few exceptions to the tendency to discuss strict necessity on the legislative level, case C-212/13, Ryneš.61 This concerned a specific processing, a home security camera installed by a private person which recorded the entrance to his home as well as a public footpath and the entrance to the house opposite, and whether this processing could fall under the exception in DPD Article 3(2) for ‘purely personal or household activity’.62 The ECJ again repeated its statement on the need for strict necessity, again in relation to the Charter right to privacy, but ended up with a more narrowly tailored implication of this strict necessity—namely that the exception in DPD Article 3(2) ‘must be narrowly construed’.63
In C-362/14, Schrems (I), the ECJ returned to the legislative level and applied strict necessity to the data transfer agreement between the EU and the United States, finding that generalised storage, transfer, and access of all the personal data of all the persons whose data has been transferred to the United States could not be strictly necessary for the objective pursued.64 Again, as the court moved to analysing a legal framework surrounding the protection of personal data, the strict necessity criterion seems to be used more actively and with more expansive consequences.
These cases indicate a tendency by the court to look towards the fundamental rights roots of data protection to resolve cases before it, rather than being confined to the more specific articles of the data protection rules in question.

3.2.2 Necessity, Proportionality, and Automation

An interesting development in the ECJ approach to necessity came the following year, in the 2016 Tele2 judgment (C-203/15 and C-698/15).65 Here, the court again considered the legislative framework surrounding the protection of personal data, this time the Swedish and British national rules on data retention for the purposes of investigating and preventing serious crime. Having up until this point treated necessity as a distinct requirement, the ECJ now found it to be explicitly derived from the principle of proportionality:
Due regard to the principle of proportionality also derives from the Court’s settled case-law to the effect that the protection of the fundamental right to respect for private life at EU level requires that derogations from and limitations on the protection of personal data should apply only in so far as is strictly necessary.66
It is reasonable to assume that the ECJ here talked about proportionality as a wider analytical framework, within which necessity forms a distinct step before proportionality stricto sensu.67 The ECJ went on to make observations surrounding the general and indiscriminate retention of all traffic and location data of all subscribers and registered users relating to all means of electronic communications.68 In doing so, the ECJ did not stray from a more narrow definition of necessity as a means to sort out derogations that unnecessarily impact the right to data protection, as the court did not begin to discuss balancing against opposing interests until later in the same judgment.69 Still, the explicit reference to proportionality had not been made before, and once mentioned in the Tele2 ruling, it returned in later cases.
In Opinion 1/15 on the Draft agreement between Canada and the European Union for the Transfer of Passenger Name Record data from the European Union to Canada, the court again reiterated that reference to the principle of proportionality, adding that it requires ‘in accordance with settled case-law of the Court, that derogations from and limitations on the protection of personal data should apply only in so far as is strictly necessary’.70 Again, the statement by the ECJ was made in the context of analysing legislative level frameworks and safeguards of data protection, to ensure interference is limited to what is strictly necessary. Interestingly, the court here added that ‘[t]he need for such safeguards is all the greater where personal data is subject to automated processing […] particularly where the protection of the particular category of personal data that is sensitive data is at stake’.71 This statement is directly relevant for our understanding of necessity in relation to the automated processing of biometric data implied by facial recognition technologies.
Similar concerns about automated processing and sensitive categories of data were later repeated in La Quadrature du Net,72 and in Ligue des Droits Humains.73 In these cases, the court was more explicit, with proportionality balancing forming an additional step after the assessment of strict necessity, adding some needed clarity in terms of its view on the process of proportionality analysis. Again, the court in these cases analysed strict necessity in relation to legislative frameworks derogating from the right to data protection, this time surrounding the access of law enforcement agencies to communications data in La Quadrature du Net, and passenger name records from airlines in Ligue des Droits Humains.74

3.2.3 Some Preliminary Observations on Absolute Necessity and Proportionality

Summing up the case law discussed so far, it seems quite clear that the ECJ views strict necessity as an overall requirement for derogations in relation to the rights to data protection and privacy. However, as this analysis also shows, the application of that requirement seems most consistent whenever the court has been asked to consider the overall legislative framework establishing such derogations. In these cases, the court has also moved towards explicitly engaging in proportionality balancing as an additional step after analysing the strict necessity of the legislative framework allowing for restrictions of the right to data protection. In the few cases where the court expounded on the specific processing of data within such frameworks prior to Criminal proceedings against V.S., the court similarly made mention of the strict necessity requirement, but applied it somewhat differently, mainly as a dictum to narrowly interpret the specific rules allowing for such derogation. For quite some time this suggested the possibility that the two-pronged necessity requirement of the LED, separating ‘ordinary’ from ‘strict’ necessity, could have held in the face of ECJ scrutiny, through a distinction between legislative measures that derogate from the right to data protection, which the ECJ requires to meet the threshold of strict necessity, and specific measures taken within the scope of such legislation, such as the LED and its national transpositions, where the necessity requirement would still depend on the category of data. In the latter context the requirement would primarily have been a call for a narrower interpretation in light of the rights of the Charter. We now know that the ECJ chose a different approach in Criminal proceedings against V.S., elevating ‘strict necessity’ in the context of Article 10 LED to what amounts to ‘absolute necessity’. Still, while the ECJ ruling answered some questions, it created a few new ones as well.
A first question arises from the fact that the ruling in V.S. did not contain any of the usual references to earlier case law establishing the requirement of strict necessity in relation to restrictions of the right to data protection. This is noteworthy, as references to, for example, Satakunnan, Volker und Markus Schecke and Eifert, and Digital Rights Ireland have become something of a formula when the court applies necessity in the data protection context. Also, the court never explicitly mentions the overall requirement of strict necessity repeated in those cases, choosing instead to differentiate between those articles in the LED referencing only ‘necessity’ and that of ‘strict necessity’ in Article 10, which requires strengthened conditions of scrutiny.75 This may be explained by how these often-mentioned cases relate to the DPD and the GDPR rather than the LED, and the need to separate the different contexts of processing. Still, in doing so, the ECJ missed an opportunity to clearly establish that (or at least clarify whether) the requirement of strict necessity does in fact carry across to the LED context outside of the scope of Article 10. However, one could argue that by establishing that ‘strict necessity’ in Article 10 should, in fact, be construed as ‘absolute necessity’, the ECJ implicitly suggests the need for something stricter than ‘strict necessity’—a need that reasonably arises due to the overall requirement of strict necessity in the data protection context.
A second question arises from how the ECJ avoided explicit mentions of proportionality balancing, instead opting to incorporate the weight of the purpose underpinning the processing of sensitive data under the heading of necessity. This constitutes a break from the trend established in ECJ case law ever since the Tele2 ruling. A possible explanation is that, given how the court is acting not on the legislative level but within the framework of the LED, the court maintained its more cautious approach by incorporating some concerns that traditionally have fallen under proportionality stricto sensu into the requirements under Article 10, rather than adding proportionality as a separate concern based on the application of Charter rights extraneous to the LED. While such an application of proportionality based on fundamental rights would carry additional weight, it is also something the court has generally reserved for cases where it assesses general legislative frameworks derogating from the right to data protection, rather than interpreting specific rules in secondary EU law. Also, by incorporating these considerations into the requirements of Article 10, the requirement is likely to assert itself further in the application of this article within law enforcement agencies that may be less inclined to consider the need for an overall proportionality assessment under fundamental rights documents.

3.3 Implications of Necessity

3.3.1 Necessity and Least Restrictive Means

Having established the standard of necessity required, we may now move to consider the actual implications of necessity, strict necessity, and absolute necessity, in the data protection context and in relation to the use of AI as decision-support tools.
To do this, we need to further consider what the necessity test implies. As suggested by the name of the test, it requires that a limitation of a right can in fact be proven to be necessary to reach the aim. The intention is to sort out unnecessarily intrusive measures, without having to resort to more complicated and controversial balancing exercises as part of proportionality in the strict sense.76 Implicit in this necessity analysis is also the least restrictive means test, which includes an analysis of whether alternative means exist which still contribute effectively to the intended aim but which would restrict the right to a lesser extent.77 If we can find such alternative means this is a clear indication that necessity is not met.
One example of where the ECJ has found a violation of necessity through the least restrictive means test concerned the public disclosure of penalty points for road traffic offences in Latvia. Asked to consider whether this system complied with the right to data protection, the ECJ compared the Latvian legislation to alternatives in other member states using less privacy sensitive preventive measures such as public awareness campaigns or driving tests to reach the same goal. Such measures did not carry the same risk of social disapproval and stigmatisation of the data subject, yet according to the ECJ there was no indication that the Latvian legislature had considered them. As such, the court found that the Latvian system was not strictly necessary.78
In other cases, such as those relating to data retention and access to communications data by law enforcement agencies, the analysis of necessity is somewhat intermingled with more overarching proportionality concerns, but certain parts of the ECJ jurisprudence in this context are clear expressions of necessity and questions of least restrictive means. For example, the retention of all communications data, for all users of communications networks, goes beyond what is necessary for the prevention of serious crimes or the protection of national security.79 As such, some type of criteria must be implemented that implies a possible link between the affected individuals and the crimes the measure is intended to prevent—otherwise it will impact a wider set of individuals than is necessary. The associated proportionality aspect of this equation is that the significant impact of these measures on the rights in question can only be justified by the fight against serious crimes or the protection of national security.80
It is not entirely clear how this requirement would translate into the facial recognition context. The Swedish police authority has stressed that searching through video footage in preliminary investigations will necessitate a scan of all faces that appear in the footage to identify an individual of interest to the investigation.81 In other words, the scan is conducted precisely to identify the possible link between an individual and the crime. To some extent this scan will, in the case of preliminary investigations, likely relate to a specific location relevant to the investigation of a crime. This could imply that there is a limit to the affected individuals and that at least a possible link exists between the person, through the location, to a possible crime. Geographical limits have been highlighted by the ECJ as a possible measure to make communications data retention conform to the necessity requirement of the Charter.82 However, in situations where the crime being investigated through the use of facial recognition technology has been committed in a public space with larger crowds present, such as in the case of riots occurring in the context of a political protest, this limitation is rendered less functional. In such situations, the processed images may also reveal political opinions of participants in the demonstration which adds further concerns to the existing processing of biometric data.83
It should be said that the way in which the ECJ has expressed the least restrictive means test adds a certain note of confusion to the relationship between ordinary and strict necessity. In Proceedings brought by B, the ECJ applied a test established in the discussion of ‘regular’ necessity in recital 39 of the GDPR to explain the implications for the strict necessity requirement emanating from the Charter. This requirement, the ECJ held:
is not met where the objective of general interest pursued can reasonably be achieved just as effectively by other means less restrictive of the fundamental rights of data subjects, in particular the rights to respect for private life and to the protection of personal data guaranteed in Articles 7 and 8 of the Charter, since derogations and limitations in relation to the principle of protection of such data must apply only in so far as is strictly necessary […]84
This may indicate that, in terms of the least restrictive means test, the threshold is similar under the two levels of necessity. In other words, the standard of ‘just as effectively’ for comparisons to other potential measures that could fulfil part of the legislative aim is the same. If so, the level of strictness or scrutiny that distinguishes necessity from strict necessity would apply primarily to the care with which courts and public authorities are expected to perform their review of necessity and the safeguards surrounding the measure. Essentially, it would limit the margin of appreciation permitted to the law enforcement authority in the necessity test to a minimum.

3.3.2 Effectiveness or Efficiency: Two Sides of the Same Coin?

The question of whether the standard of ‘just as effectively’ within the test of the least restrictive means can present a functional safeguard against unnecessary restrictions of the right to data protection will, ultimately, depend on the closer definition of ‘effectively’. A particular concern in this context is that moving from necessity to effectiveness may invite another subtle shift, whereby effectiveness is confused with efficiency.
We can find an instance of this conceptual confusion in the decision by the Swedish DPA, which highlighted operational efficiency—as in cost/time efficiency—in its opinion on the Swedish police use of facial recognition, stating that the use of the system would be ‘more effective than traditional and time-consuming manual analysis by human investigators’.85 In other words, the question was not whether the measure was essential to achieving the public interest, but rather whether it was a more efficient way of reaching the same result.86
In contrast, the EDPS has, in its toolkit for assessing the necessity of measures limiting the fundamental right to the protection of personal data, stressed that convenience or cost-effectiveness is not sufficient to reach the threshold of necessity.87 The EDPS expresses that necessity requires a measure to be ‘genuinely effective, i.e. essential to achieve the objective of general interest pursued’.88 Not only that, the toolkit holds that ‘[i]f the proposed measure includes the processing of sensitive data, a higher threshold should be applied in the assessment of effectiveness’.89 Questions of operational efficiency, such as the saving of resources, are issues the toolkit instead highlights as part of the proportionality analysis, as it is a question that ‘requires the balancing with other competing interests of public interest’.90
This analysis of the EDPS finds support in the theoretical literature on necessity as part of an overall proportionality assessment as well. Aharon Barak, for instance, holds that questions of costs are questions dealt with under the proportionality stricto sensu step of analysis.
Whenever the new means, whose limitation of the constitutional right is of a lesser extent, require additional expense, we can no longer conclude that the means originally chosen are not necessary. […] The issue, therefore, is whether the state’s choice of avoiding the additional expense in order to prevent the further limitation of a human right is constitutional. The necessity test cannot assist us in attempting to resolve this issue; indeed this discussion should be conducted within the framework of proportionality stricto sensu, which is based on balancing.91
As previously mentioned, there has been a move by the ECJ towards more explicitly acknowledging the need for proportionality balancing in the data protection context, which could accommodate these matters of efficiency as well. However, it could be argued that the ECJ has recently adopted an even stricter approach to necessity and operational efficiency.
In Case C-184/20 OT, the ECJ was asked to consider the publication of personal data of persons in charge of establishments receiving public funds, in a publicly accessible database on the Lithuanian Chief Ethics Commission’s website. The publication of this data was intended to allow for discovery of conflicting interests and combat corruption in the public sector. The disclosure forms published contained information about the declarant’s spouses, cohabitees, or partners, as well as information about presents received and transactions between partners, which may reveal sensitive personal characteristics, including sexual orientation.92 In its preliminary ruling, the ECJ focused extensively on the necessity in terms of whether the measure was the least restrictive means to reach the aim. This, the court held, was a question that had to be assessed ‘in the light of all the matters of fact and law specific to the Member State concerned’.93
One part of this discussion is particularly salient in this context. On the question of why the information in the database had to be available to the public rather than used only by anti-corruption authorities, the Lithuanian government had argued that the state did not have sufficient human resources to effectively check all the declarations that were submitted to it.94 Essentially, the Chief Ethics Commission counted on a form of crowdsourcing to allow for a more efficient use of human resources. The ECJ’s response to this approach was blunt and to the point.
However, it must be pointed out that a lack of resources allocated to the public authorities cannot in any event constitute a legitimate ground justifying interference with the fundamental rights guaranteed by the Charter.95
On its face, this statement by the ECJ could essentially exclude concerns about the efficient use of human resources or cost-saving measures from underpinning a claim of necessity. If so, this could carry far-reaching consequences for many different applications of technological methods of analysis, as many are implemented as cost-saving measures, or to reduce dependencies on limited human resources.
A more cautious reading may however be warranted. In the context of policing, as well as in other sectors, many cost-saving measures can plausibly be reframed as effectiveness measures. As an example, the police may rightfully argue that the use of facial recognition is the only effective way to quickly identify potential perpetrators in crime-scene footage. However, implicit in this argument hides the fact that the police will always be resource-constrained in comparison to their many tasks, with an associated need to prioritise resources.96 The argument can thus be reframed: the only way to quickly identify potential perpetrators within existing resource parameters may be to use facial recognition technology. This reframing does not, however, answer the question of whether this identification is accurate, or forensically valid,97 which ultimately will determine whether the method is in fact effective.
Furthermore, in the context of automating certain tasks, comparisons to manual processing may also invite arguments relating to tensions that may exist between different principles and values within the LED as such. One example can be found in the Swedish Police Authority’s opinion on the previously mentioned proposed EDPS guidelines on the use of facial recognition technology in the area of law enforcement.98 First, the Swedish Police Authority opined that the implications of the EDPS’s interpretation of ‘strictly necessary’ would be that the scope to use facial recognition technology would be limited to an absolute minimum. This would, the authority pointed out, ‘conflict with efforts of lawmakers and authorities to use the possibilities offered by technology to more effectively prevent and investigate crime and in doing so protect people’s fundamental rights and freedoms’.99 The Authority also pointed to secondary effects of manual processing:
In the Swedish Police Authority’s view, it should be taken into consideration that the alternative in certain cases to using software for image analysis to assist in search and analysis, for example, may instead involve a large number of officers going through huge quantities of images manually. Even if it is possible in theory (in some cases), this would mean needing to actually access a large number of irrelevant images and would require the images to be saved for a considerably longer period of time, which is difficult to reconcile with the general principles in data protection regulations.100
This argument is interesting as the use of facial recognition also requires access (by automated means) to a large number of irrelevant images, as well as biometric processing of all photos included in a database intended for comparison. Still, these arguments illustrate how issues of cost-efficiency can be restated as matters of effectiveness in relation to the more overarching aims of crime investigations, such as the positive obligations of the state to safeguard fundamental rights, including the protection of life, through effective measures to detect, prevent, and investigate crime.
Almost every part of policing, regardless of member state, is affected by the need to prioritise limited resources. The drive to implement technological systems will inevitably take place against this backdrop. As such, the impact of the statement by the ECJ in OT is still uncertain, but it is clear in the sense that arguments of cost-efficiency cannot single-handedly justify limitations of fundamental rights under the umbrella of necessity. However, systems that can be argued to also improve effectiveness—reaching the aim faster, or with better accuracy—while still having positive effects on cost-efficiency are likely to be accepted under this necessity analysis and will instead need to be analysed further, through balancing stricto sensu.

4 Conclusions: Necessity and AI-Supported Decision-Making Under the Law Enforcement Directive

The recent case law from the ECJ has added much needed clarification on necessity under the LED (see Fig. 1). While there are reasons to believe that EU data protection law can provide meaningful limits to the use of AI in law enforcement settings, the conflation of effectiveness and efficiency is likely to remain a risk. This is due to the way the requirement of necessity has been expressed in the LED and the GDPR, and not only in the context of sensitive data—with no mention of proportionality, although such a requirement is implicit in the limitation of fundamental rights. As we have seen, there are increasingly explicit mentions of proportionality balancing in ECJ rulings on necessity under data protection frameworks, as well as in the current guidelines from the EDPB. This is in line with how proportionality theory stresses that necessity is one (important) threshold requirement before moving on to proportionality balancing. Still, the absence of proportionality in the relevant sections of the LED and the GDPR may—as we can see in our example from the Swedish DPA—lead to proportionality balancing becoming lost in practice. This highlights the need for DPAs and courts to continuously acknowledge that ‘human rights cost money’,101 and to guard against strict necessity devolving into simple cost-effectiveness.
When the ECJ baked certain aspects of proportionality balancing into the analysis of necessity under Article 10 LED in Criminal proceedings against V.S., it may have countered that risk to some extent. It is a clear signal that the LED is not immune to proportionality considerations. It is also likely that the new threshold of ‘absolute necessity’ established in V.S. implies the need for much stricter scrutiny of necessity, particularly in connection with the emphasis on data minimisation. However, the emphasis the ECJ placed on the processing of sensitive data being a very clear exception rather than commonplace is likely to conflict with the ambitions of many member states in the law enforcement context.102 Through the ruling in OT, the ECJ has also highlighted that a lack of resources cannot in itself be enough to warrant restrictions of fundamental rights. Taken together, the rulings in V.S. and OT have planted seeds that hold the potential to grow into effective legal limits on the implementation of AI systems that impact the fundamental right to data protection, particularly in contexts where sensitive categories of data are used. However, to what extent those seeds will grow depends on the extent to which DPAs and courts engage in critically examining the purported benefits of AI-supported decision-making.
Looking at the development of the principle of necessity in ECJ case law, it seems clear that it has, over time, become a focal point of the Court’s reasoning in data protection cases. Simultaneously, the primary source of interpretation has shifted towards its fundamental rights foundations rather than its expression in secondary law. This is significant, as it mirrors how the ECJ has over time emphasised data protection rights in the face of large-scale data processing and surveillance measures established in secondary law (such as in Digital Rights Ireland and Tele2) or international agreements (such as in Schrems and in Opinion 1/15). While the ECJ has not been entirely consistent, it is still clear that the interpretation of the requirements of the LED will need to be read against this broader backdrop of fundamental rights.
Much attention has lately been devoted to the prospects of the upcoming AIA. However, the developments outlined here show that data protection concerns are likely to remain highly relevant to AI use by law enforcement even as the AIA arrives on the scene. The data protection framework brings several strengths compared to the AIA. First, unlike the AIA, it is largely technology agnostic and is not dependent on establishing the presence of a specific AI technology. Second, data protection can come into play already at the earlier stage of data gathering, as a form of processing, rather than only at the deployment of an AI system. Third, it will remain relevant across the different categories of risk proposed in the AIA. While the AIA is likely to provide that some uses of AI technologies, such as live ‘remote’ facial recognition in public spaces, should be forbidden (at least as a main rule),103 it should be noted that those same uses are likely to run afoul of the current data protection principles as well. It is in any case fair to say that the strength of the current data protection framework has not yet been fully tested in relation to emerging AI technologies, but it has the potential to act as a meaningful safeguard against the risks of such systems.
In another sense, however, the developments outlined in this contribution might be able to tell us something about what to expect of future interpretation of the AIA. As many of the articles of the act are built upon foundations found in fundamental rights, such as the respect for privacy and human dignity, we should not be surprised if those foundations slowly begin to emerge and influence the interpretation of the act as well. Perhaps it is through this process, rather than in any specific statutory addition to the current data protection framework, that the real potential of the AIA will be located.

Acknowledgements

This work was supported by the Swedish Research Council under Grant number 2020-02278. The author would like to thank Therese Enarsson, Johan Lindholm, Lena Enqvist, Jan Leidö, Isak Nilsson, and Felix Bockel, for comments on an earlier draft of this paper.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Footnotes
1
Schartum (2020), Suksi (2021), and Ranerup and Henriksen (2020).
 
2
See, for example, Directive (EU) 2016/681 of the European Parliament and of the Council of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime.
 
3
Enarsson et al. (2022).
 
4
See, among other examples; Ferguson (2017), Brayne (2021), Harcourt (2007), Naarttijärvi (2019), and Joh (2016).
 
5
See European Commission (2021) Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain union legislative acts.
 
6
(2016) Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA.
 
7
Kotsoglou and Oswald (2020), p. 87.
 
8
See Hildebrandt (2010), pp. 121–123.
 
9
See, for instance, Afra and Alhajj (2020).
 
10
See (2022) Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts—Preparation for Coreper, recital 48.
 
11
Moraes et al. (2021), p. 165; see also European Data Protection Board, Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, 26 April 2023, para 41.
 
12
See Zalnieriute (2021), p. 301.
 
13
European Data Protection Board, Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, 26 April 2023, para 41.
 
14
Ibid., paras 36–38.
 
15
Ibid., para 39.
 
16
Both the European Data Protection Board and the European Data Protection Supervisor have, however, argued for a ban on automated recognition of human features in publicly accessible spaces; see EDPB-EDPS, Joint Opinion 5/2021 on the proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act), 18 June 2021, paras 30–33.
 
17
See European Data Protection Board, Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, 26 April 2023, para 105.
 
18
See Sect. 2.3 below.
 
19
Polisdatalagen (‘The Police Data Act’) Ch. 3, § 7, 2 para.
 
20
Datainspektionen (2019) Förhandssamråd om Polismyndighetens planerade användning av programvara för ansiktsigenkänning mot signalementsregistret.
 
21
ECJ, Heinz Huber v Bundesrepublik Deutschland, Case C-524/06, Judgment of 16 December 2008.
 
22
Datainspektionen (2019) Förhandssamråd om Polismyndighetens planerade användning av programvara för ansiktsigenkänning mot signalementsregistret.
 
23
Swedish Police Authority (2022) Nytt forensiskt standardförfarande driftsatt. In: polisen.se. https://polisen.se/aktuellt/nyheter/2022/februari/nytt-forensiskt-standardforfarande-driftsatt/. Accessed 2 March 2023.
 
24
Lag (2018:1693) om polisens behandling av personuppgifter inom brottsdatalagens område, Ch. 6, § 4.
 
25
Ibid. Ch. 6, § 1.
 
26
The specific reference to necessity made by the DPA was to Ch 2, § 1, Brottsdatalag, which applies more generally to the processing of personal data within the police, rather than the specific rule on biometric data within forensic databases.
 
27
ECJ, Heinz Huber v Bundesrepublik Deutschland, Case C-524/06, Judgment of 16 December 2008.
 
28
See below Sect. 3.
 
29
See below Sect. 3.1.2.
 
30
See Sect. 3.2.2 below.
 
31
Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data and repealing Directive 95/46/EC (General Data Protection Regulation).
 
32
See recital 10 LED.
 
33
Sajfert and Quintel (2017), pp. 21–22.
 
34
Marquenie (2017), p. 328.
 
35
Ibid., p. 329.
 
36
Bygrave (2020), p. 533; Article 29 Data Protection Working Party (2017) Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, pp. 20–21; Brkan (2019), pp. 101–102.
 
37
The ECJ has begun asserting the fundamental rights implications of automated decision-making, see for example ECJ, Ligue des droits humains ASBL v Conseil des ministres, Case C-817/19, Judgment of 21 June 2022.
 
38
(2022) Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts—Preparation for Coreper.
 
39
Defined in LED Article 10 as personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation.
 
40
See ECJ, Tietosuojavaltuutettu v Satakunnan Markkinapörssi Oy and Satamedia Oy, Case C-73/07, Judgment of 16 December 2008; ECJ, Volker und Markus Schecke GbR (C-92/09) and Hartmut Eifert (C-93/09) v Land Hessen, Joined cases C-92/09 and C-93/09, Judgment of 9 November 2010; ECJ, Tele2 Sverige AB v Post- och telestyrelsen and Secretary of State for the Home Department v Tom Watson and Others, Joined Cases C-203/15 and C-698/15, Judgment of 21 December 2016, p. 2.
 
41
European Data Protection Supervisor, Assessing the necessity of measures that limit the fundamental right to the protection of personal data: A Toolkit, 11 April 2017, p. 7.
 
42
Article 29 Data Protection Working Party, Opinion on some key issues of the Law Enforcement Directive (EU 2016/680), WP 258, 29 November 2017, pp. 7–8.
 
43
ECJ, Criminal proceedings against VS, Case C-205/21, Judgment of 26 January 2023.
 
44
Ibid., paras 114–115.
 
45
Ibid., para 116.
 
46
Ibid., para 117.
 
47
Ibid., para 118.
 
48
Ibid., paras 119–120.
 
49
European Data Protection Board (2022) Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement.
 
50
Ibid., para 73.
 
51
Ibid., para 73.
 
52
ECJ, Criminal proceedings against VS, Case C-205/21, Judgment of 26 January 2023, para 127.
 
53
ECJ, Tietosuojavaltuutettu v Satakunnan Markkinapörssi Oy and Satamedia Oy, Case C-73/07, Judgment of 16 December 2008, para 56.
 
54
Ibid., para 56.
 
55
ECJ, Volker und Markus Schecke GbR (C-92/09) and Hartmut Eifert (C-93/09) v Land Hessen, Joined cases C-92/09 and C-93/09, Judgment of 9 November 2010.
 
56
Ibid., para 77.
 
57
Council Regulation (EC) No. 1290/2005 of 21 June 2005 on the financing of the common agricultural policy (Article 44a).
 
58
ECJ, Institut professionnel des agents immobiliers (IPI) v Geoffrey Englebert and Others, Case C-473/12, Judgment of 7 November 2013, para 39.
 
59
Ibid., paras 14–21.
 
60
ECJ, Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and Others and Kärntner Landesregierung and Others, Joined Cases C-293/12 and C-594/12, Judgment of 8 April 2014, para 52.
 
61
ECJ, František Ryneš v Úřad pro ochranu osobních údajů, Case C-212/13, Judgment of 11 December 2014.
 
62
Ibid., paras 13–18.
 
63
Ibid., para 28.
 
64
ECJ, Maximillian Schrems v Data Protection Commissioner, Case C-362/14, Judgment of 6 October 2015, paras 91–93.
 
65
ECJ, Tele2 Sverige AB v Post- och telestyrelsen and Secretary of State for the Home Department v Tom Watson and Others, Joined Cases C-203/15 and C-698/15, Judgment of 21 December 2016.
 
66
Ibid., para 96.
 
67
Barak (2012).
 
68
ECJ, Tele2 Sverige AB v Post- och telestyrelsen and Secretary of State for the Home Department v Tom Watson and Others, Joined Cases C-203/15 and C-698/15, Judgment of 21 December 2016, para 97.
 
69
Ibid., para 102.
 
70
ECJ, Opinion 1/15 of the Court (Grand Chamber) of 26 July 2017, Draft agreement between Canada and the European Union, para 141.
 
71
Ibid., para 141.
 
72
ECJ, La Quadrature du Net and Others v Premier ministre and Others, Joined Cases C-511/18, C-512/18, C-520/18, Judgment of 6 October 2020, para 130.
 
73
ECJ, Ligue des droits humains ASBL v Conseil des ministres, Case C-817/19, Judgment of 21 June 2022, para 115.
 
74
ECJ, La Quadrature du Net and Others v Premier ministre and Others, Joined Cases C-511/18, C-512/18, C-520/18, Judgment of 6 October 2020; ECJ, Ligue des droits humains ASBL v Conseil des ministres, Case C-817/19, Judgment of 21 June 2022; see also ECJ, GD v The Commissioner of the Garda Síochána and Others, Case C-140/20, Judgment of 5 April 2022, paras 52–54.
 
75
ECJ, Criminal proceedings against VS, Case C-205/21, Judgment of 26 January 2023, para 117.
 
76
Barak (2012), p. 338.
 
77
ECJ, Volker und Markus Schecke GbR (C-92/09) and Hartmut Eifert (C-93/09) v Land Hessen, Joined cases C-92/09 and C-93/09, Judgment of 9 November 2010, para 86. Barak has divided this test into two elements, (1) whether there is a hypothetical alternative means which equally advances the law’s purpose, and (2) whether there is a hypothetical alternative means which limits the constitutional right to a lesser extent, see Barak (2012), pp. 323–328.
 
78
ECJ, Proceedings brought by B, Case C-439/19, Judgment of 22 June 2021, paras 111–113.
 
79
ECJ, Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and Others and Kärntner Landesregierung and Others, Joined Cases C-293/12 and C-594/12, Judgment of 8 April 2014, paras 56–65; ECJ, Tele2 Sverige AB v Post- och telestyrelsen and Secretary of State for the Home Department v Tom Watson and Others, Joined Cases C-203/15 and C-698/15, Judgment of 21 December 2016, paras 105–108.
 
80
ECJ, Tele2 Sverige AB v Post- och telestyrelsen and Secretary of State for the Home Department v Tom Watson and Others, Joined Cases C-203/15 and C-698/15, Judgment of 21 December 2016, para 115.
 
81
Swedish Police Authority (2022) Opinion from the Swedish Police Authority concerning the EDPB’s Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, p. 5.
 
82
ECJ, Tele2 Sverige AB v Post- och telestyrelsen and Secretary of State for the Home Department v Tom Watson and Others, Joined Cases C-203/15 and C-698/15, ECLI:EU:C:2016:970, Judgment of 21 December 2016, para 106; ECJ, Digital Rights Ireland Ltd v Minister for Communications, Marine and Natural Resources and Others and Kärntner Landesregierung and Others, Joined Cases C-293/12 and C-594/12, Judgment of 8 April 2014, para 59.
 
83
European Data Protection Board, Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, 26 April 2023, p. 45.
 
84
ECJ, Proceedings brought by B, Case C-439/19, Judgment of 22 June 2021, para 110. There are some subtle differences in how the Court has expressed this requirement in different cases; here it was ‘just as effectively’, but in TK v Asociaţia de Proprietari bloc M5A-ScaraA [2019] ECJ Case C-708/18, which was the case the Court referenced as support for this interpretation, the Court expressed it as ‘cannot reasonably be as effectively achieved by other means less restrictive of the fundamental rights and freedoms of data subjects’. While unclear, the addition of ‘just as’ in Proceedings brought by B may indicate a stricter interpretation.
 
85
Datainspektionen (2019) Förhandssamråd om Polismyndighetens planerade användning av. programvara för ansiktsigenkänning mot signalementsregistret.
 
86
One explanation for this may be that the Swedish word for effectiveness is the same as for efficiency (‘effektivitet’). The translation of the requirement of comparable effectiveness in ECJ case law into Swedish may in other words blur the distinction the EDPS wants to maintain against efficiency.
 
87
European Data Protection Supervisor, Assessing the necessity of measures that limit the fundamental right to the protection of personal data: A Toolkit, 11 April 2017, p. 17.
 
88
Ibid., p. 16.
 
89
Ibid., p. 17.
 
90
Ibid., p. 17.
 
91
Barak (2012), p. 326.
 
92
ECJ, OT v Vyriausioji tarnybinės etikos komisija, Case C-184/20, Judgment of 1 August 2022.
 
93
Ibid., para 86.
 
94
Ibid., para 88.
 
95
Ibid., para 89.
 
96
Landström et al. (2020).
 
97
Kotsoglou and Oswald (2020).
 
98
The proposed guidelines are still in the public consultation phase.
 
99
Swedish Police Authority (2022) Opinion from the Swedish Police Authority concerning the EDPB’s Guidelines 05/2022 on the use of facial recognition technology in the area of law enforcement, pp. 4–5.
 
100
Ibid., p. 5.
 
101
Barak (2012), p. 326; Alexy (2002), p. 400.
 
102
The Swedish government initiated a public inquiry in May 2021 tasked with proposing new rules under which more people suspected of crimes must be able to be identified using fingerprints, DNA, facial images, or similar information about individual characteristics. The inquiry will deliver its proposal in June 2023. Justitiedepartementet, Biometri i brottsbekämpningen.
 
103
(2022) Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts—Preparation for Coreper, recitals 18 and 19.
 
References
Alexy R (2002) A theory of constitutional rights. Oxford University Press, Oxford
Barak A (2012) Proportionality: constitutional rights and their limitations
Brayne S (2021) Predict and surveil: data, discretion, and the future of policing. Oxford University Press, New York
Bygrave LA (2020) Article 22 automated individual decision-making, including profiling. In: The EU General Data Protection Regulation (GDPR). Oxford University Press
Ferguson AG (2017) The rise of big data policing: surveillance, race, and the future of law enforcement. NYU Press, New York
Harcourt BE (2007) Against prediction: profiling, policing, and punishing in an actuarial age. University of Chicago Press, Chicago
Hildebrandt M (2010) Proactive forensic profiling: proactive criminalization? In: Duff RA, Farmer L, Marshall SE et al (eds) The boundaries of the criminal law. Oxford University Press, Oxford
Joh EE (2016) The new surveillance discretion: automated suspicion, big data, and policing. Harv Law Policy Rev 10:15–42
Sajfert J, Quintel T (2017) Data Protection Directive (EU) 2016/680 for Police and Criminal Justice Authorities. SSRN Scholarly Paper ID 3285873. Accessed 10 February 2022
Schartum DW (2020) From legal sources to programming code: automatic individual decisions in public administration and computers under the rule of law. In: Barfield W (ed) The Cambridge handbook of the law of algorithms, 1st edn. Cambridge University Press, pp 301–336