Sweden’s Authority for Privacy Protection Levies Fine Against Swedish Police Authority for Improper Use of Facial Recognition Software

By Simonne Kapadia

In February 2021, the Swedish Authority for Privacy Protection (Integritetsskyddsmyndigheten or “IMY”) issued a fine of nearly $300,000 USD to the Swedish Police Authority for its use of facial recognition software known as Clearview AI.[1] IMY issued the hefty fine on the grounds that the Police Authority’s use of the software violated provisions of Sweden’s Criminal Data Act.[2] The Police Authority used the Clearview software for a relatively short period to assist in identifying perpetrators and victims of serious crimes before media reports prompted IMY to investigate whether the Police Authority’s use of the software was lawful.[3]

The Clearview AI software allows a user to upload an image of an unidentified person to the Clearview application.[4] The application, in turn, cross-references biometric markers in the uploaded image against those in publicly available images on the internet and returns to the user links to possible matches between the uploaded photo and the internet search results.[5] The Swedish Criminal Data Act defines “biometric data” as “[p]ersonal data relating to . . . physical, physiological or behavioral characteristics . . . which enables or confirms unique identification of persons.”[6] In addition to the recent IMY decision in Sweden, use of the Clearview software has raised privacy concerns in other jurisdictions, including Germany, the United Kingdom, Australia, and the United States.[7] Specific concerns about Clearview’s software include, among others, its susceptibility to abuse by individuals with access to it and the fact that its accuracy has not been independently verified.[8]

In deciding that the Police Authority violated the Swedish Criminal Data Act, IMY reasoned that, under the Criminal Data Act and other statutory authority, the Police Authority as a whole is responsible for data misuse committed by individual members of its staff.[9] Thus, although the Police Authority claimed that only a few of its members used the Clearview software improperly, the Police Authority remained wholly liable for those staff members’ misuse of data in the course of carrying out their investigative responsibilities.[10] Additionally, IMY reasoned that the Police Authority’s use of the Clearview software violated, inter alia, Chapter 2, § 12 of the Criminal Data Act, which provides that “[b]iometric . . . data may only be processed if it is specifically prescribed and it is absolutely necessary for the purpose of the processing”; Chapter 2, § 4 of the Act, which requires both (1) “a legal basis” for the use of personal data and (2) that the means of using the personal data be “necessary and proportionate” to the purpose of the use; and Chapter 3, § 7 of the Act, which requires, in relevant part, that “data controller[s]” conduct an “impact assessment” weighing the risks associated with implementing “a new type of [data] processing.”[11] IMY reasoned that, because the database Clearview uses to cross-reference uploaded images contains such a vast array of publicly available images, the processing was unlikely to satisfy the Act’s narrow necessity requirements, and that the Police Authority impermissibly failed to conduct an impact assessment before using the Clearview software.[12]

[1] Sweden: Privacy Protection Authority Finds Police Use of Facial Recognition Illegal, Issues Fines, Library of Congress (Apr. 6, 2021), https://loc.gov/law/foreign-news/article/sweden-privacy-protection-authority-finds-police-use-of-facial-recognition-illegal-issues-fines/ (stating that, in addition to the nearly $300,000 USD fine, the Police Authority must conduct internal training on data privacy, inform individuals whose images were uploaded to Clearview, and ensure that Clearview removes the information it gathered from the Police Authority) [hereinafter Swedish Privacy Protection Authority Issues Fines].

[2] Id.; see also Brottsdatalag [Criminal Data Act] (Svensk författningssamling [SFS] 2018:1177) (Swed.).

[3] Beslut efter tillsyn enligt brottsdatalagen – Polismyndighetens användning av Clearview AI [Decision After Supervision Under the Criminal Data Act – The Police Authority’s Use of Clearview AI], Integritetsskyddsmyndigheten [IMY] DI-2020-2719, at 1–2 (2021), https://perma.cc/JDC2-48A5 [hereinafter IMY Facial Recognition Decision] (noting that the Police Authority used the Clearview software only between fall 2019 and early March 2020).

[4] Id.

[5] Id.

[6] 1 ch. 6 § (SFS 2018:1177).

[7] Natasha Lomas, Sweden’s Data Watchdog Slaps Police for Unlawful Use of Clearview AI, TechCrunch (Feb. 12, 2021, 5:21 AM), https://techcrunch.com/2021/02/12/swedens-data-watchdog-slaps-police-for-unlawful-use-of-clearview-ai/ (advising that Germany and the United Kingdom have raised concerns about the Clearview software); Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, N.Y. Times (last updated Mar. 18, 2021), https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html (detailing concerns in the United States about the Clearview software).

[8] Hill, supra note 7.

[9] IMY Facial Recognition Decision, supra note 3, at 3.

[10] Id.

[11] Id.; 2 ch. 4, 12 §§ (SFS 2018:1177); 3 ch. 7 § (SFS 2018:1177).

[12] IMY Facial Recognition Decision, supra note 3, at 4–5.
