11.01.2022 | Updates

After more than two years of litigation, Amazon and Microsoft won summary judgment in two class action lawsuits asserting violations of the Illinois Biometric Information Privacy Act (BIPA): Vance v. Amazon.com, Inc., Case No. C20-1084JLR and Vance v. Microsoft Corp., Case No. C20-1082JLR. Both decisions were issued by Judge James L. Robart of the U.S. District Court for the Western District of Washington on October 17, 2022.

The plaintiffs in both cases, Steve Vance and Tim Janecyk, claimed that they uploaded photographs to the photo-sharing website Flickr between 2004 and 2014, and that some of those photographs were included in the “Diversity in Faces” data set developed by IBM researchers. As its name suggests, the data set was intended to help researchers reduce racial and other biases in face recognition technologies. It included both photos and “facial coding schemes,” created and implemented by IBM researchers, which measured aspects of the facial features of the individuals appearing in those photos.

Consistent with the purpose of the data set, IBM made the Diversity in Faces data set available to others, free of charge, provided they agreed to use it solely for noncommercial research purposes and agreed not to use it to identify the people appearing in the data set’s photographs.

In granting summary judgment, the court found that researchers working with Amazon and Microsoft had accessed the Diversity in Faces data set but determined that it was not suitable for their respective research goals, for example because the photos were not from a demographically balanced data set or because they were not taken from the correct angle. Plaintiffs Vance and Janecyk nonetheless claimed that, because they were Illinois residents, Amazon and Microsoft were required to provide BIPA-compliant notice and obtain BIPA-compliant consent before obtaining their biometric data through the Diversity in Faces data set. The plaintiffs also alleged that Amazon and Microsoft impermissibly profited from their biometric data in violation of BIPA section 15(c). (The profit claim against Microsoft was dismissed on the pleadings in 2021. See Vance v. Microsoft Corp., 534 F. Supp. 3d 1301, 1309 (W.D. Wash. 2021).) Finally, the plaintiffs asserted a claim for unjust enrichment in both cases.

In May 2022, Amazon and Microsoft moved for summary judgment, arguing that the BIPA claims should be dismissed because (1) BIPA does not apply extraterritorially and all of the relevant conduct occurred outside Illinois, (2) applying BIPA as the plaintiffs sought to do would violate the dormant Commerce Clause of the U.S. Constitution, and (3) BIPA should not be construed to require (impossible) notice to, and consent from, the unidentified individuals in the Diversity in Faces data set. Amazon and Microsoft also argued that (4) the unjust enrichment claim should be dismissed because the plaintiffs failed to prove that either company unjustly retained a benefit to the plaintiffs’ detriment. Judge Robart granted the motions and dismissed all claims against both Amazon and Microsoft based on the first and fourth arguments.

In dismissing the BIPA claim on the extraterritoriality defense, Judge Robart (like every other court to have considered the issue) held that BIPA does not apply extraterritorially; rather, the law applies only if the relevant activities “occurred primarily and substantially in Illinois.” The plaintiffs had argued that this test was satisfied because they were Illinois residents, they suffered injury in Illinois, their photos were taken in Illinois, and their photos were uploaded to the internet in Illinois. The court held that these allegations failed to identify any conduct by Amazon or Microsoft that took place primarily and substantially in Illinois. The court also noted that even if the plaintiffs’ photos were originally uploaded in Illinois, they were uploaded to the third-party Flickr platform, and the data set was developed by third-party researchers at IBM. And although Amazon and Microsoft might have later received the data set, the plaintiffs failed to identify any evidence that the defendants’ agents downloaded, reviewed, or evaluated the data set in Illinois. Rather, the evidence showed that any download, review, or evaluation would have taken place where the companies’ respective employees or contractors were based: in Washington and Georgia for Amazon, and in Washington and New York for Microsoft.

In the Microsoft case, discovery also suggested that “encrypted chunks” of data bits from the Diversity in Faces data set might have been stored on servers in Chicago, Illinois, as well as other locations in Texas, Washington, and/or California. But Judge Robart found this possibility irrelevant because “as Microsoft points out, even if Plaintiffs could prove that Microsoft stored the [Diversity in Faces] Dataset in a datacenter in Illinois, the relevant section of BIPA regulates only the acquisition of data, rather than the encrypted storage of data after it is acquired.” Because plaintiffs’ only remaining claim asserted a violation of BIPA section 15(b), which requires notice and consent before acquiring biometric data, and because Microsoft did not obtain the biometric data in Illinois or profit from it there, the possibility that Microsoft might have stored data in Illinois was irrelevant.

Judge Robart also dismissed plaintiffs’ claim for unjust enrichment, finding that neither Amazon nor Microsoft used the Diversity in Faces data set in their businesses, so they could not have unjustly retained a benefit to the plaintiffs’ detriment.

© 2022 Perkins Coie LLP


 
