Ninth Circuit Rules That Facebook’s Facial Recognition Could Violate Biometric Law

On August 9, 2019, the U.S. Court of Appeals for the Ninth Circuit unanimously upheld certification of a class of Illinois Facebook users alleging that the social media giant’s facial recognition feature called “tag suggestions” violates the Illinois Biometric Information Privacy Act (“BIPA”). The decision is an important one for its interpretation of the Illinois law, for its discussion of the ways in which new technology shapes privacy risks and harms, and for its conclusion that plaintiffs can establish standing in this case solely based on violation of BIPA’s procedural requirements, without having to demonstrate that information about them was used in some more broadly harmful way. 

In its discussion of the impact of new technology on individual privacy interests, the court pointed to the intersection between constitutionally protected privacy interests and the invasions of privacy that are actionable under common law theories of tort. Specifically, the court cited recent U.S. Supreme Court decisions noting that, in the Fourth Amendment context, “advances in technology can increase the potential for unreasonable intrusions into personal privacy.” The Ninth Circuit explained that “the facial-recognition technology at issue here can obtain information that is ‘detailed, encyclopedic, and effortlessly compiled,’ which would be almost impossible without such technology.” Not only does the court view this technology as powerful; it sees it as readily capable of inflicting harm on individuals if applicable regulations, like BIPA, are not followed. Further, the Ninth Circuit reads the Supreme Court’s decision in Carpenter v. United States as suggesting that courts should take into account potential future uses of a technology. Facebook’s “tag suggestions” feature could, the court notes, one day be used to identify individuals in surveillance photos taken by cameras on streets or in office buildings, or to unlock a person’s cell phone. Based on both present and potential future uses of facial recognition technology, the court writes, “We conclude that development of a face template using facial recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests,” and consequently creates a substantive basis for a lawsuit.

The Ninth Circuit rejected Facebook’s argument that the three plaintiffs, Illinois residents, lacked standing to sue because they did not suffer the concrete harm required under the Supreme Court’s 2016 decision in Spokeo, Inc. v. Robins. Writing for the court, Judge Sandra Ikuta “conclude[d] that the development of a face template using facial-recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests.” Judge Ikuta reasoned that the facial recognition feature “allows Facebook to create and use a face template and to retain this template for all time. Because the privacy right protected by BIPA is the right not to be subject to collection and use of such biometric data, Facebook’s alleged violation of these statutory requirements would necessarily violate the plaintiffs’ substantive privacy interests.” Judge Ikuta further wrote, “Once a face template of an individual is created, Facebook can use it to identify that individual in any of the other hundreds of millions of photos uploaded to Facebook each day, as well as determine when the individual was present at a specific location.” As data privacy litigation has increased over the past decade, the federal circuit courts have split over what types of harm plaintiffs must show to satisfy Spokeo’s injury-in-fact requirement for standing in federal court. This decision is consistent with previous Ninth Circuit cases and therefore does not resolve that split, but the court’s analysis is likely to prove influential as the circuit split continues to fuel expectations of an eventual Supreme Court resolution.

Turning to class certification, the Ninth Circuit rejected Facebook’s arguments that the Illinois law could not be applied to activities that took place outside of Illinois. Facebook unsuccessfully argued that 1) for BIPA’s protections to apply, each class member had to prove he or she uploaded photos in Illinois, and 2) because the facial data was scanned and stored on servers outside of Illinois, the company could not be sued for violating the state law. The Ninth Circuit rejected both arguments, concluding that whether the violation occurred when out-of-state Facebook servers scanned users’ faces or when users uploaded photos in Illinois is a question that can be decided on a class-wide basis. Given the cross-border nature of data use by social media platforms and other technology companies, the court’s reasoning could prove influential in future cases in which a company challenges the applicability of data breach or data privacy laws that legislatures passed specifically to protect the residents of their own states.

Filed in 2015, this case addresses the Illinois law that is arguably the strictest biometric data protection law in the United States. Codified at 740 ILCS 14 (Public Act 095-994), the law defines a biometric identifier as a “retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” It requires a private entity to obtain a written release before collecting biometric data and requires that the data be destroyed after a specified retention period. Several states have narrower biometric privacy laws; for example, some prohibit state agencies from using biometric data in connection with ID cards. Currently, only Illinois, Texas, and Washington have comprehensive biometric privacy laws in place, and California’s law is set to take effect in 2020. BIPA, however, is the only one of these laws that allows private individuals to sue for damages stemming from a violation, providing $1,000 for each negligent violation and $5,000 for each intentional or reckless violation. As noted above, this case and the ongoing litigation bear watching, as they may continue to influence future cases involving biometric information and other data privacy or data breach laws.

If you have any questions regarding an issue raised in this alert, please contact the authors or the attorney at Saul Ewing Arnstein & Lehr LLP with whom you are regularly in contact.