Editorial: Biometric privacy laws must evolve with the times
Who should be held liable when a person’s biometric data — their facial features, voice or fingerprints — is misused?
Illinois is home to some of the strongest consumer privacy regulations in the country, including its rules governing the use of biometric data.
Perhaps you were one of the folks who got a check from Facebook after a class-action lawsuit alleging violations of Illinois’ 2008 Biometric Information Privacy Act. Facebook’s “Tag Suggestions” feature used facial recognition to scan users’ uploaded photos, creating “face templates” or biometric identifiers.
Plaintiffs said this happened without the required written consent or a publicly available retention and destruction policy. In February 2021, the case was resolved with a court-approved $650 million class-action settlement, which the judge described as a landmark in consumer privacy law.
Facebook isn’t the only company dealing with these lawsuits. A Chicago man sued Home Depot this month, alleging the retailer used facial recognition at self-checkouts without consent or required policies — a potential violation of BIPA.
One could argue that our biometric privacy rules are a vital tool for protecting personal privacy. Indeed, we are grateful for barriers to the widespread misuse of such sensitive information. On the other hand, one could also argue that Illinois' biometric data laws regulate a marketplace changing by the second, and that they're not keeping up. There's truth to that, too.
Take this question, for example:
Should digital infrastructure providers like data centers or cloud platforms be held liable when someone’s biometric data is misused?
Legal minds are actively debating this question. And many in the industry, including those who lobby on its behalf, worry that their networks could unfairly land in the legal crosshairs if the rules aren't updated.
Illinois' biometric rules apply to any "private entity" that collects, captures, purchases, receives through trade or otherwise obtains a person's biometric identifiers or information. That means if a company possesses or makes use of biometric data, it has duties such as obtaining informed written consent before collection and maintaining written, publicly available retention and destruction policies.
So far, most high-profile cases have focused on end-user firms such as employers, retailers and social-media platforms. But there’s no explicit carve-out in BIPA for data centers or cloud providers. If they merely store encrypted information, they can argue they’re not “collecting” or “using” biometrics. But if a provider offers biometric processing services — or fails to safeguard the data in its possession — plaintiffs could test the boundaries of liability.
It’s important to keep laws up to date with fast-moving technology.
Legislating is a slow process, but regulating something this big and fast-changing requires nimbleness. If data centers played, or could play, an active role in policing the use of biometrics, they would be fair game as potential defendants. But they don't.
The data-center industry has a solid argument here, we believe, that they shouldn’t be liable under BIPA for misuse of people’s likenesses or identities.
— Chicago Tribune