The Illinois Biometric Information Privacy Act (BIPA) as It Compares to the EU AI Act and Other U.S. Federal and State Legislation Governing the Use of Biometric Data

The regulation of biometric data has become increasingly critical as artificial intelligence (AI) and biometric technologies continue to advance. The U.S. and the European Union (EU) have each developed frameworks to govern the collection and use of biometric data, with varying approaches, particularly concerning real-time versus non-real-time processing of such data, the applicability and the means of enforcement.


1. Scope and Applicability

Illinois Biometric Information Privacy Act (BIPA):

Scope: In 2008, Illinois became the first state to enact a biometric data privacy law. BIPA specifically governs the collection, storage, and handling of biometric identifiers, including fingerprints, voiceprints, retina or iris scans, and scans of hand or face geometry. Notably, BIPA does not differentiate between real-time and non-real-time processing of biometric data. Whether biometric data is captured and used immediately (real-time) or stored for later use (non-real-time), the same stringent requirements apply. Companies must obtain informed written consent, explain the purpose of data collection, and detail retention and destruction policies.

Applicability: BIPA applies to private entities that collect or handle biometric data of Illinois residents, regardless of where the company is based. The law's extraterritorial application means that even companies outside Illinois, or the U.S., could be subject to BIPA if they engage with Illinois residents' biometric data. Exceptions exist for government entities, financial institutions, and public schools, which are governed by other regulations.

Exemptions and Other Applicable Legislation: BIPA’s stringent requirements apply solely to private entities and exclude public entities, financial institutions, and public schools. These sectors are either regulated by other specific laws or are not covered under BIPA. For example:

  • Public Entities: While BIPA does not govern public entities, government collection of biometric data is constrained by the Fourth Amendment, which guards against unreasonable searches and seizures. Additionally, President Biden’s Executive Order on AI, issued in 2023, plays a critical role in shaping the U.S. approach to AI governance, including biometric data regulation. The Executive Order emphasizes principles of privacy protection, civil rights, and the mitigation of bias in AI systems. It calls for the development of standards and frameworks that ensure responsible AI use across both public and private sectors. The implementation of this Executive Order has led to federal agencies, such as the National Institute of Standards and Technology (NIST), creating guidelines that complement BIPA and other state laws. These guidelines promote fairness, accountability, and transparency in AI systems, affecting how biometric data is managed, regardless of when it is processed.
  • Financial Institutions: These are often regulated under the Gramm-Leach-Bliley Act (GLBA), which mandates the protection of personal data, including potentially biometric data, within the financial sector.
  • Other U.S. Legislation: At the federal level, there is no comprehensive legislation equivalent to BIPA. However, sector-specific laws such as the Health Insurance Portability and Accountability Act (HIPAA) may apply if biometric data is used in healthcare settings. Additionally, the emerging American Data Privacy Protection Act (ADPPA) could introduce more generalized data privacy rules across various sectors.

Which other U.S. States currently have biometric privacy laws?

  • Texas and Washington also have broad biometric privacy laws on the books, but neither creates a private right of action like BIPA does. In addition, California, Colorado, Connecticut, Utah, Rhode Island, and Virginia have passed comprehensive consumer privacy laws that, once in full effect, will expressly govern the processing of biometric information; Vermont's legislature passed a similar law, although it was recently vetoed by the Governor. And even more states have enacted data breach notification laws that explicitly include biometric data within their scope.
  • Various municipalities, such as New York City and Portland, Ore., have also passed tailored biometric privacy measures. New York City’s Biometric Information Privacy Law, applicable to certain commercial establishments, provides a private right of action.
  • As more states continue to introduce legislation similar to BIPA, insurers have begun expressly excluding biometric liability coverage from their policies, further adding to the risks posed by noncompliance with biometric privacy laws.

EU Artificial Intelligence (AI) Act Compared

Scope: The EU AI Act regulates AI systems, particularly those classified as "high-risk," including biometric data systems such as facial recognition. The Act differentiates between real-time biometric identification, like live facial recognition in public spaces, and non-real-time processing, imposing stricter controls on the former.

Applicability: The AI Act applies to AI system providers, users, and distributors within the EU, as well as those outside the EU if their systems affect people within the EU. This regulation extends to both government and private sector applications, with distinct rules depending on the context of use.

2. Consent and Transparency Requirements

BIPA:

Consent: BIPA mandates explicit, informed written consent before collecting or using biometric data. Individuals must be informed of the purpose and duration of the data use. This requirement is the same regardless of whether the biometric data is collected in real-time or non-real-time.

Transparency: Companies must publicly disclose their policies on data retention and destruction. Biometric data must be destroyed once the initial purpose is fulfilled or within three years of the last interaction with the individual.

EU AI Act:

Consent: The AI Act does not have a specific consent requirement for biometric data, relying instead on the GDPR's consent framework. However, for "high-risk" AI systems, including those processing biometric data, there are additional requirements for transparency, accountability, and risk management, especially for real-time systems.

Transparency: High-risk AI systems must be transparent in their operation. For real-time biometric systems like facial recognition in public, stringent measures ensure users understand how and why their data is being processed.

3. Data Protection and Security Requirements

BIPA:

Data Protection: While BIPA does not prescribe specific technical measures, it requires that biometric data be safeguarded in line with industry standards. This includes both real-time and non-real-time data processing.

Security Requirements: Although BIPA is not explicit about security protocols, the law implies a need for robust protection against unauthorized access. Companies must implement reasonable safeguards that align with industry practices.

EU AI Act:

Data Protection: The AI Act demands high standards for data quality, accuracy, and security, particularly for high-risk systems. This includes stringent requirements for real-time biometric systems, which are subject to comprehensive testing and documentation.

Security Requirements: High-risk AI systems, especially those handling biometric data, must undergo rigorous testing and include human oversight mechanisms to prevent misuse.

Self-Certification: Companies can choose to self-certify adherence to international security standards such as ISO/IEC 27001. The final draft of an international standard governing data quality for AI, ISO/IEC FDIS 5259-2 (Artificial intelligence – Data quality for analytics and machine learning (ML) – Part 2: Data quality measures), is currently at the approval stage and should become available in the near future.

4. Risks for Non-Compliance

BIPA: Non-compliance with BIPA can lead to severe financial penalties. Individuals have the right to sue organizations for violations, leading to statutory damages of $1,000 per violation for negligent breaches and $5,000 per violation for intentional or reckless breaches. Several prominent class-action lawsuits asserting violations of BIPA have already established legal precedent and demonstrate the risks of non-compliance:

First, in 2019, the Illinois Supreme Court in Rosenbach v. Six Flags Entertainment Corp., held that a plaintiff can be considered an “aggrieved person” under the statute and “be entitled to liquidated damages and injunctive relief” without alleging an actual injury. Then, in May 2020, the U.S. Court of Appeals for the Seventh Circuit in Bryant v. Compass Group USA, Inc. clarified that such a person has suffered an injury-in-fact sufficient to support standing under BIPA Section 15(b).

In 2020, the Facebook BIPA class action lawsuit Patel v. Facebook Inc. reached a conclusion when Facebook agreed to a $650 million settlement, one of the largest consumer privacy settlements in U.S. history, to resolve claims it collected user biometric data without consent.

In October 2022, the first jury verdict in a BIPA class action lawsuit was handed down in Rogers v. BNSF Railway Company. Although the defending company announced its plans to appeal the decision of the District Court for the Northern District of Illinois, the plaintiffs’ success at the trial level may encourage individuals to pursue their own BIPA claims.

In February 2023, the Illinois Supreme Court held in Cothron v. White-Castle Systems, Inc., that a separate claim accrues under BIPA each time a private entity scans or transmits a person’s biometric identifier or information in violation of the law. The court also observed that BIPA damages are discretionary and not mandatory. Earlier the same month, the court ruled in Tims v. Black Horse Carriers, Inc., that a five-year limitations period applies to all claims arising under BIPA.
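The practical consequence of this per-scan accrual can be illustrated with back-of-the-envelope arithmetic. The sketch below is hypothetical, using the statutory amounts quoted above; as the court observed, damages are discretionary, so actual awards may be far lower:

```python
# Statutory amounts under BIPA: $1,000 per negligent violation,
# $5,000 per intentional or reckless violation.
NEGLIGENT = 1_000
INTENTIONAL = 5_000

def max_exposure(individuals: int, scans_each: int, intentional: bool = False) -> int:
    """Upper-bound exposure if every scan accrued as a separate violation."""
    per_violation = INTENTIONAL if intentional else NEGLIGENT
    return individuals * scans_each * per_violation

# e.g. 500 employees clocking in and out once a day over 250 working days
print(max_exposure(500, 2 * 250))  # → 250000000, i.e. $250 million
```

Even at the negligent-violation rate, per-scan accrual turns a routine fingerprint time clock into nine-figure theoretical exposure, which explains the settlement pressure these class actions generate.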

EU AI Act: The AI Act imposes hefty fines for non-compliance, with penalties up to €35 million or 7% of worldwide annual turnover, whichever is higher, for the most severe violations, such as the use of prohibited AI practices. Lower fine tiers apply to less severe breaches. The law entered into force on 1 August 2024, and the first provisions become mandatory for companies from 2 February 2025. Further obligations phase in over the subsequent months until the entire law, with the exception of certain high-risk AI systems, is applicable from 2 August 2026, with full enforceability of all provisions in August 2027.
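The "whichever is higher" mechanism is simply the maximum of a fixed amount and a share of worldwide annual turnover. The sketch below is a minimal, hypothetical illustration (integer percentages keep the arithmetic exact; the example values are the top-tier caps under the final text of the Act, and the applicable tier depends on the infringement category):

```python
def penalty_cap(fixed_eur: int, pct: int, worldwide_turnover_eur: int) -> int:
    """Maximum fine: the higher of a fixed amount and pct% of turnover."""
    return max(fixed_eur, worldwide_turnover_eur * pct // 100)

# A large company with €2 billion worldwide turnover: 7% exceeds the fixed cap.
print(penalty_cap(35_000_000, 7, 2_000_000_000))  # → 140000000
# A smaller company with €100 million turnover falls back to the fixed amount.
print(penalty_cap(35_000_000, 7, 100_000_000))    # → 35000000
```

The turnover-based alternative means the cap scales with company size, so large multinationals cannot treat the fixed amount as a predictable cost of doing business.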

5. Enforcement and Litigation

BIPA Enforcement: BIPA is primarily enforced through private litigation, which has been a major driver of compliance. The potential for class-action lawsuits creates a significant incentive for companies to adhere to the law.

EU AI Act Enforcement: The AI Act will be enforced by national authorities in EU member states, with the European Commission overseeing cross-border compliance via the AI Office. Unlike BIPA, the AI Act currently does not provide a private right of action, although GDPR provisions may allow some private claims related to privacy violations. Class-action lawsuits for EU citizens are currently governed by EU member state law and vary accordingly. However, the European Parliament and the Council have already introduced a proposed AI Liability Directive, which aims to harmonize the non-contractual civil liability rules applicable to artificial intelligence without affecting existing remedies for private claims.


6. Conclusion

The regulatory landscape for biometric data in the U.S. and EU reflects differing approaches to AI and data privacy. BIPA provides strong consumer protections without distinguishing between real-time and non-real-time data processing, relying heavily on private litigation for enforcement. The EU AI Act, by contrast, imposes stricter controls on real-time biometric systems due to their immediate impact on privacy, with enforcement primarily through regulatory bodies. Biden’s Executive Order on AI reinforces these efforts, promoting comprehensive AI governance that protects biometric data in all contexts. Together, these laws and regulations represent a robust framework for managing the risks and challenges associated with biometric data in an increasingly AI-driven world.
