EU vs. US - Comparison on AI legislation
This article compares the White House's Executive Order (EO) on AI and the EU's AI Act. Both policies prioritize AI testing, monitoring, and privacy protection but differ in their approach: the EU AI Act is more comprehensive and risk-based and regulates high-risk use cases, while the US EO is more flexible and sector-specific and addresses broader political dimensions. Understanding both approaches is essential for effective privacy compliance and AI governance programs as businesses navigate overlapping compliance efforts globally.
EU - US Comparison on Artificial Intelligence Legislation
The EO proactively addresses the potential misuse of AI models by international malicious entities, a concern less emphasized in the AI Act. It directs government agencies to establish AI testbeds, influences industry standards through federal procurement, and touches on broader political dimensions, such as immigration and education, that are not explicitly covered by the AI Act.
Debates surround the AI Act's disclosure requirement for materials used in AI system training, while the EO seeks to clarify patent and copyright law boundaries concerning AI-supported creations. The EO uniquely addresses risks associated with using AI to design biological materials.
In summary, businesses operating globally may face overlapping compliance efforts under both the AI Act and the EO. While the EU leans toward formal demonstration of compliance through supporting documentation, the U.S. approach may, in some cases, require alignment with industry standards.
Understanding the common ground between these regimes is crucial for developing a global compliance strategy. The issuance of the EO on the same day as the G7's approval of the AI Code of Conduct suggests shared principles, and ongoing efforts toward harmonization across standards should be closely monitored.
Overall, the EU AI Act is more comprehensive and prescriptive than the US Executive Order. It takes a risk-based approach, prohibiting certain practices and identifying "high-risk" use cases. It also regulates foundation models and includes specific requirements for testing, monitoring, and privacy.
The US Executive Order, on the other hand, is more flexible and sector-specific. It empowers executive departments to establish standards and encourages industry standards and guidelines. It also addresses some broader political dimensions, such as immigration and education.
Businesses operating globally may therefore face overlapping compliance efforts under both the EU Proposed AI Act and the US Executive Order. The table below summarizes the key differences between these regimes and the OECD Guidelines.
Feature | US Executive Order | EU Proposed AI Act | OECD Guidelines |
---|---|---|---|
Scope | Sector-specific, empowering executive departments to establish standards | Comprehensive, EU-wide regulation applicable directly to the private sector | High-level guidance for governments, businesses, and AI researchers |
Approach | Emphasizes industry standards and guidelines | Enforces binding regulations with associated fines | Encourages voluntary compliance and industry self-regulation |
Risk-based approach | No | Yes, prohibits certain practices and identifies "high-risk" use cases | Not explicitly stated, but a risk-based approach is implied |
Foundation models | Not explicitly regulated | Regulated | Not explicitly addressed, but the OECD recognizes the importance of foundation models |
Testing and monitoring | Emphasized | Emphasized | Encouraged |
Privacy | Necessitates formulation of a relevant privacy regime in the absence of nationwide legislation | Relies on the GDPR framework for privacy protection | Emphasizes the importance of data protection and privacy |
Cybersecurity | Underscored | Underscored | Acknowledges the importance of cybersecurity but does not provide specific guidance |
International malicious entities | Proactively addressed | Less emphasized | Encourages international cooperation to address the threat of malicious AI use |
Government AI testbeds | Yes | No | Encourages governments to establish AI testbeds |
Industry standards | Influences through federal procurement | No | Encourages industry standards and self-regulation |
Immigration and education | Touched upon | Not explicitly covered | Does not address immigration or education |
Disclosure requirement for training materials | Not explicitly required | Debated | Encourages transparency in training materials |
Patent and copyright law | Seeks to clarify boundaries | No | Does not address patent and copyright law |
Risks associated with using AI to design biological materials | Uniquely addressed | No | Acknowledges the potential risks of using AI to design biological materials |
Overall | More flexible and sector-specific | More comprehensive and prescriptive | High-level guidance |