The EU AI Act, proposed by the European Commission in April 2021, aims to regulate the use of AI in the EU by protecting users from AI-related harm and prioritizing human rights. Using a risk-based approach, the EU AI Act imposes obligations that are proportional to the risk posed by an AI system or a general-purpose AI model.
After a lengthy consultation process that saw several amendments proposed, it passed in the European Parliament in June 2023, marking the start of a six-month Trilogue period. At the end of this process, a provisional agreement was reached in December 2023, before Coreper I (the Committee of Permanent Representatives) reached a political agreement in February 2024.
This landmark legislation is set to become the global gold standard for AI legislation and will have important implications for organizations both within and outside of the EU due to its extraterritorial scope.
The EU AI Act adopts a risk-based approach to regulation, categorizing AI systems into three main risk levels:

- Unacceptable risk: AI practices deemed incompatible with EU values, such as social scoring by public authorities, which are prohibited outright.
- High risk: AI systems that must meet strict requirements before they can be placed on the market or put into service.
- Minimal risk: AI systems that face no additional obligations under the Act.
Additionally, there are AI systems posing limited transparency risks, such as emotion recognition or deepfake generation, addressed under Article 52.
New categories introduced by the Council and Parliament drafts include foundation models and general-purpose AI systems, each subject to distinct requirements. In the latest version of the text, these provisions are combined under a new chapter dedicated to general-purpose AI models, along with a stricter regime for high-impact general-purpose AI models that may pose systemic risk.
The classification of AI systems remains a crucial step for enterprises to ensure compliance, particularly if their systems are deemed prohibited or high-risk under the EU AI Act.
To assist organizations in navigating the complexities of the AI Act, Holistic AI has developed the AI Act Readiness Assessment, which aims to identify the risk categories and obligations applicable to an organization's AI systems and prepare the organization for compliance.
The EU AI Act readiness assessment encompasses a structured series of steps and evaluations. Initially, it is essential to accurately identify AI systems, their risk categories, and the roles of the entities involved. This initial step is crucial, as it determines the subsequent mapping of technical requirements and obligations for operators.
The readiness assessment not only examines the nature and requirements of AI systems but also provides solutions that support the fulfillment of these requirements and obligations, either fully or partially. This ensures that entities are not just informed about their duties under the AI Act but are also equipped with the necessary resources and strategies to align with the regulatory framework.
The EU AI Act's internal conformity assessment procedure hinges on an examination of an organization's AI systems' alignment with mapped requirements and obligations. In this context, the readiness assessment is specifically designed to prepare organizations not only for internal evaluations but also for meeting the standards set by the EU AI Act's conformity assessment.
Given that achieving compliance can be more challenging and costly once AI systems are operational, early preparation is essential. Start your journey towards AI Act compliance today with our AI Act Readiness Assessment. Ensure your organization is prepared for the future of AI regulation in the European Union.
The Act applies to providers, deployers, distributors, and importers of AI systems that are placed on the market or put into service within the European Union. The level of preparedness required under the Act is different for each operator. For providers and deployers, the Act may also apply extraterritorially, meaning that providers and deployers of AI systems may need to prepare for the EU AI Act even if they are based outside the European Union.
The Act introduces separate risk-based classifications for AI systems and general-purpose AI (GPAI) models. There are three main risk levels for AI systems:

- Unacceptable risk: prohibited AI practices that may not be placed on the EU market at all.
- High risk: AI systems subject to strict design and operational requirements before market placement.
- Minimal risk: AI systems that carry no additional obligations under the Act.
There is a subset of AI systems that are associated with certain transparency obligations. These are commonly labeled as “limited risk” systems in practice. However, this is not an exclusive risk-level to the classification mentioned above. Both high-risk and minimal-risk AI systems may face these transparency obligations depending on their functions.
Regarding GPAI models, the Act classifies a subset of them as GPAI models with systemic risk, provided that these models have high-impact capabilities. It also sets forth that a GPAI model trained using a cumulative amount of computing power greater than 10^25 floating-point operations (FLOPs) shall be presumed to have high-impact capabilities.
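The systemic-risk presumption above is a simple numeric threshold. As a minimal sketch (the helper name and figures used as inputs are illustrative, not from the Act), it can be expressed as:

```python
# The Act's presumption: a GPAI model trained with a cumulative compute
# budget greater than 10^25 FLOPs is presumed to have high-impact
# capabilities, placing it in the systemic-risk regime.
SYSTEMIC_RISK_FLOP_THRESHOLD = 1e25

def presumed_high_impact(training_flops: float) -> bool:
    """Return True if training compute exceeds the Act's 10^25 FLOP threshold."""
    return training_flops > SYSTEMIC_RISK_FLOP_THRESHOLD

# Illustrative inputs only: a model trained with ~2.1e25 FLOPs crosses the
# threshold, while one trained with 5e24 FLOPs does not.
print(presumed_high_impact(2.1e25))  # True
print(presumed_high_impact(5e24))    # False
```

Note that the presumption is rebuttable in practice and the Commission may also designate models as systemic-risk on other grounds, so a threshold check like this is a starting point rather than a complete classification.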
There are seven key design-related requirements for high-risk AI systems under the EU AI Act (Articles 9–15):

- A risk management system
- Data and data governance
- Technical documentation
- Record-keeping
- Transparency and provision of information to deployers
- Human oversight
- Accuracy, robustness, and cybersecurity
Non-compliance with the provisions of the EU AI Act is sanctioned with hefty administrative fines, reaching up to €35 million or 7% of a company's global annual turnover (whichever is higher) for the most serious violations.
The Act is currently going through the final stages of the EU legislation-making procedure and is expected to be officially adopted in early 2024. Most of its provisions will start applying 24 months after the Act enters into force, with exceptions for certain provisions. The earliest application date belongs to the provisions on prohibited AI practices, which will apply starting six months after the entry into force of the Act.
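The staggered timeline above is just date arithmetic from the entry-into-force date. A minimal sketch, assuming a hypothetical entry-into-force date of 1 June 2024 (the actual date depends on official publication):

```python
from datetime import date

# Assumption for illustration only: the real entry-into-force date is set
# by publication in the Official Journal of the EU.
entry_into_force = date(2024, 6, 1)

def months_later(start: date, months: int) -> date:
    """Return the same day-of-month `months` months after `start`."""
    month_index = start.month - 1 + months
    return date(start.year + month_index // 12, month_index % 12 + 1, start.day)

# Prohibited-practice provisions apply 6 months after entry into force;
# most other provisions apply after 24 months.
print(months_later(entry_into_force, 6))   # prohibitions start applying
print(months_later(entry_into_force, 24))  # general application date
```

Under this assumed date, the prohibitions would bite before the end of 2024, well ahead of the general application date, which is why the article stresses starting preparations early.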
The AI Act is a comprehensive framework and getting ready for it is not possible overnight but will take time. The general application date for the Act may not seem close, but it must not be forgotten that some of the provisions of the Act are likely to start applying by the end of 2024. Additionally, getting ahead and starting preparations early is important for entities seeking to have a competitive advantage.
The Act introduces a set of design-related requirements for AI systems and obligations for covered entities. The classification of AI systems, identification of the respective obligations, and preparation for compliance all take time and resources. These could significantly affect the operations of AI developers, deployers, and users, or even require that some AI systems be taken out of operation.