The EU AI regulation is coming. What does it mean for you and your business?
Artificial Intelligence (AI) is changing business like never before. It is being used across sectors, from personalised healthcare and autonomous vehicles to smart digital assistants. But as AI becomes more deeply integrated, concerns are growing about its impact on fundamental rights, safety, and ethical practices. In response, the European Union (EU) has introduced new rules to regulate the development and use of AI.
The EU AI Act, or the EU Artificial Intelligence Act, is the world's first major step toward creating clear regulations for AI.
Who is Impacted by the EU AI Act?
The AI Act affects all businesses operating within the EU, whether they are:
- developing (providers)
- using (deployers)
- importing
- distributing
- or manufacturing AI systems.
The new legislation holds all of these actors accountable: each must ensure that its AI practices comply with the requirements set out in the AI Act.
High-risk AI systems: compliance requirements and penalties
The AI Act adopts a risk-based approach, classifying AI systems into four risk categories, each with its own set of requirements. Depending on the category, an AI system is either banned outright or subject to lighter or stricter requirements.
- AI systems that endanger safety, rights, or privacy are considered an ‘unacceptable risk’ and are banned. Examples include AI for social scoring or for exploiting people’s vulnerabilities.
- AI systems used in essential areas such as health, safety, and fundamental rights are categorized as “high risk”. Examples include AI in medical devices, recruitment processes, or credit scoring. Companies working with high-risk AI must comply with stringent requirements on data governance, transparency, human oversight, and accuracy.
- AI systems with moderate risks, such as chatbots or biometric categorisation systems, are considered “limited risk” and require transparency. For instance, users must be informed when they are interacting with AI rather than a human.
- The “minimal risk” category comprises the majority of AI applications, such as AI in video games or simple automated processes. These are subject to minimal regulation due to their low risk.
The AI Act also includes specific additional rules that apply to Generative AI (GenAI) models. These rules are designed to address the unique risks and challenges posed by GenAI systems regarding transparency, risk management, safety, data governance, bias mitigation, and technical documentation.
How businesses can prepare for AI Act compliance
What if your company uses or offers AI solutions? Then you should prepare for compliance.
Companies should first assess whether they have AI systems, whether already in use, in development, or procured from third-party providers. If so, list the identified AI systems in a repository; based on this repository, the AI systems can then be classified by risk category.
For high-risk systems, this involves meeting strict requirements around risk management, data handling, transparency, and human oversight. Companies must also maintain technical documentation. As a result, new industry standards for AI are being developed to guide compliance.
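By way of illustration only, the inventory-and-classify step could be sketched as a simple repository structure. The four category names follow the Act; the example systems, their classifications, and the field names are hypothetical and would in practice come from a proper legal assessment:

```python
from dataclasses import dataclass
from enum import Enum

class RiskCategory(Enum):
    UNACCEPTABLE = "unacceptable"  # banned (e.g. social scoring)
    HIGH = "high"                  # strict compliance requirements
    LIMITED = "limited"            # transparency obligations
    MINIMAL = "minimal"            # largely unregulated

@dataclass
class AISystem:
    name: str
    purpose: str
    category: RiskCategory

# A minimal inventory ("repository") of AI systems in use, in
# development, or procured from third parties (hypothetical examples).
repository: list[AISystem] = [
    AISystem("CV screener", "recruitment", RiskCategory.HIGH),
    AISystem("Support chatbot", "customer service", RiskCategory.LIMITED),
    AISystem("Spam filter", "email", RiskCategory.MINIMAL),
]

def by_category(systems: list[AISystem]) -> dict[RiskCategory, list[str]]:
    """Group the inventory by risk category to see where compliance work is needed."""
    grouped: dict[RiskCategory, list[str]] = {}
    for s in systems:
        grouped.setdefault(s.category, []).append(s.name)
    return grouped

print(by_category(repository))
```

The point of such a grouping is simply to make visible, per risk category, which systems trigger which obligations, so compliance effort can be planned where it is actually required.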
Key deadlines for the EU AI Act
The AI Act entered into force on August 1, 2024, and will be rolled out step by step. These are the key dates:
- February 2nd 2025:
- AI systems posing an unacceptable risk are prohibited.
- A general AI-literacy requirement applies: employees who handle AI systems must have sufficient knowledge about them.
- August 2nd 2025: Rules for GenAI systems apply
- August 2nd 2026: High-risk AI systems must be compliant
- August 2nd 2027: AI systems embedded in regulated products (so-called “NLF products”, such as medical devices, machinery, elevators and toys) must be compliant.
EU AI Act penalties: What non-compliance could cost you
Not complying with the AI Act can result in significant fines, ranging from €7.500.000 to €35.000.000 or from 1% to 7% of the company's global annual turnover, whichever is higher. It is therefore crucial for companies to fully understand the provisions of the AI Act and comply with its requirements to avoid such sanctions.
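To make the "fixed amount or percentage of turnover" mechanism concrete, here is a small illustrative calculation. It assumes that, for a given fine tier, the upper bound is the higher of the fixed cap and the turnover percentage; the turnover figure is invented for the example:

```python
def max_fine(turnover_eur: float, cap_eur: float, pct: float) -> float:
    """Upper bound of a fine tier: the higher of the fixed cap
    or the given percentage of global annual turnover."""
    return max(cap_eur, pct * turnover_eur)

# Hypothetical example for the top tier (€35.000.000 or 7% of turnover):
# for a company with €1 billion global turnover, 7% is €70.000.000,
# which exceeds the €35.000.000 fixed cap.
print(max_fine(1_000_000_000, 35_000_000, 0.07))  # 70000000.0
```

In other words, for large companies the percentage of turnover, not the fixed amount, typically determines the maximum exposure.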
The rules will continue to evolve as new AI technologies and risks emerge. Companies need to monitor these developments and keep their AI systems up to date to remain compliant.
Questions?
The Sirris Patent Cell, founded with the support of FOD Economy, is your contact point for all your questions related to this issue. Our experts Benoit Olbrechts and Katrien Meuwis are just one click away.