The EU Artificial Intelligence Act (AI Act) provides a legal framework for AI developers, deployers, and importers. It officially came into force on 1 August 2024 and adopts a risk-based approach, meaning different obligations apply depending on the level of risk involved.
The EU’s approach to AI regulation ensures that higher-risk AI applications, particularly those that can significantly impact fundamental rights, are either prohibited or subjected to stricter requirements and oversight. High-risk AI systems include those using biometrics and those used in critical infrastructure, education, employment and access to self-employment, essential private and public services, law enforcement, migration, asylum and border control, the administration of justice, and democratic processes. AI systems that are safety components of products, or are themselves products, covered by EU product safety legislation are also considered high-risk.
Although the AI Act will be implemented gradually, all provisions relating to high-risk AI systems will become enforceable by August 2027.
Join us for the second webinar in our EU AI Act series for a detailed exploration of high-risk AI systems, where we will highlight the key legal considerations you need to be aware of when working with these systems.
Whether you are an AI developer, deployer, distributor, or importer, this webinar is essential for understanding your obligations in relation to high-risk AI systems under the AI Act.
In this webinar, we will cover the following topics:
This webinar will be streamed on LinkedIn Live. By clicking the RSVP button below, you will be redirected to the LinkedIn event page. Please select "Attend" to confirm your registration. The live stream can be viewed in the feed on the event page.
If you do not have a LinkedIn account or do not wish to watch on LinkedIn, please add the event to your calendar here; the calendar entry includes the link to join via Streamyard. Please contact us with any queries regarding this process.
If you missed the first webinar in our series covering prohibited practices and compliance strategies, you can watch the recording here.