The EU AI Act’s AI Literacy obligations explained

Article 4 of the EU AI Act requires that providers and deployers of AI systems ensure a “sufficient level of AI literacy” for their staff, as well as any other persons using and operating AI systems on their behalf. But what does AI literacy actually mean in practice, and what might your company’s obligations be?

Under the Act, AI literacy is defined as the “skills, knowledge and understanding that allow providers, deployers and affected persons … to make an informed deployment of AI systems, as well as to gain awareness about the opportunities and risks of AI and possible harm it can cause”. The Act does not require providers and deployers of AI systems to actively measure their employees’ level of knowledge; they need only ensure a “sufficient level” of AI literacy, taking into account the technical experience and training of their employees. Organisations will also need to adapt their AI literacy approach to reflect the level of risk involved in operating their systems, their position as provider or deployer, and whether their employees are fully informed on the legal and ethical aspects of AI.

This AI literacy obligation covers anyone who falls under the broad remit of an organisation and who deals directly with the use of AI systems. This includes both employees and third parties such as contractors, service providers, and clients.

The EU AI Office considers a certain degree of flexibility to be necessary in the interpretation of what a “sufficient level” of AI literacy entails. At a minimum, organisations should do the following to build their AI literacy programme:

  1. Ensure a general understanding of AI, including what AI is, how it works, how the organisation is using it, and what opportunities and risks it presents.

  2. Consider their role as either provider or deployer of AI systems, and whether they are developing AI systems or using those developed by another organisation.

  3. Consider the risks of the AI systems being provided or deployed, and what employees need to be aware of when using them.

  4. Build AI literacy actions on the preceding three points, considering differences in technical knowledge of staff and other persons, and the contexts in which the AI systems are to be used.

All of these points should cover the legal and ethical aspects of AI use, and the Act encourages linking AI literacy efforts to broader principles of AI ethics and governance.

Failure to comply with the obligations of Article 4 may result in penalties and other enforcement measures. The legal framework applies to public and private actors inside and outside the EU, as long as “the AI system is placed on the Union market, used in the Union or its use has an impact on people located in the EU”.

More information on the specifics of Article 4 can be found here: https://digital-strategy.ec.europa.eu/en/faqs/ai-literacy-questions-answers

At L-EV8, we are perfectly placed to provide board-level training and awareness for companies seeking to meet their AI literacy obligations. Building on the most up-to-date knowledge in the field, we can help ensure that your organisation stays one step ahead of the latest developments in AI law, while also effectively deploying AI technology to maintain competitive advantage in a rapidly changing world.

Reach out to us via the contact links at the top of the page to learn more about what we can do for you.
