Article 26 Deployer Obligations
The operational playbook for deployers of high-risk AI systems. What you must do, who owns it, and what evidence you need.
The 8 Deployer Obligations
Article 26 sets out specific requirements for anyone deploying high-risk AI systems.
Use According to Instructions
Deployers must use high-risk AI systems in accordance with the instructions for use accompanying the system.
What to do:
- Obtain the provider's instructions for use
- Create internal SOPs based on these instructions
- Train operators on correct usage
- Document adherence to instructions
Human Oversight
Assign human oversight to persons who have the necessary competence, training, and authority.
What to do:
- Designate oversight owner for each system
- Document competence requirements
- Provide necessary training
- Grant authority to intervene and stop the system
Input Data Management
Where deployers control input data, ensure it is relevant and sufficiently representative.
What to do:
- Document what input data you control
- Establish data quality checks
- Monitor for bias and representativeness
- Keep records of data quality measures
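The representativeness check above can be sketched in code. This is a minimal illustration, not a compliance tool: it compares category shares in a batch of input data against expected population shares and flags large deviations. The category names, reference shares, and the 10% tolerance are all hypothetical placeholders; real acceptance criteria should come from your data governance process.

```python
from collections import Counter

def representativeness_report(values, reference_shares, tolerance=0.10):
    """Flag categories whose share in this batch deviates from the
    expected population share by more than `tolerance` (absolute)."""
    total = len(values)
    counts = Counter(values)
    flags = {}
    for category, expected in reference_shares.items():
        observed = counts.get(category, 0) / total
        if abs(observed - expected) > tolerance:
            flags[category] = {"observed": round(observed, 3),
                               "expected": expected}
    return flags

# Hypothetical batch that over-represents group "A"
batch = ["A"] * 80 + ["B"] * 20
flags = representativeness_report(batch, {"A": 0.6, "B": 0.4})
```

A report like this can be run on each input batch and archived as evidence of the data quality measures the obligation asks you to document.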
Monitoring
Monitor the operation of the high-risk AI system on the basis of the instructions for use.
What to do:
- Create monitoring plan per provider instructions
- Establish KPIs and thresholds
- Schedule regular monitoring reviews
- Document monitoring activities
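A threshold check against the KPIs in your monitoring plan can be as simple as the sketch below. The metric names and bounds here are assumptions for illustration; the real KPIs and thresholds should be taken from the provider's instructions for use.

```python
def check_kpis(metrics, thresholds):
    """Return KPIs that breach their configured bounds.
    `thresholds` maps metric name -> (min_allowed, max_allowed);
    use None for an unbounded side."""
    breaches = []
    for name, (lo, hi) in thresholds.items():
        value = metrics.get(name)
        if value is None:
            breaches.append((name, "missing"))
        elif lo is not None and value < lo:
            breaches.append((name, f"below minimum {lo}: {value}"))
        elif hi is not None and value > hi:
            breaches.append((name, f"above maximum {hi}: {value}"))
    return breaches

# Hypothetical KPIs: accuracy floor and a cap on human-override rate
thresholds = {
    "accuracy": (0.90, None),
    "override_rate": (None, 0.15),
}
breaches = check_kpis({"accuracy": 0.87, "override_rate": 0.05}, thresholds)
```

Running a check like this on a schedule, and keeping its output, doubles as the documented monitoring activity the obligation requires.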
Risk Reporting & Suspension
Inform the provider or distributor and the relevant market surveillance authority, and suspend use, if you have reason to consider the system presents a risk to health, safety, or fundamental rights.
What to do:
- Create escalation procedures
- Define suspension criteria
- Establish provider communication channels
- Document incidents and responses
Serious Incident Reporting
Report serious incidents to the provider and relevant market surveillance authorities.
What to do:
- Define what constitutes a serious incident
- Create incident reporting procedures
- Identify relevant authorities
- Keep incident records
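One way to make incident records consistent is a fixed schema. The sketch below is a hypothetical record structure, not a prescribed format; field names and the severity rule are illustrative assumptions.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class IncidentRecord:
    """Minimal incident record; fields are illustrative."""
    system_name: str
    description: str
    severity: str                 # e.g. "serious" per your own definition
    occurred_at: str              # ISO 8601 timestamp of the incident
    reported_to_provider: bool = False
    reported_to_authority: bool = False
    recorded_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def requires_authority_report(self) -> bool:
        # Assumed rule: "serious" incidents must go to the authority
        return self.severity == "serious"

rec = IncidentRecord(
    system_name="triage-model",          # hypothetical system
    description="Incorrect high-risk flag on a patient record",
    severity="serious",
    occurred_at="2026-09-01T10:00:00Z",
)
```

Structured records like this make it straightforward to show which incidents were escalated, to whom, and when.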
Log Retention
Keep logs automatically generated by the AI system under your control for at least six months, unless EU or national law requires a different period.
What to do:
- Identify what logs are generated
- Ensure 6-month minimum retention
- Establish access controls
- Enable log export capability
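The retention rule above can be enforced with a simple partition over log timestamps. This is a minimal sketch, assuming a flat list of timestamps and a ~6-month window approximated as 183 days; it does not account for legal holds or longer periods required by other law, so treat "past minimum" as "eligible for review", not "safe to delete".

```python
from datetime import datetime, timedelta, timezone

# Approximate 6-month minimum retention window (assumption: 183 days)
RETENTION = timedelta(days=183)

def retention_status(log_timestamps, now=None):
    """Partition log entries into those still inside the minimum
    retention window and those past it."""
    now = now or datetime.now(timezone.utc)
    must_keep, past_minimum = [], []
    for ts in log_timestamps:
        (must_keep if now - ts < RETENTION else past_minimum).append(ts)
    return must_keep, past_minimum

# Example with a fixed reference time
now = datetime(2026, 8, 1, tzinfo=timezone.utc)
logs = [now - timedelta(days=30), now - timedelta(days=200)]
must_keep, past_minimum = retention_status(logs, now=now)
```

Pairing a check like this with access controls and an export path covers the evidence trail for this obligation.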
Workplace Notification
If you're an employer using high-risk AI in the workplace, inform workers and their representatives.
What to do:
- Identify workplace AI uses
- Draft worker notification
- Communicate before deployment
- Document notification
Frequently Asked Questions
Who is a 'deployer' under the EU AI Act?
A deployer is any natural or legal person, public authority, agency, or other body using an AI system under its authority, except where the AI system is used in the course of a personal non-professional activity.
When do Article 26 obligations apply?
Article 26 obligations apply to deployers of high-risk AI systems. Most Article 26 obligations will apply from 2 August 2026, though you should prepare now.
What's the difference between provider and deployer obligations?
Providers (developers/manufacturers) have obligations around design, documentation, and conformity. Deployers (users) have obligations around use, oversight, monitoring, and incidents.
Do I need to do a FRIA as a deployer?
FRIA (Fundamental Rights Impact Assessment) is required for certain deployers—public bodies, private entities providing public services, and deployers of certain credit-scoring and life/health insurance pricing systems—before first deploying certain high-risk AI systems.
Automate Your Article 26 Compliance
Klarvo auto-generates deployer checklists and tracks evidence for each obligation.