AI Act Article 73 Explained: Incident Reporting Requirements for US and EU Companies
What are the incident reporting obligations under EU AI Act Article 73? Learn who must report, what to report, and how to comply. Avoid fines of up to €15 million or 3% of global annual turnover.
Last updated: May 17, 2026 · Reading time: 10 minutes
Your AI System Just Caused an Incident. Now What?
The EU AI Act is not only about preventing risks. It is also about responding to them. Article 73 introduces strict reporting obligations for serious incidents involving high-risk AI systems. These obligations fall primarily on providers; deployers who identify a serious incident must immediately inform the provider.
A serious incident is not just any glitch. Under Article 3(49), it is an incident or malfunction of an AI system that directly or indirectly leads to the death of a person or serious harm to their health, a serious and irreversible disruption of critical infrastructure, an infringement of obligations under Union law intended to protect fundamental rights, or serious harm to property or the environment.
For US companies this is critical. If your high-risk AI system is used in the EU and a serious incident occurs, it must be reported. Failure to comply can lead to fines of up to €15 million or 3% of global annual turnover, whichever is higher.
This guide explains what constitutes a serious incident, who must report, how the reporting process works, and how to avoid the pitfalls of non-compliance.
What Counts as a Serious Incident Under Article 73
Not every issue with an AI system triggers a reporting duty. Article 73 applies to serious incidents. Article 3(49) defines a serious incident as an incident or malfunction of an AI system that directly or indirectly leads to the death of a person or serious harm to a person's health, a serious and irreversible disruption of the management or operation of critical infrastructure, an infringement of obligations under Union law intended to protect fundamental rights, or serious harm to property or the environment.
Here are some examples.
A medical diagnosis AI misdiagnoses a patient, and the resulting delay in treatment seriously harms the patient's health. This is a serious incident.
An HR AI system discriminates against a group of job applicants on the basis of gender, infringing their fundamental rights. This is a serious incident.
A credit scoring AI denies loans to a specific demographic because of biased data, again an infringement of fundamental rights. This is a serious incident.
A chatbot used in a high-risk context gives harmful advice that causes serious financial loss. This can qualify as serious harm to property.
A predictive policing AI leads to wrongful arrests based on flawed predictions. This is a serious incident.
Who Must Report Incidents
The reporting obligation under Article 73 falls primarily on providers of high-risk AI systems. Deployers have a complementary duty: under Article 26(5), a deployer that identifies a serious incident must immediately inform the provider, and if the provider cannot be reached, the Article 73 reporting obligation applies to the deployer directly.
Providers are the companies that develop or supply the AI system and place it on the market under their own name.
Deployers are the companies that use the AI system under their own authority in their operations.
For US companies this means that if you provide or deploy a high-risk AI system in the EU, you have reporting obligations.
Here is a breakdown.
| Role | Responsibility | Example (US) | Example (EU) |
|---|---|---|---|
| Provider | Report serious incidents to the market surveillance authority of the Member State where the incident occurred | A US-based AI developer selling into the EU | A French AI startup |
| Deployer | Immediately inform the provider of any serious incident; report directly only if the provider cannot be reached | A US company using AI in its EU operations | A German bank using an AI credit scoring system |
What Must Be Reported
When a serious incident occurs, you must report specific information. This information helps regulators understand the incident, assess the risk, and take appropriate action.
Here is what must be included in the report.
Mandatory Information
1. Description of the incident: what happened, when it happened, and where it happened.
2. AI system involved: the name, type, and version of the AI system, and its intended purpose.
3. Impact of the incident: the nature and severity of the harm caused and the number of people affected.
4. Actions taken: the steps taken to mitigate the harm, to prevent recurrence, and to inform affected individuals.
5. Contact information: the name and contact details of the person reporting the incident and of the provider or deployer.
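In practice, it helps to collect the mandatory fields above in a structured template before filing, so nothing is missing at the deadline. The sketch below is purely illustrative: the field names are our own shorthand, not an official EU reporting schema.

```python
from dataclasses import dataclass, field, asdict

# Illustrative template for collecting Article 73 report data.
# Field names are our own shorthand; this is NOT an official EU schema.
@dataclass
class IncidentReport:
    description: str                 # what happened, when, and where
    system_name: str                 # name of the AI system
    system_version: str
    intended_purpose: str
    harm_nature: str                 # e.g. "serious harm to health"
    harm_severity: str
    people_affected: int
    mitigation_steps: list[str] = field(default_factory=list)
    reporter_contact: str = ""

    def missing_fields(self) -> list[str]:
        """Return the names of mandatory fields that are still empty."""
        return [k for k, v in asdict(self).items() if v in ("", 0, [])]

report = IncidentReport(
    description="Misdiagnosis on 2026-05-12 at a Paris hospital",
    system_name="DiagAssist", system_version="2.3",
    intended_purpose="Radiology triage support",
    harm_nature="serious harm to health", harm_severity="high",
    people_affected=1,
)
print(report.missing_fields())  # fields still to complete before filing
```

A checklist like `missing_fields()` makes it easy to spot gaps before submission, which matters because incomplete or misleading reports can themselves attract penalties.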
When Must Incidents Be Reported
Timing is crucial. Article 73 sets strict deadlines, counted from the moment the provider (or, where applicable, the deployer) becomes aware of the serious incident. The report must be made immediately after a causal link between the AI system and the incident is established or reasonably suspected, and at the latest:
Within 15 days for serious incidents in general.
Within 10 days in the event of the death of a person.
Within 2 days in the event of a widespread infringement or a serious and irreversible disruption of critical infrastructure.
If needed to report on time, an initial incomplete report may be submitted and completed later.
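The deadline tiers above can be encoded as a small lookup for an internal triage checklist. A minimal sketch under the tiers stated in this guide; the category labels are our own shorthand for the statutory cases.

```python
from datetime import date, timedelta

# Maximum reporting windows under Article 73, in days after becoming
# aware of the serious incident. Category labels are our own shorthand.
DEADLINES_DAYS = {
    "death": 10,                    # death of a person
    "critical_infrastructure": 2,   # serious, irreversible disruption
    "widespread_infringement": 2,
    "other_serious_incident": 15,   # general case
}

def report_due_by(category: str, aware_on: date) -> date:
    """Latest date the report must reach the market surveillance authority."""
    return aware_on + timedelta(days=DEADLINES_DAYS[category])

print(report_due_by("death", date(2026, 5, 1)))  # 2026-05-11
```

Remember that these are outer limits: the Act requires reporting immediately once a causal link is established or suspected, so the lookup marks the last safe day, not the target.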
Real World Examples of Incident Reporting
Medical Diagnosis AI Misdiagnosis
A US based healthcare AI is used in a French hospital. The AI misdiagnoses a patient with a serious condition. The misdiagnosis leads to delayed treatment. The patient’s condition worsens.
The hospital, as the deployer, must immediately inform the provider of the serious incident. The provider must then report to the French market surveillance authority without delay, and at the latest within 15 days of becoming aware. The report must describe the incident, the impact on the patient, and the actions taken to mitigate the harm.
The provider must also investigate the cause and take steps to prevent recurrence.
HR AI System Discrimination
A multinational company uses an AI system for recruitment. The system is used in both the US and the EU. The AI discriminates against a group of job applicants. The discrimination is based on gender.
For its EU operations the company, as the deployer, must immediately inform the provider of the discrimination. The provider must report to the relevant market surveillance authorities within 15 days, describing the discrimination, the number of people affected, and the actions taken to address the issue.
The provider must also investigate the cause of the discrimination and take steps to prevent it from happening again.
Credit Scoring AI Bias
A US fintech company provides an AI credit scoring system to European banks. The AI denies loans to a specific demographic. The denial is based on biased data.
The fintech company, as the provider, must report to the market surveillance authorities of the Member States where the incident occurred within 15 days. The report must describe the bias, the impact on the affected individuals, and the actions taken to correct it.
The banks using the AI system, as deployers, must immediately inform the provider, inform the affected individuals where required, and take steps to mitigate the harm.
How to Report an Incident Step by Step
Reporting an incident involves several steps. Here is a detailed guide.
Step 1 Identify the Incident
Determine whether the issue qualifies as a serious incident under Article 73, applying the Article 3(49) criteria. Consider the actual or likely harm and the impact on fundamental rights.
Step 2 Gather the Required Information
Collect all the information listed above. Ensure it is accurate. Ensure it is complete.
Step 3 Submit the Report to the National Authority
Each EU member state designates a market surveillance authority responsible for receiving serious incident reports. Submit your report to the authority of the Member State where the incident occurred.
For US providers with no EU establishment, reporting is handled through your authorized representative in the EU.
Step 4 Investigate the Incident
Conduct a thorough investigation. Identify the cause of the incident. Determine how to prevent recurrence.
Step 5 Take Corrective Actions
Implement measures to mitigate the harm. Implement measures to prevent future incidents. Inform affected individuals if necessary.
Step 6 Document Everything
Keep records of the incident. Keep records of the report. Keep records of the investigation. Keep records of the corrective actions.
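Step 1 of the process above, the triage question, can be sketched as a simple helper that walks through the Article 3(49) criteria. This is illustrative only: the question wording is our own, and it does not replace a legal assessment.

```python
# Triage sketch for Step 1: does an issue meet any Article 3(49)
# criterion for a "serious incident"? Question wording is our own;
# this does not replace a legal assessment.
CRITERIA = {
    "death_or_serious_health_harm":
        "Did it lead to death or serious harm to a person's health?",
    "critical_infrastructure":
        "Did it seriously and irreversibly disrupt critical infrastructure?",
    "fundamental_rights":
        "Did it infringe Union-law obligations protecting fundamental rights?",
    "property_or_environment":
        "Did it cause serious harm to property or the environment?",
}

def is_serious_incident(answers: dict[str, bool]) -> bool:
    """Any single criterion being met makes the incident reportable."""
    return any(answers.get(k, False) for k in CRITERIA)

print(is_serious_incident({"fundamental_rights": True}))  # True
```

Because a single "yes" triggers the reporting duty, the helper uses `any()` rather than requiring multiple criteria, mirroring the "err on the side of caution" advice below.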
Common Mistakes in Incident Reporting
Mistake 1 Failing to Recognize an Incident
Many companies fail to recognize that an issue qualifies as a serious incident. If the issue meets any of the Article 3(49) criteria, it must be reported.
Always err on the side of caution. When in doubt, seek advice and report.
Mistake 2 Missing the Reporting Deadline
Article 73 sets strict deadlines: 15 days in general, 10 days in the event of a death, and 2 days for a widespread infringement or a critical infrastructure disruption.
Missing a deadline is non-compliance and can result in fines.
Mistake 3 Submitting Incomplete or Inaccurate Information
The report must be complete and accurate. Supplying incorrect, incomplete, or misleading information to authorities can lead to rejection of the report and to separate penalties.
Ensure all information is thorough and precise.
Mistake 4 Not Investigating the Incident
Reporting the incident is not enough. Article 73 also requires the necessary investigations, including a risk assessment of the incident and corrective action. You must identify the cause and take steps to prevent recurrence.
Failing to investigate can result in non compliance.
Mistake 5 Not Informing Affected Individuals
In some cases you must also inform affected individuals, especially where the incident involves personal data.
Failing to do so can put you in breach of both the AI Act and the GDPR.
How DilAIg Simplifies Incident Reporting
Reporting an incident can be complex. DilAIg simplifies the process.
Our tool guides you through each step. It helps you determine if an issue qualifies as an incident. It assists in gathering the required information. It generates the necessary report.
For US companies our tool ensures compliance with both US and EU regulations. It flags EU specific requirements. It helps you navigate the complexities of the AI Act.
Here is how it works.
1. Answer a series of questions about the incident: what happened, when it happened, and what the impact was.
2. Our tool analyzes your responses, determines whether the incident must be reported, and identifies the information you need to include.
3. We generate a comprehensive incident report with all the necessary details, ready for submission to the national authority.
4. We provide guidance on investigating the incident and help you develop corrective actions.
Report an incident today. Start Your Incident Report
FAQ Incident Reporting Requirements
Q What constitutes a serious incident under Article 73
A serious incident is an incident or malfunction of an AI system that directly or indirectly leads to death or serious harm to health, a serious and irreversible disruption of critical infrastructure, an infringement of obligations under Union law protecting fundamental rights, or serious harm to property or the environment.
Q Who is responsible for reporting incidents
Providers of high-risk AI systems must report serious incidents to the market surveillance authorities. Deployers must immediately inform the provider, and must report directly only if the provider cannot be reached.
Q What is the deadline for reporting incidents
The general deadline is 15 days from becoming aware of the incident. It shortens to 10 days in the event of a death and to 2 days for a widespread infringement or a serious disruption of critical infrastructure.
Q What information must be included in the incident report
The report must include a description of the incident, the AI system involved, the impact, the actions taken, and contact information.
Q What happens if I do not report an incident
You could face fines of up to €15 million or 3% of your global annual turnover, whichever is higher. You could also face reputational damage and loss of trust.
Q Do I need to report incidents that do not cause harm
Only serious incidents as defined in Article 3(49) must be reported, but note that an infringement of fundamental rights obligations can qualify even without physical injury. Issues that fall short of the definition should still be tracked as part of your post-market monitoring.
Q Can I report an incident anonymously
No. The report must include your contact information. This allows regulators to follow up if necessary.
Key Takeaways
Article 73 of the EU AI Act introduces strict reporting obligations for serious incidents involving high-risk AI systems. These fall primarily on providers, with deployers obliged to inform the provider immediately. A serious incident is an incident or malfunction that directly or indirectly leads to death or serious harm to health, a serious and irreversible disruption of critical infrastructure, an infringement of fundamental rights obligations, or serious harm to property or the environment.
The general reporting deadline is 15 days, shortened to 10 days in the event of a death and 2 days for a widespread infringement or a critical infrastructure disruption. The report must include a description of the incident, the AI system involved, the impact, the actions taken, and contact information.
Common mistakes include failing to recognize an incident and missing the reporting deadline. DilAIg’s tool simplifies incident reporting. It guides you through each step. It generates the necessary report.
Next Steps
Familiarize yourself with the incident reporting requirements: Review the Guidelines
Report any incidents involving your AI systems: Start Your Incident Report
Need help? Book a Demo
Join the Conversation
Have you reported an AI related incident? What challenges did you face? Share your thoughts in the comments or tweet us @DilAIg.
Further Reading
Official EU AI Act Text: Article 73
European Commission Incident Reporting Guidelines
DilAIg's AI Act Compliance Hub
How to Conduct an Incident Investigation
This article is part of DilAIg's AI Act Compliance Series. Next up: AI Act Annex III, High-Risk AI Systems Explained