EU Strikes Political Deal on AI Act Omnibus: High-Risk Deadlines Extended to 2027–2028

AI | May 7, 2026 | By Insights AI

The European Parliament and the Council of the EU reached a political agreement on May 7, 2026, to amend and simplify the AI Act, extending high-risk AI compliance deadlines by up to two years and adding new bans on AI-generated harmful content.

Deadline Extensions

The most consequential change is the delayed compliance timeline for high-risk AI. Stand-alone high-risk systems — covering biometrics, critical infrastructure, education, employment, law enforcement, and border management — must now comply by December 2, 2027, pushed back from August 2026. High-risk AI embedded in regulated products (machinery, medical devices, toys) has until August 2, 2028. For companies that had not yet begun compliance work, this is a significant reprieve.

New Prohibitions

The omnibus deal adds explicit bans on AI-generated non-consensual intimate imagery (NCII) and child sexual abuse material (CSAM). Compliance with these provisions is required by December 2, 2026 — well before the high-risk deadlines. The ban sets a global precedent for AI-generated harmful content legislation.

Business-Friendly Simplifications

The agreement extends SME exemptions to small mid-cap companies, reduces overlaps between the AI Act and sectoral regulations, expands access to EU-level regulatory sandboxes, and delays AI watermarking obligations to December 2026. The Commission's AI Office receives enhanced enforcement powers, and exempted high-risk systems must register in a mandatory database.

What Comes Next

Formal adoption is expected before August 2, 2026. The rules apply to all companies operating in the EU, including US entities. Industry group CCIA criticized the deal for lacking clear innovation exemptions, while most enterprise AI teams welcomed the extended runway. Source: IEU Monitoring.

