The provisional agreement postpones the enforcement of regulations on AI systems deemed high risk while adding new restrictions on AI-generated sexually explicit content.
EU country representatives and parliament legislators agreed to delay enforcement of the EU AI Act under the bloc’s “Digital Omnibus” simplification initiative, according to an official statement from the European Parliament. The agreement also eases restrictions on manufacturing companies while adding a ban on AI systems that generate unauthorized sexual images.
Here’s what changed:
- High-Risk AI System Rules
Before: Requirements for “high-risk” AI systems were slated to take effect August 2, 2026. “High-risk” systems are defined as those that could pose health and safety risks or endanger fundamental rights. Developers of these systems would be required to implement risk management systems, maintain technical documentation, ensure human oversight, and meet testing and monitoring obligations.
After: The agreement delays application of these requirements until December 2, 2027. Notably, manufacturing companies will, for the most part, be exempt, as existing EU product safety legislation already imposes comparable requirements.
- Watermarking requirements for AI-generated content become enforceable on December 2, 2026, earlier than the previously planned February 2, 2027.
- “Nudifier” and child sexual abuse AI apps banned. AI systems that generate child sexual abuse material or undress identifiable people without their consent will be illegal, with enforcement starting December 2, 2026.
The changes are part of the European Commission’s broader effort to simplify EU digital regulations after businesses and some member states argued overlapping compliance requirements created legal uncertainty and excess administrative red tape. Negotiations had previously stalled over whether AI systems already regulated under sector-specific laws should also remain subject to the AI Act.
The provisional agreement requires formal approval from the European Parliament and EU member states before taking effect.