South Africa pulled its AI policy after fake citations broke the draft

Original: South Africa withdraws AI policy due to fake AI-generated sources

AI · Apr 27, 2026 · By Insights AI

A national AI strategy is supposed to set rules for trust. South Africa’s first draft instead failed the most basic test: whether the references were real. Reuters reported on April 27 that the government withdrew the document after fictitious sources in its bibliography appeared to be AI-generated, turning what should have been a policy milestone into a credibility crisis.

The draft had much bigger ambitions than a routine consultation paper. It was meant to position South Africa as a continental leader in AI while setting up the institutions that would shape how the technology is governed. According to the Reuters report, the proposal included a National AI Commission, an AI Ethics Board and an AI Regulatory Authority, alongside incentives such as tax breaks, grants and subsidies to spur private-sector collaboration.

That is why the failure matters. Communications and Digital Technologies Minister Solly Malatsi said the most plausible explanation was that AI-generated citations had been included without proper verification. He also said the lapse was not merely technical and had damaged the integrity and credibility of the draft. The document was pulled not because of a disputed policy detail, but because the process behind it no longer looked reliable.

There is also a sharp symbolic lesson here. Governments around the world are trying to write rules for AI deployment, safety and accountability. If a national AI framework cannot clear basic source verification, every claim inside it becomes harder to defend. That is especially costly when the same draft is asking the public to trust new regulatory bodies and public incentives.

Malatsi said there would be consequences for those responsible and did not provide a date for a replacement draft. For policymakers elsewhere, the message is blunt: generative tools can speed up drafting, but they cannot replace human verification when the document is meant to govern everyone else’s use of AI.



© 2026 Insights. All rights reserved.