Cloudflare puts Dynamic Workers into open beta for sandboxed AI code execution
Original: We’re introducing Dynamic Workers, which allow you to execute AI-generated code in secure, lightweight isolates. This approach is 100 times faster than traditional containers. https://cfl.re/4c2NvPl
What Cloudflare posted on X
On March 24, 2026, Cloudflare introduced Dynamic Workers as a way to run AI-generated code inside secure, lightweight isolates rather than inside heavier container sandboxes. The headline claim in the X post was straightforward: this approach is 100 times faster than traditional containers.
That matters because agent products increasingly need to execute model-written code on demand. The bottleneck is no longer only reasoning quality. It is whether the generated code can run quickly, with tight permissions, and without exposing the host environment to unnecessary risk.
What the blog adds
Cloudflare says the Dynamic Worker Loader API is now in open beta for paid Workers users. The API lets one Worker create another Worker at runtime from code supplied on the fly, giving agent systems a native place to run short-lived, untrusted code safely.
The security model is notable. In Cloudflare’s example, the parent Worker passes the generated code only the APIs it is allowed to use, via env bindings, and direct outbound internet access can be disabled by setting globalOutbound: null. That is a much tighter control surface than dropping model-written code into a general-purpose runtime with broad network access.
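To make the model concrete, here is an illustrative sketch of how a parent Worker might load and invoke generated code. This is an assumption about the API shape based on Cloudflare's published Worker Loader examples, not a copy of official documentation; the binding name LOADER, the cache key, and the SEARCH binding are hypothetical, and the snippet only runs inside the Workers runtime:

```js
// Parent Worker: runs AI-generated code in a locked-down dynamic Worker.
// Assumes a Worker Loader binding named LOADER is configured in wrangler config.
export default {
  async fetch(request, env) {
    const generatedCode = await getModelWrittenCode(request); // hypothetical helper

    // The callback supplies the dynamic Worker's definition; the id acts as a cache key.
    const worker = env.LOADER.get("agent-task-123", async () => ({
      compatibilityDate: "2026-03-24",
      mainModule: "main.js",
      modules: { "main.js": generatedCode },
      // Expose only the capabilities the generated code should have.
      env: { SEARCH: env.SEARCH },
      // Block all direct outbound network access from the sandbox.
      globalOutbound: null,
    }));

    // Call into the dynamic Worker as if it were any other Worker.
    const entrypoint = worker.getEntrypoint();
    return await entrypoint.fetch(request);
  },
};
```

The key design point the article highlights is visible here: the sandbox's entire capability set is whatever the parent chooses to pass in env, and with globalOutbound set to null the generated code cannot reach the internet except through those bindings.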
Cloudflare also makes a strong infrastructure claim. Because Dynamic Workers use the same V8 isolate model that powers Workers, the company says a sandbox can start in a few milliseconds and use only a few megabytes of memory. The post describes this as roughly 100x faster and 10x to 100x more memory-efficient than a typical container. Cloudflare also says the architecture can scale to millions of requests per second without separate global sandbox limits.
Why this matters
The broader signal is that agent infrastructure is moving away from heavyweight, reusable sandboxes and toward per-task, per-request execution environments. That shift is important for products that generate code frequently and cannot afford either long cold starts or weak isolation.
If Cloudflare’s performance and security claims hold up in production, Dynamic Workers could become a more practical default for many code-executing agents than container-based approaches. That would make this release an infrastructure story with direct implications for how AI agents are built, priced, and secured.
Sources: Cloudflare X post · Cloudflare blog post
Related Articles
Cloudflare said on March 19, 2026 that Workers AI now supports Moonshot AI's Kimi K2.5. The company is using the model to argue that a unified agent platform can offer both strong tool use and much lower production cost.
OpenAI on March 11, 2026 detailed how it combines the Responses API with a shell tool and hosted containers to give agents a managed computer environment. The company says the design is meant to make file handling, tool execution, network access, and long-running workflows easier to run in production.
Vercel said on March 19, 2026 that it built Chat SDK to remove the platform-specific plumbing that slowed internal agent rollouts. Vercel’s blog describes an open-source public-beta TypeScript library that lets one bot implementation target Slack, Teams, Google Chat, Discord, Telegram, GitHub, Linear, and now WhatsApp through adapters.