Cloudflare says its network crossed 500 Tbps as AI crawling reshapes Internet traffic

Original: “Cloudflare’s global network has officially crossed 500 Tbps of external capacity, enough to route more than 20% of the web and absorb the largest DDoS attacks ever recorded.” https://cfl.re/3Ob4hmw

AI · Apr 10, 2026 · By Insights AI · 2 min read

What Cloudflare announced

On April 10, 2026, Cloudflare said its global network passed 500 Tbps of external capacity. The company defines that figure as total provisioned external interconnection capacity across transit providers, private peers, Internet exchanges, and Cloudflare Network Interconnect ports in 330+ cities. It is not peak daily traffic. Cloudflare explicitly describes the unused headroom as part of its DDoS budget.

The headline number matters because it ties together security, application delivery, and the changing traffic patterns of the AI era. Cloudflare says it now protects more than 20% of the web, which makes network scaling a product story as much as an infrastructure story.

Why the extra headroom matters

Cloudflare connects the milestone directly to threat volume. The post says the company mitigated a 31.4 Tbps DDoS attack in 2025 and blocked more than 5,000 attacks on the same day, with no engineer paged. The architecture described in the post uses XDP, l4drop, the dosd daemon, and the Quicksilver distributed KV system, so mitigation rules can propagate across the network within seconds without shipping traffic to a centralized scrubbing center. A simplified sketch of that flow follows the list below.

  • Packets are filtered at line rate before they consume application CPU.
  • Mitigation decisions are distributed across servers in the targeted colo.
  • The same edge footprint is designed to absorb attacks and keep developer workloads available.
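
The post names production components (XDP, l4drop, dosd, Quicksilver) without showing code, so the sketch below is a toy model of the flow under stated assumptions, not Cloudflare's implementation: `RuleStore` stands in for a replicated KV store like Quicksilver, `detect_signature` for a dosd-style detector, and `drop_filter` for a compiled XDP/l4drop rule. Every name and threshold here is invented for illustration.

```python
import time
from dataclasses import dataclass, field

# Toy stand-in for a replicated KV store like Quicksilver (hypothetical API).
@dataclass
class RuleStore:
    rules: dict = field(default_factory=dict)  # rule_id -> match fields

    def publish(self, rule_id, match):
        # In a real system this write would replicate to every edge colo
        # within seconds; here it is just a local dict update.
        self.rules[rule_id] = {**match, "published_at": time.time()}

def detect_signature(packet_sample):
    """Toy dosd-style detector: flag any (proto, dst_port) pair that
    dominates the sample. Real detectors compute far richer signatures."""
    counts = {}
    for pkt in packet_sample:
        key = (pkt["proto"], pkt["dst_port"])
        counts[key] = counts.get(key, 0) + 1
    threshold = len(packet_sample) * 0.8  # 80% of the sample matches one pattern
    return [key for key, n in counts.items() if n >= threshold]

def drop_filter(pkt, store):
    """Stand-in for an XDP/l4drop check: drop before any application work."""
    return any(
        pkt["proto"] == r["proto"] and pkt["dst_port"] == r["dst_port"]
        for r in store.rules.values()
    )

# One edge server samples traffic, detects a flood, and publishes a rule;
# every other server sees the rule via the (replicated) store and drops inline.
store = RuleStore()
flood = [{"proto": "udp", "dst_port": 53, "src": f"198.51.100.{i}"} for i in range(90)]
legit = [{"proto": "tcp", "dst_port": 443, "src": "203.0.113.7"} for _ in range(10)]

for proto, port in detect_signature(flood + legit):
    store.publish(f"rule-{proto}-{port}", {"proto": proto, "dst_port": port})

dropped = sum(drop_filter(p, store) for p in flood + legit)
print(f"dropped {dropped} of {len(flood) + len(legit)} packets")  # dropped 90 of 100
```

The design point the post emphasizes is that the rule ends up at every edge server, so there is no detour to a scrubbing center: dropping happens wherever the packet arrives.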

What AI changes

The post also makes an important AI-era claim: Cloudflare says AI crawlers, model training pipelines, and autonomous agents now account for more than 4% of all HTML requests on its network, and that so-called “user action” crawling grew more than 15x in 2025. That changes the traffic mix the Internet has to handle. Browsers typically load a page and stop; AI crawlers often fetch linked resources aggressively and continuously.
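
That access-pattern difference is easy to see in a toy loop. The sketch below is illustrative only and invents an in-memory site structure: a browser-style fetch stops after one page, while a crawler-style loop keeps feeding every discovered link back into its frontier.

```python
from collections import deque

# Toy in-memory "site": page -> links it contains (invented for illustration).
SITE = {
    "/": ["/a", "/b"],
    "/a": ["/b", "/c"],
    "/b": ["/c"],
    "/c": [],
}

def browser_visit(start):
    """Browser-style access: fetch one page and stop."""
    return [start]

def crawler_visit(start):
    """Crawler-style access: follow every discovered link, breadth-first,
    until the frontier is exhausted."""
    frontier, seen, fetched = deque([start]), {start}, []
    while frontier:
        page = frontier.popleft()
        fetched.append(page)
        for link in SITE[page]:
            if link not in seen:
                seen.add(link)
                frontier.append(link)
    return fetched

print(browser_visit("/"))  # ['/'] -- one request
print(crawler_visit("/"))  # ['/', '/a', '/b', '/c'] -- every reachable page
```

Put a revisit schedule on top of that loop and the load becomes continuous rather than one-shot, which is the traffic pattern the post describes.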

Cloudflare says it distinguishes legitimate AI crawling from attack traffic using verified bot IP ranges, TLS fingerprinting, behavioral analysis, and robots.txt compliance signals. That detail is significant because the AI web is creating a classification problem, not just a capacity problem.
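
The post lists those signal families but not how they are combined, so the sketch below is only a guess at the shape of such a classifier, not Cloudflare's logic. The IP range, JA3-style fingerprint hash, and every threshold are invented for illustration.

```python
import ipaddress

# All values below are invented; real verified-bot data would come from
# published IP ranges and observed TLS fingerprints.
VERIFIED_BOT_RANGES = [ipaddress.ip_network("192.0.2.0/24")]
KNOWN_CRAWLER_JA3 = {"c279b0189edb9269da7bc43dea5e0c39"}  # hypothetical hash

def classify(req):
    """Combine the four signal families named in the post into one label."""
    ip = ipaddress.ip_address(req["ip"])
    from_verified_range = any(ip in net for net in VERIFIED_BOT_RANGES)
    known_fingerprint = req["ja3"] in KNOWN_CRAWLER_JA3
    well_behaved = req["respects_robots"] and req["req_per_sec"] < 10  # behavioral

    if from_verified_range and known_fingerprint and well_behaved:
        return "verified-ai-crawler"
    if req["req_per_sec"] >= 100:  # flood-like behavior
        return "likely-attack"
    return "unverified-automation"

print(classify({"ip": "192.0.2.10",
                "ja3": "c279b0189edb9269da7bc43dea5e0c39",
                "respects_robots": True,
                "req_per_sec": 3}))  # verified-ai-crawler
```

The point of combining signals is that no single one is trusted alone: a user agent is trivially spoofed, but simultaneously matching a verified IP range, a known fingerprint, and compliant behavior is much harder to fake.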

Why this is high-signal

The deeper signal is that large network operators are now talking about AI traffic as a first-order systems design issue. Cloudflare is not presenting 500 Tbps as a vanity metric. It is positioning the number as the buffer that lets it absorb extreme DDoS events, run edge compute, and handle a web where agents and crawlers generate a growing share of requests. For builders, the takeaway is straightforward: the AI stack is not only models and apps. It also depends on network infrastructure that can classify, absorb, and route a very different kind of demand.

Sources: Cloudflare X post · Cloudflare blog


Related Articles

Cloudflare Replaces HTML Agent Errors with RFC 9457 Markdown and JSON
AI · Mar 15, 2026 · 2 min read

Cloudflare said on March 11, 2026 that it now returns RFC 9457-compliant Markdown and JSON error payloads to AI agents instead of heavyweight HTML pages. In a same-day blog post, the company said the change cuts token usage by more than 98% on a live 1015 rate-limit response and turns error handling into machine-readable control flow.
