A Blog Under Siege: archive.today Reportedly Directing a DDoS via its CAPTCHA
TL;DR: Multiple reports indicate that archive.today’s CAPTCHA page runs client-side JavaScript that issues frequent, randomized requests to third-party blogs’ search endpoints, effectively generating DDoS-like traffic for as long as the CAPTCHA tab stays open. See sources and discussion at the bottom.
Per the original writeup, the CAPTCHA page includes a small `setInterval()` loop that uses `fetch()` to call a blog’s search URL with a random string every ~300 ms; because each query string is unique, the requests bypass caches and force the target server to do real work every time. You can inspect the reported code sample and network activity in the linked source.
```js
// Snippet as reported in the original write-up: every 300 ms, fetch the
// target blog's search endpoint with a random query string so that each
// request is cache-busting and hits the backend directly.
setInterval(function () {
  fetch("https://example-blog.com/?s=" + Math.random().toString(36).substring(2, 3 + Math.random() * 8), {
    referrerPolicy: "no-referrer", // suppress the Referer header, hiding the origin page
    mode: "no-cors"                // fire-and-forget; the opaque response is discarded
  });
}, 300);
```
Community discussion picked this up quickly: threads on Hacker News and Reddit include user observations, confirmations, and debate about intent and mitigation. See links below.
Timeline summary
- Gyrovague published the first detailed write-up documenting the JavaScript and timeline.
- Community posts on Hacker News and Reddit reproduced observations and discussed mitigation steps and legal/ethical implications.
- Some blocklists and adblock rules were reported to block the offending requests for users with blockers enabled.
Why this matters
Even small client-side loops can weaponize ordinary visitors’ browsers, turning them into sources of repeated requests that overwhelm low-budget blogs. At one request every ~300 ms, each open CAPTCHA tab generates roughly 3 requests per second, so a few hundred concurrent visitors add up to around 1,000 requests per second against a single search endpoint. When archival UIs (e.g., CAPTCHAs, previews) include such patterns, they can create collateral damage well beyond the archive itself.
Quick mitigation checklist
- Rate-limit: Add rate limits to search and other expensive endpoints (return a 429, or a temporary 503, once a client exceeds the limit).
- Lightweight responses: For unknown or random-looking search strings, serve a cached lightweight response or a short 204/404 instead of generating a heavy full page.
- WAF/CDN rules: Use WAF or CDN rules (e.g., Cloudflare, Fastly) to block repeated identical patterns, or to throttle requests that arrive without a normal referrer.
- Logging & evidence: Capture request headers, timestamps, and sample payloads for any abuse reports you file with hosts or registrars.
- Blocklists: Consider recommending DNS or blocklist rules for readers who want to block the offending domains locally (some lists already added entries that prevented the requests for users with blockers).
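If nginx fronts the blog, the same throttling can be pushed down to the web server before the application is ever invoked. A hedged sketch, assuming nginx with the standard `limit_req` module; the zone name, rate, and domain are illustrative:

```nginx
# Rate-limit only requests that carry a non-empty ?s= search query.
# An empty limit key disables limiting, so normal page views are unaffected.
map $arg_s $search_limit_key {
    ""      "";                    # no search query: no limiting
    default $binary_remote_addr;   # search query present: limit per client IP
}

limit_req_zone $search_limit_key zone=search:10m rate=10r/m;

server {
    listen 80;
    server_name example-blog.com;

    location / {
        limit_req zone=search burst=5 nodelay;
        limit_req_status 429;
        # ...proxy_pass / fastcgi_pass to the blog backend...
    }
}
```

Rejecting excess searches at this layer costs a fraction of what rendering a full search results page does, which is exactly the asymmetry a low-budget blog needs against cache-busting traffic.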