BubbleUp Forum
General Category => General Discussion => Topic started by: RolandoStu on May 21, 2025, 10:33:56 am
-
Over the past few months, instead of working on our priorities at SourceHut, I've spent anywhere from 20-100% of my time in any given week mitigating hyper-aggressive LLM crawlers at scale. These bots crawl everything they can find, robots.txt be damned, including expensive endpoints like git blame, every page of every git log, and every commit in every repo. They do so using random User-Agents that overlap with end users, coming from tens of thousands of IP addresses - mostly residential, in unrelated subnets, each one making no more than one HTTP request over any time period we tried to measure - actively and maliciously adapting and blending in with end-user traffic, defeating attempts to characterize their behaviour or block their traffic.
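For context, robots.txt is the convention these crawlers are ignoring: a plain-text file at the site root listing paths that well-behaved bots are asked to skip. A minimal sketch of what such a file might look like for the expensive endpoints mentioned above (the /blame/, /log/, and /commit/ paths are illustrative guesses, not SourceHut's actual URL layout):

  User-agent: *
  Disallow: /blame/
  Disallow: /log/
  Disallow: /commit/

The point of the post is that this accomplishes nothing here: compliance is voluntary, and the crawlers described simply don't honor it.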