Also, to whoever is obviously developing a new Gemini crawler (hitting a new URL every 0.1s): respect the robots.txt. It's a mutual respect thing. Please.
6 months ago · 👍 lufte, justyb
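For anyone building such a crawler, here is a minimal sketch (not anyone's actual code) of fetching a capsule's robots.txt over the Gemini protocol and honouring it before each request. It assumes Python, self-signed (TOFU) certificates as is common in Geminispace, and a hypothetical user-agent name `examplebot`:

```python
import socket
import ssl
from urllib import robotparser

def gemini_fetch(host: str, path: str, port: int = 1965, timeout: float = 10.0) -> str:
    """Fetch a Gemini URL and return the response body as text ('' on non-success)."""
    ctx = ssl.create_default_context()
    ctx.check_hostname = False       # Gemini capsules commonly use self-signed
    ctx.verify_mode = ssl.CERT_NONE  # (TOFU) certificates
    with socket.create_connection((host, port), timeout=timeout) as raw:
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            # A Gemini request is just the absolute URL followed by CRLF.
            tls.sendall(f"gemini://{host}{path}\r\n".encode())
            data = b""
            while chunk := tls.recv(4096):
                data += chunk
    header, _, body = data.partition(b"\r\n")
    if not header.startswith(b"2"):  # 2x status codes mean success in Gemini
        return ""
    return body.decode("utf-8", errors="replace")

def polite_allowed(host: str, path: str, agent: str = "examplebot") -> bool:
    """Check robots.txt before crawling a path on the given capsule."""
    rp = robotparser.RobotFileParser()
    # An empty/missing robots.txt parses to "everything allowed".
    rp.parse(gemini_fetch(host, "/robots.txt").splitlines())
    return rp.can_fetch(agent, path)
```

A polite crawler would call `polite_allowed(host, path)` before each fetch (caching the parsed robots.txt per host) and add a delay between requests rather than hitting new URLs every 0.1s.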
3 Replies
I mean, yeah, I wrote my own rate limiter. But that's not an excuse for people to misbehave (Gemini is small enough) · 6 months ago
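The replier didn't share their implementation, but a per-client rate limiter of the kind mentioned here is often a token bucket. A minimal sketch, with illustrative rates and names:

```python
import time

class TokenBucket:
    """Allow `rate` requests/second, with bursts up to `capacity`."""
    def __init__(self, rate: float, capacity: float):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity
        self.updated = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False

buckets: dict[str, TokenBucket] = {}

def check(client_ip: str) -> bool:
    """Return False when the client should be throttled."""
    bucket = buckets.get(client_ip)
    if bucket is None:
        # Illustrative limits: 1 request/second, bursts of 5.
        bucket = buckets[client_ip] = TokenBucket(rate=1.0, capacity=5.0)
    return bucket.allow()
```

A Gemini server using this could answer throttled clients with status 44 (SLOW DOWN), which tells well-behaved clients how long to wait before retrying.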
ugh, tell me about it. These things get stuck in Gemipedia or NewsWaffle, which have a virtually unlimited number of URLs. It pounds my capsule. Some of these are crawlers in Geminispace, but sometimes it's a web crawler hitting my capsule through @mozz's HTTP-to-Gemini portal (or another one of the handful of public portals out there...) · 6 months ago
not it · 6 months ago