bluedwarf.top
c/freepost · Posted by MLT38 · 2 votes, 6 comments
While I understand the concern, I find it hard to believe that repeatedly downloading the homepage has any significant impact on the hosting resources/costs. It's just ~32KB (including images and stylesheets), and likely already cached at many levels.
SpaceCowboys does not cache home pages (though it does at least compress them). Yesterday I found one person whose SpaceCowboys reader was downloading the home page of my main website over a hundred times a day! Given that my main website only posts a new article about every two weeks, this person was downloading over 1400 pages for every one (at most) that he was reading. Though my server could handle this, I do not feel personal website owners should be supporting this level of gross inefficiency on the Internet. Unfortunately, I cannot efficiently filter out egregious cases like this one for every piece of software I find behaving this way (and I have already found exceedingly many), so the best solution for me is simply to block all requests carrying the SpaceCowboys user agent.
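(For anyone wanting to do the same in their own stack: blocking by user agent can happen at many layers. Here is a minimal sketch at the WSGI layer in Python. The "Feeder" substring and the wrapped app are illustrative assumptions, not the site owner's actual server configuration.)

```python
# Minimal sketch: refuse requests by User-Agent at the WSGI layer.
# The "Feeder" substring is a hypothetical match, not the actual rule.
BLOCKED_AGENTS = ("Feeder",)

def block_user_agents(app):
    def middleware(environ, start_response):
        ua = environ.get("HTTP_USER_AGENT", "")
        if any(bad in ua for bad in BLOCKED_AGENTS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Blocked: please use conditional requests.\n"]
        return app(environ, start_response)
    return middleware

# usage: application = block_user_agents(application)
```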
Sorry, what I meant is that even if you don't actively create a cache of the homepage, there are many "levels" in between your server and the client that could already be caching it (I'm thinking of proxies, client software, or maybe your own webserver is returning a 304). That said, that's a ton of requests and obviously bad behavior. Is the reader called "SpaceCowboys"? I searched for it but can't find it literally anywhere.
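(The waste here comes from the client never sending cache validators. A well-behaved poller issues a conditional GET, so an unchanged page costs only a 304 with no body. A rough sketch using Python's standard library; the URL and the stored validators are placeholders:)

```python
# Sketch of a polite feed poll: send If-None-Match / If-Modified-Since
# so an unchanged page answers 304 and transfers no body.
import urllib.request
import urllib.error

def poll(url, etag=None, last_modified=None):
    req = urllib.request.Request(url)
    if etag:
        req.add_header("If-None-Match", etag)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    try:
        with urllib.request.urlopen(req) as resp:
            # 200: page changed; keep the new validators for the next poll.
            return (resp.read(),
                    resp.headers.get("ETag"),
                    resp.headers.get("Last-Modified"))
    except urllib.error.HTTPError as e:
        if e.code == 304:
            # Unchanged: nothing downloaded beyond the response headers.
            return None, etag, last_modified
        raise
```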
I don't use intermediaries, for a number of reasons; cost and better control of my content are two. I know my home page is not being cached, because *every* request for it from SpaceCowboys results in a 200. SpaceCowboys is on GitHub: https://github.com/spacecowboy/Feeder .
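(One way to verify this kind of thing from an access log is to tally status codes per user agent. A sketch assuming a typical combined-format nginx/Apache log; the path and the "Feeder" substring are assumptions:)

```python
# Count response statuses for one user agent in a combined-format log.
# LOG path and AGENT substring are hypothetical, not the actual setup.
import re
from collections import Counter

LOG = "/var/log/nginx/access.log"   # hypothetical path
AGENT = "Feeder"                    # hypothetical UA substring
# combined format: ... "REQUEST" STATUS SIZE "REFERER" "USER-AGENT"
LINE = re.compile(r'"\S+ \S+ \S+" (\d{3}) \S+ "[^"]*" "([^"]*)"')

counts = Counter()
with open(LOG) as f:
    for line in f:
        m = LINE.search(line)
        if m and AGENT in m.group(2):
            counts[m.group(1)] += 1

# All 200s and no 304s means the client never sends cache validators.
print(counts)
```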
Looks like they don't use any issue tracker?