Fog Creek Software
Discussion Board


Reading some stuff on Werner Vogels' blog got me thinking about Denial of Service & RSS.

Assuming a website has quite a few RSS feeds, could a large number of *users* pulling all of those feeds simultaneously bring the site down?

Is this even possible?

How could this be prevented?


Prakash S
Friday, April 30, 2004

Slashdot prevents you from grabbing their feed more than once per hour. If you do, you get banned for an amount of time.
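A per-IP limit like Slashdot's can be sketched in a few lines. The one-hour window and the ban length here are my assumptions, not Slashdot's actual values:

```python
import time

WINDOW = 3600      # assumed: one fetch allowed per hour
BAN = 7200         # assumed: two-hour ban for violators

last_fetch = {}    # ip -> timestamp of last allowed fetch
banned_until = {}  # ip -> timestamp when the ban expires

def allow_fetch(ip, now=None):
    """Return True if this IP may fetch the feed right now."""
    now = time.time() if now is None else now
    if banned_until.get(ip, 0) > now:
        return False               # still serving out a ban
    last = last_fetch.get(ip)
    if last is not None and now - last < WINDOW:
        banned_until[ip] = now + BAN   # fetched too soon: ban them
        return False
    last_fetch[ip] = now
    return True
```

In a real server you'd put this state somewhere shared (and expire old entries), but the logic is the same.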

Friday, April 30, 2004

The RSS feeds are just XML documents.  Do your sites crash when lots of people simultaneously pull down index.html?

It's just yet another file, no magic DoS powers. Make sure your site is cache-friendly and can handle the load, and you'll be fine.

Friday, April 30, 2004

Recently, on Wired News: "Will RSS Readers Clog the Web?"

It seems some hungry RSS readers don't ask the host whether the file has changed since the last request. They just ask to download the same file, all of it, every time.

Saturday, May 1, 2004

Yes, but those readers are very few and far between. All the most popular readers do support conditional GETs.
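A conditional GET just means remembering the Last-Modified (or ETag) value from the previous fetch and sending it back as If-Modified-Since (or If-None-Match). A minimal client-side sketch using Python's standard library (the function names are mine, not from any particular reader):

```python
import urllib.request
import urllib.error

def build_conditional_request(url, last_modified=None, etag=None):
    """Build a GET request that sends conditional headers if we have them."""
    req = urllib.request.Request(url)
    if last_modified:
        req.add_header("If-Modified-Since", last_modified)
    if etag:
        req.add_header("If-None-Match", etag)
    return req

def fetch_feed(url, last_modified=None, etag=None):
    """Return (body, last_modified, etag); body is None on a 304."""
    req = build_conditional_request(url, last_modified, etag)
    try:
        resp = urllib.request.urlopen(req)
    except urllib.error.HTTPError as e:
        if e.code == 304:          # server says unchanged: skip the download
            return None, last_modified, etag
        raise
    return (resp.read(),
            resp.headers.get("Last-Modified"),
            resp.headers.get("ETag"))
```

A well-behaved reader stores the returned validators between polls, so most polls cost the server one header comparison and no body at all.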

Brad Wilson
Saturday, May 1, 2004

It's simply traffic, like any normal request.

I think the issue may be: if you're a real smartie pants, then you may have, say, 10,000 people reading your blog, ok?

Most of these people are blog addicts; they use RSS readers, not a browser. Most of these readers fetch the feed once an hour by default. Which means instead of visiting a site once a day, they're hitting it 24 times a day. Now multiply that by x,000 or x0,000 people doing this to your blog.

I doubt it's a real issue anyway; it's not a huge amount of bandwidth, really. I'd imagine 99.999% of blogs could handle a hell of a lot of readers on a DSL line.
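A quick back-of-the-envelope check on the worst case, assuming a 20 KB feed (my number) and every fetch downloading the full file:

```python
readers = 10_000
fetches_per_day = 24             # hourly polling
feed_kb = 20                     # assumed average feed size

kb_per_day = readers * fetches_per_day * feed_kb
kbps = kb_per_day * 8 / 86_400   # average kilobits per second

print(round(kbps))               # prints 444
```

A few hundred kbit/s averaged over the day is borderline for a 2004 DSL upstream, but that's the pathological case; with conditional GETs the real number is near zero.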

Sunday, May 2, 2004

10k hits an hour to static content is clearly no sweat. Since 99.999999999999% of the time you'll be serving back "304 Not Modified", it's REALLY no sweat.
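The reason a 304 is so cheap: the server compares one header against the feed's modification time and skips the body entirely. A sketch of that server-side check (the function and parameter names are made up for illustration):

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime

def feed_response(if_modified_since, feed_mtime, feed_body):
    """Return (status, headers, body) for a feed request.
    feed_mtime is a timezone-aware datetime of the feed's last change."""
    headers = {"Last-Modified": format_datetime(feed_mtime, usegmt=True)}
    if if_modified_since:
        client_time = parsedate_to_datetime(if_modified_since)
        if feed_mtime <= client_time:
            return 304, headers, b""      # unchanged: send headers, no body
    return 200, headers, feed_body
```

Any real web server or framework does this for you on static files; the sketch just shows how little work is involved.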

Centralized subscription services like Bloglines or NewsGator Online Services reduce this even further, since everybody shares a single copy of the data.

Brad Wilson
Sunday, May 2, 2004

thanks guys

Prakash S
Sunday, May 2, 2004

yeah, except the freaking Kinja bot keeps hitting my site a few times a day, never gets a 304 response, and keeps trying to download a non-existent Atom feed.

at least it's supposed to honor robots.txt. i might have to stick them in there.
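If the bot really does honor robots.txt, blocking it takes two lines. The user-agent token below is a guess on my part; check your access logs for the exact string Kinja's crawler sends:

```
User-agent: Kinja
Disallow: /
```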

Monday, May 3, 2004
