Fog Creek Software
Discussion Board




Test Web Page Download Times

Does anyone know of a tool that will let you test the time it takes to download a complete web page (including images, scripts, and CSS)? As an example, earlier versions of Mozilla used to show "Document: Done (3.20 seconds)" in the status bar; that included the time for everything to download and for the page to render.

Ideally I am looking for something that would let you test a large number of pages (e.g. a full site). Obviously I could write a script to do this, but why reinvent the wheel if one is already out there?
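For what it's worth, the sort of script I have in mind would be something along these lines. This is just a rough Python sketch: the URL is a placeholder, it only picks up resources referenced by simple src/href attributes, and it ignores rendering, parallel fetching, and caching, so it measures raw download time only.

    # Rough sketch: time a page plus its images/scripts/CSS (no rendering, no cache).
    import re
    import time
    from urllib.parse import urljoin
    from urllib.request import urlopen

    PAGE = "http://www.example.com/"     # placeholder; swap in the page to measure

    start = time.time()
    html = urlopen(PAGE).read().decode("latin-1")   # good enough for scraping URLs

    # Crude extraction of dependent resources from src= and href= attributes.
    refs = set(re.findall(r'(?:src|href)\s*=\s*["\']([^"\']+\.(?:gif|jpe?g|png|js|css))["\']',
                          html, re.IGNORECASE))

    total = len(html)
    for ref in refs:
        total += len(urlopen(urljoin(PAGE, ref)).read())   # resolve relative URLs

    print("%d resources, %d bytes, %.2f seconds" % (len(refs), total, time.time() - start))

A real browser fetches resources in parallel and caches repeat visits, so numbers from something like this will run higher than what users actually see.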

Thanks in advance.

billm
Wednesday, November 06, 2002

I think it's Dreamweaver that tells you how long the page will take to download while you're designing it.

nathan
Wednesday, November 06, 2002

FrontPage does that too. I don't know how accurate it is, but it's in the lower right corner of the screen.

tk
Wednesday, November 06, 2002

I've had to do this sort of thing pretty often.

I have some commercial tools that have this feature, but they're probably a hell of a lot more expensive than you want to go for.

Actually, a wonderful tool that will get you 'actual' rather than 'computed' (i.e. inaccurate) download times for a page and all its dependent resources is wget. It has recently become available for the Wintel platforms as well as *nix.

What I have done in the past is run it with its 'verbose' switch on (the default, actually) and capture the normal output (it defaults to stderr, but you can send it to a logfile of your choice with the '--output-file=logfilename' option). The format of this logfile is pretty easy to follow: the page comes first, then all the resources found on that page, then another page, its resources, and so on. It keeps an elapsed real-time stamp with each request, as well as the effective download speed for each element downloaded.

I don't know if I can still find it, but I once wrote a Perl program to read this log and prepare some results I wanted from it. As I said, it's not formatted too badly, and you should be able to knock out a Perl program to extract the info you want without too much effort.
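If it helps, a post-processor along those lines might look like the sketch below (Python here rather than Perl, but the idea is identical). The log format varies between wget versions, so the regexes are an assumption based on logs shaped like the examples in the comment; check what your copy actually writes and adjust.

    # Pull per-resource timings out of a wget log.  ASSUMED log shapes:
    #   --13:05:27--  http://www.example.com/index.html
    #   13:05:29 (64.2 KB/s) - `index.html' saved [8153/8153]
    # (newer wgets put a date in front of the time; adjust the regexes to taste)
    import re
    import sys

    request_re = re.compile(r'^--(?:[\d-]+\s+)?(\d\d:\d\d:\d\d)--\s+(\S+)')
    saved_re = re.compile(r'^(?:[\d-]+\s+)?(\d\d:\d\d:\d\d)\s+\(([\d.]+\s*[KM]?B/s)\).*saved \[(\d+)')

    def seconds(hhmmss):
        h, m, s = (int(x) for x in hhmmss.split(":"))
        return h * 3600 + m * 60 + s

    url, started = None, None
    for line in open(sys.argv[1]):
        m = request_re.match(line)
        if m:
            started, url = seconds(m.group(1)), m.group(2)
            continue
        m = saved_re.match(line)
        if m and url:
            elapsed = seconds(m.group(1)) - started    # only 1-second resolution
            print("%4ds  %9s bytes  %-10s  %s" % (elapsed, m.group(3), m.group(2), url))
            url = None

Feed it the log from a run like 'wget -r -l 1 -p -o wget.log http://www.example.com/' (the -p page-requisites switch is what pulls in the images and stylesheets, if your build has it).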

Of course, if the site in question isn't very large, then you don't need a post-processor program; just read the log directly <g>.

wget also has some timestamp format options you can play with, and a built-in bandwidth-throttling option, though I've never played with it and don't know how realistic it is. It's also a complete spider, with lots of features for adjusting filters and depths; it does FTP, supports various flavors of cookies, lets you alter the user-agent string... the list goes on. It also actually downloads the pages, so you'll get copies on your local box.

I do not believe, however, that it supports a client-side cache, so keep this in mind if you're trying to get a realistic picture of how pages in a site will actually perform. It will, I believe, skew your results higher, because it always fetches every resource, which a real user with caching enabled won't experience. So for a full site, its numbers will be higher than, say, Explorer would generate.

It's a tool I always keep around.

Just guessing your needs here, but if you're interested in doing this over various connection types, you'll start thinking about how to simulate a 56k modem from the box you're running wget on, sitting on your nice 100 Mbit LAN or T-1 WAN. Probably the easiest and most accurate method is simply to load wget onto a laptop and use a dial-up account over a real 56k modem. I've futzed around with various bandwidth-throttling alternatives and found, at the time, that there were problems or artificialities that just made the laptop-and-modem method much easier and better.
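If lugging a laptop around isn't an option, the crude software-only version of the idea is just to cap how fast you drain the socket, which as far as I know is roughly what wget's own throttle switch does. A sketch of that, with the usual caveats: the URL is a placeholder, the 7000 bytes/sec figure is the nominal 56k rate, and it ignores modem latency and compression, which is a big part of why the real-modem numbers are more trustworthy.

    # Crude bandwidth cap: read in small chunks and sleep so the average rate
    # stays near a nominal 56k modem (~7000 bytes/sec; real modems do less).
    # Ignores modem latency and compression, so it's only a rough stand-in.
    import time
    from urllib.request import urlopen

    def throttled_fetch(url, bytes_per_sec=7000, chunk=1400):
        start = time.time()
        total = 0
        resp = urlopen(url)
        while True:
            data = resp.read(chunk)
            if not data:
                break
            total += len(data)
            # Sleep until elapsed time matches what this many bytes "should" take.
            lag = total / float(bytes_per_sec) - (time.time() - start)
            if lag > 0:
                time.sleep(lag)
        return total, time.time() - start

    size, secs = throttled_fetch("http://www.example.com/")   # placeholder URL
    print("%d bytes in %.1f seconds" % (size, secs))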

Another free alternative for timing, which has the advantage of accounting for client-side caching behavior, is OpenSTA. It's on SourceForge, and is designed to be a CORBA-based distributed performance-testing tool. It has its advantages and disadvantages, which are too long a topic to go into here. But it also records timings for all resources in a very easy-to-read, easy-to-parse script file, just like wget does. The difference is that it sits as a local proxy in front of your actual Netscape or IE browser, so whatever your browser's caching options are set to is what gets used. The timings from OpenSTA scripts will therefore reflect the effects of client-side caching, and may give you a better overall picture of actual page performance than wget will, especially because, being a testing tool rather than a spider, it lets you explicitly control which pages you hit, and you can put the thing in a loop, collecting stats over time for a page.
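To make the proxy idea concrete (this is a toy illustration of the general technique, not how OpenSTA itself is built): point the browser's proxy setting at something like the sketch below, and every request the browser actually sends, i.e. everything its cache doesn't satisfy, gets timed on the way through. It handles plain GETs only and passes no headers along, so it's strictly a demonstration.

    # Toy timing proxy: set the browser's HTTP proxy to 127.0.0.1:8888 and each
    # request it really makes (cache misses included) gets logged with its size
    # and elapsed time.  GET-only, no header passthrough, no HTTPS.
    import time
    from http.server import BaseHTTPRequestHandler, HTTPServer
    from urllib.request import urlopen

    class TimingProxy(BaseHTTPRequestHandler):
        def do_GET(self):
            start = time.time()
            body = urlopen(self.path).read()    # a proxy sees the absolute URL here
            print("%7.3fs  %8d bytes  %s" % (time.time() - start, len(body), self.path))
            self.send_response(200)
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

        def log_message(self, *args):           # silence the default request logging
            pass

    HTTPServer(("127.0.0.1", 8888), TimingProxy).serve_forever()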

So, there's another option for you to kick around. You may want, BTW, to keep OpenSTA around as a debugging tool. I won't go into details here, but it captures a hell of a lot of info about the HTTP 'conversation', as well as the resulting page structure (i.e. the DOM), any of which might prove useful in figuring out what's really going on.

HTH, cheers,

anonQAguy
Thursday, November 07, 2002

You can build one pretty easily: use the browser control in Visual Basic and time the interval between starting navigation and the DocumentComplete event. Takes about an hour.
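If you'd rather not fire up VB, the same trick works from a script. Here's a rough Python equivalent driving the IE control over COM; it assumes Windows with the pywin32 package installed, the URL is a placeholder, and DocumentComplete fires once per frame, so frameset pages need a bit more bookkeeping than this.

    # Time from Navigate until the browser's DocumentComplete event fires.
    import time
    import pythoncom
    import win32com.client

    class IEEvents:
        finished = False                        # flag the event handler flips

        def OnDocumentComplete(self, pDisp, URL):
            IEEvents.finished = True

    ie = win32com.client.DispatchWithEvents("InternetExplorer.Application", IEEvents)
    ie.Visible = True

    start = time.time()
    ie.Navigate("http://www.example.com/")      # placeholder URL
    while not IEEvents.finished:
        pythoncom.PumpWaitingMessages()         # let COM deliver the browser events
        time.sleep(0.05)
    print("Document complete in %.2f seconds" % (time.time() - start))
    ie.Quit()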

Kevin
Monday, November 11, 2002

This page might help:

http://holovaty.com/tools/getcontentsize/

Caveat:  I'm not sure if that script gives accurate results.  I tested it on a few pages, and the numbers seemed much smaller than I would have expected, based on the sizes of the images involved.  I sent an e-mail to the guy who wrote the script, but so far I haven't heard back from him.
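One way to double-check its numbers is to total things up yourself from the Content-Length headers. Here's the kind of quick sketch I mean; the URL is a placeholder, it only looks at img tags, and it trusts the server to report Content-Length, so it's approximate.

    # Rough cross-check of a page's size: page bytes plus the Content-Length of
    # each image it references.  Ignores CSS/JS and trusts the server's headers.
    import re
    from urllib.parse import urljoin
    from urllib.request import Request, urlopen

    PAGE = "http://www.example.com/"            # placeholder URL

    html = urlopen(PAGE).read()
    total = len(html)
    imgs = re.findall(rb'<img[^>]+src\s*=\s*["\']?([^"\' >]+)', html, re.IGNORECASE)
    for src in set(imgs):
        url = urljoin(PAGE, src.decode("latin-1"))
        size = urlopen(Request(url, method="HEAD")).headers.get("Content-Length")
        print("%10s  %s" % (size or "?", url))
        if size:
            total += int(size)
    print("Total (page + images): %d bytes" % total)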

J. D. Trollinger
Monday, November 11, 2002
