Fog Creek Software
Discussion Board


Joel on Software

Profiling websites

I have a website that pulls quite a lot of data from remote databases, and parts of it run slowly.

Sometimes it all works fine though!

I wonder if anyone has a tool or technique to help profile the interaction between the browser page, the web server, and the database server?

(Feel free to patronise me; I am not up to speed with this stuff!)

Thursday, January 27, 2005

Using stored procedures can cut down on traffic.
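The saving here comes from round trips: a stored procedure lets the database do many statements' worth of work in response to one call. Stored procedures themselves can't be shown with an in-process database, but a rough Python/sqlite3 sketch of the same round-trip reduction, using batching as a stand-in, looks like this (the table and row counts are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
rows = [(i, i * 1.5) for i in range(100)]

# Chatty approach: one statement (and, over a network, one round trip) per row.
chatty_trips = 0
for row in rows:
    conn.execute("INSERT INTO orders VALUES (?, ?)", row)
    chatty_trips += 1

# Batched approach -- analogous to one stored-procedure call doing all
# the work server-side: a single call carries the whole job.
conn.execute("DELETE FROM orders")
batched_trips = 1
conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

print(chatty_trips, batched_trips)  # 100 1
```

Over a real network link, each of those 100 statements pays latency; the single call pays it once.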

Colm O'Connor
Thursday, January 27, 2005

There is a product called ANTS (from Red Gate) which is excellent.

Thursday, January 27, 2005

Consider caching data that doesn't change very often. A simple example:

Let's say I have a table which contains the countries of the world. Countries don't change much. Given that, it is smarter to store the data on the web server (say, in XML) rather than hit the database every time.

Mike McGrath
Friday, January 28, 2005

Use MS Application Center Test (ACT) to gather performance data on your web application. It comes only with Visual Studio 2003 Enterprise Edition. Or try Web Application Testing (WAPT).

Monday, January 31, 2005

If you only want to avoid returning to the database when a page reloads, ASP.NET can keep the current contents of the form so you don't need to fetch the data again: check the IsPostBack property in the Page_Load event of the .aspx page.

This only works when the page is reloaded following a submit; otherwise IsPostBack will not be True.
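The idea is framework-agnostic: fetch from the database only on the first load, and skip the fetch when the page posts back to itself. A rough sketch in Python, with a hypothetical `page_load` handler standing in for ASP.NET's Page_Load:

```python
db_fetches = 0

def fetch_options_from_db():
    # Placeholder for the real database call made on first load.
    global db_fetches
    db_fetches += 1
    return ["small", "medium", "large"]

def page_load(is_postback, form_state):
    # Mirrors Page_Load checking IsPostBack in ASP.NET: only populate
    # the form from the database when this is NOT a postback.
    if not is_postback:
        form_state["options"] = fetch_options_from_db()
    # On a postback the data already lives in the page's view state,
    # so no database call is made.
    return form_state

state = page_load(is_postback=False, form_state={})    # first load: fetch
state = page_load(is_postback=True, form_state=state)  # postback: reuse
print(db_fetches)  # 1
```

Two page loads, one database hit: the postback reuses what the first load fetched.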

Le Poete
Wednesday, February 2, 2005
