Fog Creek Software
Discussion Board

Joel on Software

ASP.NET pages on server unavailable?

We have a Windows 2000 web server that runs a variety of sites, most of which are straight HTML or ASP.

We have recently added a couple of new ASP.NET sites that use ADO to connect to a SQL database on another server. A couple of times a week, these new ASP.NET sites suddenly become unavailable (the client browser times out).

It's not the entire site, either. The welcome page is pure HTML and never has a problem, but as soon as I click on the login button (which is an ASPX page), it times out.

Like I said, it happens a couple of times a week. The fix is to reboot the web server, but there has to be an explanation for what is going on.

Any ideas you might have or directions I should start looking in would be greatly appreciated.
Brad

BradC
Thursday, November 11, 2004

I would start by checking the error logs, if any. That should give you some clues.

Nemesis [µISV]
Thursday, November 11, 2004

Check if your connection pool to the SQL Server is all used up and you're getting timeouts on new connections.  Often this happens if you're using ADO objects and leaking connections (not closing, disposing, etc.).
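For future readers: the usual fix for leaked ADO.NET connections is to wrap them in using blocks so they always get closed and returned to the pool, even when an exception is thrown. A minimal sketch (the connection string and query are placeholders, not code from the thread):

```csharp
using System.Data.SqlClient;

public class QueryHelper
{
    // The using blocks guarantee Dispose runs on every code path, so the
    // connection always goes back to the pool instead of leaking.
    public static void RunQuery(string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand("SELECT 1", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // process each row here
                }
            }
        } // conn.Dispose() returns the connection to the pool here
    }
}
```

If connections are opened and never disposed, the pool (100 connections by default) fills up and new requests block until they time out, which matches the symptoms described above.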

smallbiz
Thursday, November 11, 2004

Thanks, guys.
Here's an update.

One of the two ASP.NET sites is a pretty straightforward site that uses a database to store the page content separately from the layout and style, and allows customers to pull up information on several different services.

The OTHER one, however, is a web app designed to query a very LARGE database and return query results either on the page itself, or by means of a downloadable XLS file.

Turns out I didn't have anything in the second application preventing users from running a query against the entire recordset (over 280,000 records). Any query that returned about 35,000 rows or more was locking up the aspnet_wp.exe process, and the client page was timing out before any results were returned.

Rebooting fixed the problem because it killed the aspnet_wp.exe process.
I fixed it temporarily by hard-coding a limit of 32,000 into the query results, which processes and returns a page in a little over 2 minutes.

SO, my next problem: (several problems, actually)
1) How do I keep big giant queries in one ASP.NET app from affecting the performance of my other applications?
2) How can I show users running queries a "query in progress" page while data is being retrieved? Any easy way to actually put a progress bar on it?
3) Is there any way to ALLOW retrieval of very large data sets without the client timing out? My client says that he would like to push the limit up to at least 65,000.
4) Are there better ways of retrieving large datasets than how I'm doing it?
(I'm giving the user a choice--display the results in an on-page table with optional paging, or download as XLS. To download as XLS, I'm taking the HTML of the results table, pushing it into a datastream, and rebadging it as XLS. Excel is smart enough to open it correctly even though it's really only an HTML file with a giant table inside.)
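For reference, the rebadging trick in (4) comes down to a couple of response headers. A hedged sketch of the ASP.NET side (the table HTML variable is a placeholder for the rendered results):

```csharp
// Sketch of the "HTML table rebadged as XLS" download described above.
// Excel opens the file because it recognizes the HTML content inside.
string resultsTableHtml = "<table><tr><td>...</td></tr></table>"; // placeholder

Response.Clear();
Response.ContentType = "application/vnd.ms-excel";
Response.AddHeader("Content-Disposition", "attachment; filename=results.xls");
Response.Write(resultsTableHtml);
Response.End();
```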

I know that these are not easy questions, but any ideas you might have would be greatly appreciated.

BradC
Friday, November 12, 2004

"1) How do I keep big giant queries in one ASP.NET app from affecting the performance of my other applications?"

The answer, unfortunately, is to run Windows Server 2003 and use application pools to isolate the applications from each other.

"2) How can I show users running queries a "query in progress" page while data is being retrieved? Any easy way to actually put a progress bar on it?"

The progress bar would be meaningless, since you don't know how long it's really going to take, but for an example of progress pages, do a flight search on Expedia. What they're essentially doing is returning an HTML page that looks like it's doing something (i.e., image motion) while it sits and waits for results.
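One way to sketch that Expedia-style pattern: kick off the query in the background under a job ID, and serve a holding page that re-requests itself until the results show up in the Cache. Everything below is a hypothetical sketch (the job-ID scheme, Cache key, and BindResultsGrid helper are all made up for illustration):

```csharp
// Code-behind for a hypothetical CheckResults.aspx "please wait" page.
private void Page_Load(object sender, System.EventArgs e)
{
    string jobId = Request.QueryString["job"];
    DataSet results = (DataSet)Cache[jobId]; // null until the background query finishes

    if (results == null)
    {
        // Not done yet: emit a page that polls again in 5 seconds.
        Response.Write("<html><head>");
        Response.Write("<meta http-equiv='refresh' content='5'>");
        Response.Write("</head><body>Query in progress...</body></html>");
        Response.End();
    }
    else
    {
        BindResultsGrid(results); // hypothetical helper that renders the data
    }
}
```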

"3) Is there any way to ALLOW retrieval of very large data sets without the client timing out? My client says that he would like to push the limit up to at least 65,000."

How could a user possibly view 65 THOUSAND results in an intelligent way, on a single page? I don't get it.

Brad Wilson
Friday, November 12, 2004

Not that they would use those results on the web page, but they would want to return a query as a downloadable XLS file and play with the data on their local machine.

At this time, however, I'm doing this really inefficiently. The web server is actually retrieving the results in HTML table format, which it then puts into a datastream for download. So most of the work is done by the web server.

Is there a way I can just submit a command to the SQL server for IT to output the results in a downloadable format, maybe by FTP?

BradC
Friday, November 12, 2004

Isn't it quaint how 280,000 records is a very large database. Try a few billion rows, then we can talk large ;-)

I'm not sure there is a good solution to this. Downloading a dynamically generated .xls with tens of thousands of rows is going to take a long time.

Is it possible that your client can connect to the database directly? You can run the SQL Server tools (Query Analyzer, etc.) over the internet by connecting to an IP address. This would allow them to access the data, probably through a view or a stored procedure, as quickly as possible. Once they have the data they want, QA can easily save it as .CSV, etc., to open in Excel.

Maybe this is not possible, but it might be worth considering, as you'd remove the load from the ASP.NET server.

Nemesis [µISV]
Friday, November 12, 2004

Sorry my database is not as big as yours. I compensate by driving a sports car :)

Well, the whole design of the web app was to provide a simple "Query Wizard" with some predefined options instead of allowing full-blown do-whatever-you-want access to the data.

I wonder if there is a way to get SQL to export results directly to a file share or FTP site, then just have the web page point to that file? That might be an ideal compromise.

BradC
Friday, November 12, 2004

You can get SQL to dump data to a file using a "BCP out" operation. However, a more modern alternative would be to use DTS, I suppose (showing my age by knowing about BCP, I suspect).
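For the record, a BCP export from the command line looks roughly like this (server name, database, and query are placeholders; -c writes plain character data, -t, makes it comma-separated, and -T uses a trusted connection):

```shell
bcp "SELECT CustomerID, Name FROM MyDb.dbo.Customers" queryout results.csv -c -t, -S MYSERVER -T
```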

However, it would not normally be under the control of a web page. It could be, but that'd be a non-trivial problem to solve. I believe it is possible to set up and run a DTS job programmatically, so it should be possible, but not easy.

The page could accept parameters, then create the DTS job and execute it, which would run the query and dump the results into a file. You might have synchronisation problems, but it should work.

BTW, which sports car do you drive? I worked for a telecoms company, so billion-row tables are commonplace ;-)

Nemesis [µISV]
Friday, November 12, 2004

Yes, DTS can convert to a file. You can also use FOR XML to write the results out as an XML file.

Two other options to think about:
1. Have some sort of message queuing system - the web page adds an item to a queue, and a third system on another machine handles the file creation.
2. Asynchronous processing - fire off a process. Make sure you don't overwhelm the system as you fire off thread after thread.
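Option 2 can be sketched in a few lines with the .NET thread pool. This is only an illustration: RunExport is a hypothetical method that would run the query and write the .xls file, and a real version must cap how many exports run at once, exactly as warned above:

```csharp
using System;
using System.Threading;

// Queue the export on a worker thread so the request returns immediately;
// the jobId lets a status page find the finished file later.
string jobId = Guid.NewGuid().ToString();
ThreadPool.QueueUserWorkItem(new WaitCallback(RunExport), jobId);
```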

Bryan Jonker
Friday, November 12, 2004

I have exactly the same problem. Instead of debating why it needs to be done or how it would make sense to the users, etc., can somebody answer number 3?

3) Is there any way to ALLOW retrieval of very large data sets without the client timing out?

Is it to do with server power? Or is it simply a matter of setting properties like SqlSelectCommand.CommandTimeout and the connection timeout?

At the moment I have set all such things and they don't work; my pages still time out, which makes me think it is the server that is causing this.
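One thing worth checking: there are three separate timeouts in play. The connection timeout only covers opening the connection, SqlCommand.CommandTimeout covers the query itself, and ASP.NET has its own request timeout (90 seconds by default in ASP.NET 1.x, enforced only when debug is off) that will kill the page regardless of the other two. A web.config sketch for raising that last one:

```xml
<!-- web.config fragment: raise the ASP.NET request timeout (in seconds).
     Note it is only enforced when compilation debug="false". -->
<system.web>
  <httpRuntime executionTimeout="600" />
</system.web>
```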

Zach
Thursday, December 02, 2004

Problem: how do we implement custom paging on the SQL Server side?

Problem description: suppose our query returns a very large number of records, say 2 lakh (200,000). How can we implement paging on the SQL Server side so that such a large result set is handled without affecting performance?
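Not a full answer, but one approach that works on SQL Server 2000 (which predates ROW_NUMBER) is key-based paging: remember the last key of the previous page and seek past it, instead of making the server re-scan from the start each time. Table and column names below are placeholders:

```sql
-- First page of 50:
SELECT TOP 50 Id, Name
FROM Records
ORDER BY Id;

-- Subsequent pages: @LastId is the largest Id from the previous page.
SELECT TOP 50 Id, Name
FROM Records
WHERE Id > @LastId
ORDER BY Id;
```

With an index on Id, each page reads only the 50 rows it returns, so a 200,000-row table costs no more per page than a small one.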

gaurav
Wednesday, December 08, 2004
