Fog Creek Software
Discussion Board

SQL Server time outs

Have you guys ever run long-running commands on a distant SQL Server and had your connection disrupted, whether by overly restrictive firewall rules (say, a 30-minute timeout on any long-lived connection), TCP hiccups, or just plain ADO bugs?

Tuesday, May 6, 2003

30 minute queries? Wow.

Somewhere...out there...there is a SQL Server screaming for mercy....

Mark Hoffman
Tuesday, May 6, 2003

1) There's something wrong with your query. :)

2) assuming there *isn't* something wrong with your query, write it as a stored procedure that builds a temporary table. Then you can invoke it and it won't matter if you're disconnected.
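
A rough sketch of that pattern (table and procedure names are made up). One caveat: a session-scoped #temp table vanishes when the connection drops, so for disconnect resilience the proc writes to a permanent results table instead:

```sql
-- Hypothetical example: the proc does the heavy lifting server-side
-- and writes its output to a permanent table, so the client doesn't
-- need to stay connected while it runs.
CREATE TABLE ReportResults (
    RunDate    DATETIME,
    CustomerID INT,
    Total      MONEY
)
GO

CREATE PROCEDURE BuildReport AS
BEGIN
    DELETE FROM ReportResults

    INSERT INTO ReportResults (RunDate, CustomerID, Total)
    SELECT GETDATE(), CustomerID, SUM(Amount)
    FROM Orders
    GROUP BY CustomerID
END
GO

-- Later, from any fresh connection:
-- SELECT * FROM ReportResults
```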


Tuesday, May 6, 2003

Wow, that's a hell of a query.  I HAVE had weird problems where it seemed like a query tried to execute before the connection to the database was complete.  It didn't happen on Win2K machines, but on NT4 boxes the code broke.  Turns out I had to manage the connection state while it was connecting and/or executing.  Seems the real world is different from the nifty demos and whatnot that you see.  Feh.

Greg Kellerman
Tuesday, May 6, 2003

Are you sure you aren't asking for the cross product of all rows against all rows?

Simon Lucy
Tuesday, May 6, 2003

I guess it depends on what 'operation' you are running on the remote SQL Server.

If you are running a query then I would check the SQL and make sure that it is not producing a Cartesian Product.  As Philo mentioned, you may also try writing the query as a stored procedure and if it is a large query either limit the result set or use temporary intermediate tables.
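
For what it's worth, a Cartesian product usually comes from a missing join condition. A sketch with made-up table names:

```sql
-- Missing join condition: every Orders row pairs with every
-- Customers row (N x M rows -- the Cartesian product)
SELECT o.OrderID, c.Name
FROM Orders o, Customers c

-- With the join condition, each order matches only its own customer
SELECT o.OrderID, c.Name
FROM Orders o
INNER JOIN Customers c ON c.CustomerID = o.CustomerID
```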

As far as other operations that you are executing on the server through ADO, I can't say.  Maybe you could share the exact 'operation' you are running, or try executing it through a Query Analyzer session.

Dave B.
Tuesday, May 6, 2003

To answer the question, yes, it happens to me from time to time. Those of you that can't imagine a 30-minute query have never done batch processing :).

What you'll generally have to do if you lose the connection is reconnect, kill the spid, and try again. I had a batch job that ran every night to process the 500-million+ rows we got daily. It could take anywhere from 15 minutes to a couple of hours, depending on other factors.
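
The reconnect-and-kill step looks roughly like this (the spid value here is made up; you'd read the real one off the sp_who2 output):

```sql
-- Find the orphaned session: look for your login and hostname
-- in the output, and note its SPID column
EXEC sp_who2

-- Kill that session by its spid (say it was 57), then rerun the batch
KILL 57
```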

What I eventually did was this: I wrote blocks into the code to update a status table showing job progress, and then wrote a wrapper proc for the master batch routine. This way, I can remotely kick off a job that calls the master routine, and check its status by looking at the status table... I don't have to stay connected for the thing to work. It also makes the job run faster because it's not having to return status info to a remote client.
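
The status-table idea can be sketched like this (table, proc, and job names are all hypothetical, not Troy's actual code):

```sql
-- Status table the batch routine updates as it progresses
CREATE TABLE JobStatus (
    JobName   VARCHAR(50),
    Step      VARCHAR(100),
    UpdatedAt DATETIME
)
GO

CREATE PROCEDURE MasterBatch AS
BEGIN
    UPDATE JobStatus SET Step = 'loading', UpdatedAt = GETDATE()
        WHERE JobName = 'nightly'

    -- ... the actual heavy lifting goes here ...

    UPDATE JobStatus SET Step = 'done', UpdatedAt = GETDATE()
        WHERE JobName = 'nightly'
END
GO

-- From any connection, check progress without staying attached:
-- SELECT Step, UpdatedAt FROM JobStatus WHERE JobName = 'nightly'
```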

Troy King
Tuesday, May 6, 2003

I have worked with stored procedures that took days to run.  Not every application involves a web interface pulling data from a database.

Our solution was to divide the work up into multiple parts.  If the procedure failed, at most you would have to start at the beginning of one of the parts.  Another solution is to use Terminal Services and run the query on the server itself.  You can then reconnect when needed.

I would also look into having someone fix the poorly configured firewall.  While closing extremely long connections is a good idea, it's also a good idea to configure the firewall to reset the timer for the connection every time it sees an ACK, if possible.  Resetting the timer to 5 minutes every time an ACK is seen is much better than a flat 30-minute wait.

Tuesday, May 6, 2003

>> "Not every application involves a web interface pulling data from a database."

Where do you glean your extraordinary knowledge from?  Someday I'll be as wise and maybe even as smart as you.  You are a true genius.

Tuesday, May 6, 2003

Try Data Transformation Services (DTS) and run it on the server or a server that's local to the SQL Server's network.

Tuesday, May 6, 2003

Couldn't you just set up a SQL Agent job to organise this?
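
Something along these lines, using the msdb job procs (job and proc names are made up); the job runs server-side, so the client connection can drop freely:

```sql
EXEC msdb.dbo.sp_add_job @job_name = 'NightlyBatch'

EXEC msdb.dbo.sp_add_jobstep
    @job_name  = 'NightlyBatch',
    @step_name = 'Run master proc',
    @subsystem = 'TSQL',
    @command   = 'EXEC MyDatabase.dbo.MasterBatch'

EXEC msdb.dbo.sp_add_jobserver @job_name = 'NightlyBatch'

-- Kick it off on demand from any connection:
EXEC msdb.dbo.sp_start_job @job_name = 'NightlyBatch'
```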

Wednesday, May 7, 2003
