Fog Creek Software
Discussion Board


FTP Problem

I've had a couple of problems with publishing a site recently - relating to the FTP portion. Before you say anything, though - it's not the problem that was resolved by SP2. The truth is it's not exactly a CD bug, but one of those awkward situations where the FTP server isn't great, the internet drops the ball every now and then, and CD is the only thing left to pick up the pieces, which it can't quite do.


1) The site I'm publishing to doesn't support PASV mode.
2) Therefore the server has to open the data connection back to me (active mode) to receive files from me.
3) This seems to timeout every now and then.
4) If it times out, then citydesk.xml doesn't get written and CD has no idea how far it got before.
5) Next time I FTP, it has to do everything again, and again, and again until I succeed. After that it's OK, because the chances of failure on an average update upload are far less than when uploading 450 files for the first time.
6) This is a pain. One of the great things about CD is of course that you can put your trusty FTP clients away.

Supporting info:

1) I have an ADSL router employing NAT. I don't think this gets in the way because it works intermittently for CD and fine for other FTP clients and...
2) I have ZoneAlarm (not enabled when publishing - just in case), which when switched on (after a failed publish), detects incoming packets from the FTP server for me on expected port 21.
3) My reasoning is therefore that there is simply a delay either because of server or network (irrelevant) which screws everything up.


Current workaround: preview locally, use an FTP client to upload.

Possible Solutions:

Not wishing to teach granny to suck eggs or anything - but I will anyway :-)...

1) Employ a timeout, detect and retry say 3 times before aborting. Using other software, I got an average of 6 failures per 450 files. In each case though, the second attempt for these files worked.
2) Write an updated XML file every 20 (perhaps configurable) files or so - at least when publishing 450 files, you're only ever an average of 10 files behind. I know it slows successful uploads down a bit, but I'd rather have it bulletproof than fast.
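Suggestion 1 is just a retry loop around each file. A minimal sketch - the `upload` callable here is a stand-in for whatever CD actually does per file, not its real API:

```python
import time

def upload_with_retry(upload, name, retries=3, delay=2.0):
    """Call upload(name), retrying up to `retries` times before giving up.

    `upload` is any callable that raises OSError on failure - e.g. a
    thin wrapper around ftplib's storbinary. Hypothetical interface.
    """
    for attempt in range(1, retries + 1):
        try:
            upload(name)
            return attempt       # report how many tries it took
        except OSError:
            if attempt == retries:
                raise            # out of retries: let the publish abort
            time.sleep(delay)    # brief pause before trying again
```

With the failure rate above (about 6 in 450, all succeeding on the second attempt), three tries would have let every publish complete.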

I can give you the server details if you want to try it... just drop me a mail.

Hope this stuff helps



Sam Strachan
Wednesday, November 27, 2002

I've reported a similar problem before (my uploads kept being interrupted because I was on a flaky dial-up connection). The result was the same - it took about 10 tries (which took several hours) before I got the whole site to upload without the connection dropping out.

At the time, I suggested a 'checkpoint' for uploads, which is the same as your idea of uploading the XML file after every 20 files or so. I think it was Joel that replied, and he seemed to like the idea, so hopefully it will make it into the next version of CityDesk.

Darren Collins
Wednesday, November 27, 2002

Ah, that would be cool. In the end, having manually FTPed a preview of it, I published to an entirely different server to get my hands on the citydesk.xml. Then I copied that across too.

It all works a treat now. So I'll just remember to do small incremental updates for now.

Sam Strachan
Thursday, November 28, 2002

Or write the XML file as a journal/log (just append one entry after each file, and finish the XML file after the last file). CityDesk will then have up-to-date information on all articles rather than missing somewhere between 10 and 450 of them....

In that case, it could prompt for a retry, ignore, cancel, or retry [x] times when publishing fails.

Adriaan van den Brand
Friday, November 29, 2002
