Fog Creek Software
Discussion Board




Submitting a large number of files with CVS BUGID

As part of a bug fix I did a source code merge between branches in our CVS repository (this involved modifying approx. 300 files). As a matter of course I added the BUGID to the CVS commit message. After saving my message I proceeded to finish the commit, and the effects were quite unexpected (to me, anyway). Our FogBugz server (a G5 Mac) ground to a halt and thrashed the hard disk for half an hour. The commit succeeded, but not all of the modified/added files were reported in FogBugz.

The Perl script used to update from CVS is the one provided with FogBugz.
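(For context, the script gets run through CVS's loginfo hook, roughly like the line below; the path and format string here are only illustrative and vary by install.)

    # CVSROOT/loginfo -- run the FogBugz notification script on every commit
    DEFAULT /usr/local/fogbugz/logBugData.pl %{sVv}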

Has anyone else experienced this problem? We make merges of this size quite often and it would be nice to keep track of changed files, no matter how long the file list is. However, I don't want to repeat this experience every week.

Lloyd Newby
Wednesday, March 10, 2004

If you commit two files, it works correctly?

Michael H. Pryor
Fog Creek Software
Wednesday, March 10, 2004

Yep, a commit with just two files works fine.

Lloyd Newby
Thursday, March 11, 2004

I suspect it's because it hits the FogBugz server once for each file you check in.
So let's say it takes your web server 3 seconds to serve a page from start to finish: 300 files * 3 seconds = 900 seconds. That's 15 minutes.

The only other way is to modify logBugData.pl and cvsSubmit.asp to accept all the file info at once.
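In sketch form, the client side of that change might look something like this. The ixBug/sFiles parameters and the "name|old|new" wire format are invented for illustration; cvsSubmit.asp would need matching changes to accept a whole list, as noted above.

    # Sketch only: batch all file info from one commit into a single request.
    from urllib.parse import urlencode
    from urllib.request import urlopen

    def submit_batch(server, bug_id, files):
        # files: list of (filename, old_rev, new_rev) tuples for the commit
        payload = "\n".join("%s|%s|%s" % (f, old, new) for f, old, new in files)
        data = urlencode({"ixBug": bug_id, "sFiles": payload}).encode("ascii")
        # One round trip for the whole commit rather than one per file.
        urlopen("http://%s/cvsSubmit.asp" % server, data).read()

    # e.g. a two-file commit against (hypothetical) bug 1234:
    submit_batch("fogbugz.example.com", 1234,
                 [("src/foo.cpp", "1.4", "1.5"), ("src/bar.cpp", "1.7", "1.8")])

That way a 300-file merge costs one page load instead of 300.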

Michael H. Pryor (fogcreek)
Fog Creek Software
Thursday, March 11, 2004

It shouldn't be taking anywhere near that long on a G5. Is the machine otherwise heavily loaded?

I am not using logBugData.pl - I wrote a replacement in Python that runs on our Perforce server. I haven't noticed submits failing or taking an onerous amount of time to show up. (Our server hardware isn't as fast as yours; I *think* we are running FogBugz on a dual 867 MHz G4.)
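The general shape of that kind of trigger is roughly the following (illustrative only: the server name and the ixBug/sFiles parameters are placeholders, not the actual script or the real FogBugz interface). It notifies FogBugz once per changelist rather than once per file.

    import re
    import subprocess
    from urllib.parse import urlencode
    from urllib.request import urlopen

    def on_submit(change):
        # "p4 describe -s" prints the changelist description and file list
        desc = subprocess.run(["p4", "describe", "-s", change],
                              capture_output=True, text=True).stdout
        bug = re.search(r"BUGID\s*[:=]?\s*(\d+)", desc, re.IGNORECASE)
        if not bug:
            return  # no bug number in the changelist description
        # affected-file lines look like "... //depot/src/foo.cpp#12 edit"
        files = re.findall(r"^\.\.\. (//\S+)#\d+", desc, re.MULTILINE)
        data = urlencode({"ixBug": bug.group(1),
                          "sFiles": "\n".join(files)}).encode("ascii")
        urlopen("http://fogbugz.example.com/cvsSubmit.asp", data).read()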

Jim Correia
Tuesday, March 16, 2004
