Fog Creek Software
Discussion Board




Poll: Database size/config, How do you compare?

Answer these questions (if you feel inclined)

Which DB do you use?
Sql 2K Enterprise

Which OS?
Win2k AS

How large is biggest db?
22GB Data
20GB TxnLog

Txns Per Day?
30-50K

What hardware?
Dual Xeon P2 + 3GB RAM + 4 Disk RAID 5 Array

Anon
Wednesday, August 11, 2004

Quote:

"How large is biggest db?
22GB Data
20GB TxnLog

Txns Per Day?
30-50K"


Wow, I'm not a dba, but at 30-50k of transactions per day:

What kind of recovery model do you have?

When was the last time you backed up the transaction log (assuming you are backing it up)?

anon-88
Wednesday, August 11, 2004


I used to interact with a 200+GB database on a big AIX box...  96 nodes and 96GB of RAM.


It was for a large government agency and we were able to do some great stuff with it.

I used to run SETI@Home during off hours...

KC
Wednesday, August 11, 2004

FYI, this info is for current clients. I've worked on larger in the past.

==>Which DB do you use?
Sql 2K Enterprise

==>Which OS?
Win2k Advanced Server

==>How large is biggest db?

240 GB Data
80  GB Log
20  GB TempDB

==>Txns Per Day?

400K-500K (ish)

Around mid-month and end-of-month (accounting close cycles) it gets upwards of 1M.

==>What hardware?

8-way Xeon (currently only 4 populated, but with room to grow <grin>), 4GB RAM + multiple disk arrays in various configurations w/ around 0.8TB (too lazy to add 'em all up).

==>Wow, I'm not a dba, but at 30-50k of transactions per day:

Nothin' at all. I do 10 times that on a "normal" day, and far more on a busy day.

==>What kind of recovery model do you have?
We do:
FULL BACKUP -- Daily (in the "wee hours" maintenance window)
DIFFERENTIAL -- 6 times a day -- every 4 hours
TRANSACTIONAL -- every 10 minutes.
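For reference, a schedule like that maps onto plain T-SQL backup statements (the database and file names here are made up; in a real shop each of these would be a scheduled SQL Server Agent job):

```sql
-- Daily full backup, run in the overnight maintenance window
BACKUP DATABASE Claims
    TO DISK = 'E:\Backup\Claims_full.bak'
    WITH INIT

-- Differential every 4 hours (changes since the last full backup)
BACKUP DATABASE Claims
    TO DISK = 'E:\Backup\Claims_diff.bak'
    WITH DIFFERENTIAL

-- Transaction log every 10 minutes (also truncates the inactive log)
BACKUP LOG Claims
    TO DISK = 'E:\Backup\Claims_log.trn'
```

Restore order matters: last full, then the most recent differential, then every log backup taken since that differential.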

==>When was the last time you backed up the transaction log (assuming you are backing it up)?

Oh ... depending on what time it is, a maximum of 10 minutes ago. At our transaction throughput, even that is waiting far too long. If you miss 10 minutes of transactions, you've lost a sh*tload of data.

FYI -- Insurance industry (claims processing) 5.5 Million casual users (once or twice a year (the insured members)) and 3500 (ish) "heads-down" (all day, seven days a week) users (providers in the network entering claims). Continuous (all damned day) batch-oriented data feeds (incoming and outgoing) from/to TPAs, Insured Groups, Providers, Partners, Vendors and Suppliers, upstream systems, downstream systems, yadda-yadda-yadda, and a partridge in a pear tree.

Don't ever let anyone tell you MS SQL Server ain't up to the task. They're dead wrong.

BTW, not only is my database bigger, but my Sausage is too <grin>

Sgt. Sausage
Wednesday, August 11, 2004

Which DB do you use?
sybase , fame

Which OS?
linux, solaris

How large is biggest db?
big

Txns Per Day?
millions

Tom Vu
Wednesday, August 11, 2004

Great thread, btw -- gives me some real world examples to gauge where we're at in our shop.  We don't have a DBA; we have a coder (me) wearing that hat.

Anyway:

MSSQL Enterprise
WIN2K Server Advanced
Largest DB:  ~12GB
Log size: 4GB
Trx / Day: 1K--2K, not counting batch stuff to get data from a legacy system
Full backup: weekly
Differential: nightly
Trx Log: every 2 hours

Obviously we are at the other end of the scale from the large examples above, but this is the company's first MSSQL based project.  Anyway, as stated above, it's good to see other examples to gauge our setup against.
Hardware: 4 way Xeon, 4GB RAM, sufficient disk space

OffMyMeds
Wednesday, August 11, 2004

Answer these questions (if you feel inclined)

Which DB do you use?
XBase (Clipper)

Which OS?
Novell Netware 4.11

How large is biggest db?
300MB

Txns Per Day?
Don't know. 50-100? (except on monthly statements day)

What hardware?
P120 + 32MB RAM + 3 Disk RAID 5 Array

Well, you asked :)

Chris Altmann
Thursday, August 12, 2004

The Sgt is right, SQL Server can handle some fairly heavy-duty loads.

I've done near-real-time carrier telecoms billing on it and it was fine, even with millions of calls (transactions) per day.

It was near-real-time to allow fraud and credit limit monitoring, rather than the more traditional approach of batching everything up and then doing a massive "bill run" at the end of each month.

Nemesis
Thursday, August 12, 2004

>>Which DB do you use?<<
Sybase ASE

>>Which OS?<<
Linux, Solaris

>>How large is biggest db?<<
2.2TB Data (this is DW so I'm cheating)

Largest OLTP
60GB Data
10GB TxnLog

>>Txns Per Day?<<
On a good day: ~80-100K
On a bad day: ~500K ;)

>>What hardware?<<
Linux: Dual PIII Xeon, 3GB RAM
Solaris: SunFire V480, 4x1.2GHz CPU, 8GB RAM, "big-ass" RAID attached storage

Sgt. Sausage's may be bigger, but I outrank him! :)

Captain McFly
Thursday, August 12, 2004

==>Sgt. Sausage's may be bigger, but I outrank him! :)

<grin>

Hoo-Ah

Sgt. Sausage
Thursday, August 12, 2004

LOL,

I wasn't commenting on MSSQL's ability to scale.

I was trying to make a subtle comment about the size of the OP's transaction log relative to his data file.

We also run SQL 2K, but we constantly back up the transaction log throughout the day.

I've noticed that in general everyone who posted details has their log file at ~1/4 the size of the data file, which just happens to be the same ratio here at work.
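The OP's 20GB log against 22GB of data is the giveaway: under the FULL recovery model, the log only gets truncated when it's backed up, so a log that never gets backed up just keeps growing. A quick way to check (database name is just a placeholder):

```sql
-- Report each database's recovery model (SQL 2000 syntax)
SELECT name,
       DATABASEPROPERTYEX(name, 'Recovery') AS recovery_model
FROM master.dbo.sysdatabases

-- Backing up the log is what lets SQL Server reuse the log space:
BACKUP LOG MyBigDb TO DISK = 'E:\Backup\MyBigDb_log.trn'
```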

Personally I love using SQL 2000. 6.5 was horrible; 7 was a major improvement.

I'm looking forward to Yukon, with a more robust T-SQL (better exception handling), the embedded .NET CLR, and native XML support.

anon-88
Thursday, August 12, 2004
