Fog Creek Software
Discussion Board




Best current Backup Solution for SQL Server?

Hey all. 

I have a SQL Server app that, in total, is approximately 30 TB.  The DB itself is a small control DB, while most of the file size is the actual linked files on a separate file system, which also need to be backed up.  Can someone recommend a backup strategy that would cover both the SQL Server 2000 DB and the fairly large file system?

Thanks,

Steve in Norcal

S. Walker
Tuesday, April 27, 2004

You have a 30 TB system and are just now thinking about backup?  *shudder*

Infinite Monkeys
Tuesday, April 27, 2004

Tivoli Storage Manager with Tivoli Data Protector for SQL Server.

I don't know of anything else that will do the job.

Mark Smith
Tuesday, April 27, 2004

"*shudder*"

I'll second that.  Get ready to spend tens of thousands.

Mike
Tuesday, April 27, 2004

10s of thousands? I would suspect 6 figures should be expected.

I find it really hard to believe that someone has TB of data and hasn't figured out a backup strategy.

Dennis Forbes
Tuesday, April 27, 2004

No way, this must be a wind-up, or maybe just a typo? Did you really mean 30TB, as in thirty terabytes?

If you haven't got that backed up yet, how do you know it's still intact? You may have lost a few gig and not even know it.

It's going to be very expensive to back that lot up in any sensible timescale, especially if you want to keep the tapes, i.e. not just rotate them on a grandfather-father-son basis.

Could you recreate the files if you found your backup system had failed (they do fail sometimes)?

Steve Jones (UK)
Wednesday, April 28, 2004

Assuming you can afford a few seconds of scheduled downtime for the backup:
- Stop updates
- Use the built-in SQL Server backup to back up the DB to the filer
- Use the filer facility to prepare a backup filesystem (snapshot)
- Re-enable updates
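
The DB step above is a one-liner in SQL Server 2000's own T-SQL. A minimal sketch, assuming a control database named ControlDB and a writable share on the filer (both names are placeholders, not from the thread):

```sql
-- Full backup of the small control DB to a file on the filer share.
-- WITH INIT overwrites any previous backup set in the target file.
BACKUP DATABASE ControlDB
TO DISK = N'\\filer\backups\ControlDB_full.bak'
WITH INIT, NAME = N'ControlDB full backup'
```

The snapshot step has no generic equivalent; it depends entirely on the filer vendor's tooling.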

Is this for real? Don't get me wrong, it wouldn't surprise me to find 1TB warez stashes in some kid's dorm room, but 30TB seems a bit rich to need to come to JoS for backup advice.

Just me (Sir to you)
Wednesday, April 28, 2004

The 't' key is just above the 'g' key in qwerty layout so maybe it was just a typo :)

James 'Smiler' Farrer
Wednesday, April 28, 2004

Wow. 

It is, in fact, a 30 terabyte system.  It is, in fact, in production for one of the largest states in our fine union.  It's an archive system for documents.  It's been in production for 3 years.  It's grown immensely, and "they", the bureaucracy, have refused to pay for a backup solution.  Now, they've decided it's time.  I'm a contractor brought in to solve this and other problems.

You guys are funny.  Are you all so naive as to think that there are no dolts out there with much larger systems that are never backed up?  "That could never happen."  "No one would ever be that stupid."  Of course it could (and is/has).  Of course they would be that stupid.

I find it hard to believe that you all find it so hard to believe.  "I can't believe someone would invest billions of dollars in something that's not going to be successful long term" (do the words dot.com ring a bell?)

But, please, I still need some input other than "damn, that was stupid!".

Thanks,

Regards,

Captain Terabyte.

S. Walker
Wednesday, April 28, 2004

Interesting replies to this thread.

Hello… McFly… What perfect world do you guys live in? What smoke and mirrors does your Executive Team have you looking through? Are you the same group of people that said two-digit years would be enough? The same people that coined the phrase "512 ought to be enough memory for anyone"? Do you think that every production environment is protected under the holy grail of the ideal situation with five nines and redundancy? (Go ahead, dig out your yellow book to look up five nines… I'll wait here.) Mind you, not a smart path some people / groups / organizations get themselves on, however it happens. And if you don't think it does, wake up and smell the bits.  I for two can attest, from several prior lives in the technology business across a variety of sectors, that it happens all the time. I'll refrain from the specific scenarios and occasions in this thread.

Oh, Yes, Yes, Yes, of course it was a typo. Indeed some astute colleague figured out that on his specially designed "QWERTY" keyboard (and I welcome comments from non-QWERTY keyboard users), the G is close to the T. Let me guess, you are all users of the Dvorak keyboard layout, right? Charming.

30 TB (and no, that wasn't a typo, that was a T) is a decent size of data. It's not hard to accumulate 30TB (and no, that wasn't a typo, that was a T) worth of data. It's not like he's asking the simple minds to calculate a googol of data. Particularly in an Enterprise environment, this is a reasonable amount of data. Some SOHO / home users are approaching 1 TB (and no, that wasn't a typo, that was a T) of data. Welcome to 1999, folks; 700GB drives are readily available on your local computer store's shelves.  Go buy a stack of them and ponder what's next....

So, back to the point. A backup solution for an Enterprise SQL database, roughly 30TB (and no, that wasn't a typo, that was a T) in size. EMC's Enterprise Data Manager does a great job of it. Tivoli likewise has a product that will adapt to most environments. I'd start with those. (Until of course you get a reply from someone who's got an HP 4GIG SCSI tape drive on eBay and knows a buddy with a surplus of tapes that he lifted from his last dot-com that ran out of funding, and can 'hook you up'.)

And before you <flame> me: relax, my sense of humor appeals well to most of the 132s out there. So remember two words: Simma Down.

Call Me Daddy
Wednesday, April 28, 2004

S. Walker,

Best practice for backing up the SQL Server DB is to use the built-in backup facilities of SQL Server to write the backup to the file system, and then to back up that backup with the rest of the files.
During the backup of the DB you will want to disable updates to the system, so as to be able to take a consistent snapshot of the file collection and its index. As you say this is a small DB, the backup should take just a few seconds. Once it's made you can snapshot the filer and re-enable updates. Then you can back up the snapshot asynchronously.
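
The sequencing described above can be sketched as a dry run. Every command string here is a placeholder rather than a real invocation (the snapshot call in particular is vendor-specific):

```python
# Dry-run sketch of the backup sequencing only; the command strings are
# placeholders, not real invocations (the snapshot step is vendor-specific).
def backup_steps():
    """Return the ordered steps for a consistent DB + file-store backup."""
    return [
        "disable application updates",      # app-level write freeze begins
        "BACKUP DATABASE (control DB)",     # fast: the control DB is small
        "snapshot the filer volume",        # near-instant on most filers
        "re-enable application updates",    # write freeze ends here
        "copy the snapshot to tape",        # asynchronous; can take hours
    ]

for step in backup_steps():
    print(step)
```

The point of the ordering is that the write freeze covers only the two fast operations; the slow copy to tape happens after updates have resumed.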

Just me (Sir to you)
Thursday, April 29, 2004

Call Me Daddy,

"Particularly in an Enterprise environment, this is a reasonable amount of data."

Did anyone say that it was an unfathomable amount of data? I can't see anyone saying that. Having said that, it isn't an everyday amount of data, apart from horribly designed databases with large amounts of oversized rows and elements. I say this having been involved with large-scale core systems for a large telecommunications company and a financial company, and in both cases we didn't hit even a single TB. Follow-up posts by the OP indicate that they're basically using the database as a filesystem, which makes the size a bit more credible.

Having said all of that, I'm unsure why a couple of the recommendations included disabling updates or taking the system offline: SQL Server, since version 7, has fully supported online backups.

What people are surprised by is the idea that an organization could get to the point where they've amassed 30TB (which isn't a collection of your mythical 1999 '700GB' hard drives; 30TB in an enterprise setting is a large amount of storage, often requiring a massive[ly expensive] SAN, and works out to almost a thousand of the standard enterprise 36GB hard drives), but never did they get a proper database admin to actually make a plan for backing it up. It's like a company with $1 billion in sales asking "does anyone know how to do accounting?" The two facts are hard to correlate.
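
The drive-count aside checks out with quick arithmetic (using the binary convention of 1 TB = 1024 GB):

```python
# 30 TB expressed in GB, divided across standard 36 GB enterprise drives.
total_gb = 30 * 1024       # 30 TB in GB
drives = total_gb / 36     # drives needed, ignoring RAID/hot-spare overhead
print(round(drives))       # 853 -- "almost a thousand" drives
```

With RAID parity and hot spares on top, a real array would need even more spindles.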

Dennis Forbes
Friday, April 30, 2004

Which two facts are you attempting to correlate?

You mention no facts in your 'reply' to correlate.

CMD
Sunday, May 02, 2004

Dennis,

the "SQL online backup" isn't the issue. It is the synchronisation between the files (which are, as stated, outside of the DB, not stored as BLOBs inside) and the DB. Since a transaction needs to cover both, you have to disable updates at the app level (not the DB) to get a backup in a guaranteed consistent state.

Just me (Sir to you)
Monday, May 03, 2004

Sorry, cmd, apparently in your 'reply' you couldn't 'read'. Let me recap the pertinent points that you missed.

"an organization could get to the point where they've amassed 30TB...but never did they get a proper database admin to actually make a plan for backing it up."

You see how there are two distinct points there? Those two points, in most organizations, don't exist in unison, though apparently they did in this case.

Dennis Forbes
Monday, May 03, 2004

Sir,

Indeed, I misread the initial request. My hasty interpretation was that it was filegroups that were stored in a separate filesystem.

Dennis Forbes
Monday, May 03, 2004

You should be in sales.

CMD
Monday, May 03, 2004
