Fog Creek Software
Discussion Board

Maintaining Dev/Prod Database Sync

Hello all.

I'm migrating to a new setup shortly, with a production and a development database server.  The databases will be Oracle, and the systems will be Win2k server.

My question is: how does everyone here manage to keep their development and production databases in sync after changes?  The situation I'm thinking of is that the production database (Prod) has been running for a couple of weeks since the last update.  Everything is going smoothly.  The development database (Dev) has a few schema changes, small table changes, and procedure updates.

How do I get Dev's changes to Prod without messing with the data?  I can think of a couple ways:

* Write a set of update scripts by hand, that would create or replace the new procedures, and alter the needed tables.  This seems like it would get unwieldy quickly...
* Dump both databases to flat files (just the schemas) and create the alter table statements, etc. off of those.
* Some proprietary solution?
* Store both schemas in some other format, and edit that, then re-create both as needed.
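The first option can be made concrete. Below is a minimal sketch of a hand-written update script wrapped in a replay function, using Python's built-in sqlite3 as a stand-in for Oracle (the table and column names are invented for illustration):

```python
import sqlite3

# Hand-written update script for one release: each statement mirrors a
# change already made and tested on the Dev database.
UPDATE_SCRIPT = [
    # hypothetical schema changes for this release
    "ALTER TABLE customers ADD COLUMN loyalty_tier TEXT",
    "CREATE TABLE IF NOT EXISTS audit_log (id INTEGER PRIMARY KEY, note TEXT)",
]

def apply_update(conn):
    """Replay the release script against a database, statement by statement."""
    for stmt in UPDATE_SCRIPT:
        conn.execute(stmt)
    conn.commit()

# Demo against an in-memory database standing in for Prod,
# with pre-existing data that must survive the update.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO customers (name) VALUES ('alice')")
apply_update(conn)

# Existing rows survive; the new column now exists alongside them.
columns = [row[1] for row in conn.execute("PRAGMA table_info(customers)")]
```

The unwieldiness the poster worries about shows up as this list of statements growing release after release, which is why most of the approaches below add some bookkeeping around it.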

Any ideas/suggestions?

Andrew Hurst
Monday, September 29, 2003

In the past we have tried the following:
* Scripts by hand
* Commercial software

In the end, we rolled our own.  Basically, we created a change management tool specifically for database scripts.  It took about a week to write and works flawlessly.

Monday, September 29, 2003

Try Red Gate.  If you need data synching, you will need to install .NET.

Only $195 and works like a charm (there is even a time-limited trial version).

(no, I do not work for them. Only a happy customer).

Tarun Upadhyay
Monday, September 29, 2003

Note that the Red Gate products support MS SQL Server only.

This is one of the commercial products we tried.  We loved it at first, but it became the bane of our existence after a while.

Monday, September 29, 2003

I use manually created scripts for maintaining both the development and production databases.  I have a process in place to minimize any issues.

My development database contains real data from an older production database (occasional snapshots are taken to update development with production data).  Whenever a schema change takes place, I record all the SQL ALTER statements into a script that can be replayed in production.  I never do any manual conversion of the development database -- I always write a script to do it, so I can replay that in production as well.
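Recording every change as a replayable script works best with a little bookkeeping so a script can never run twice. A hedged sketch of that mechanism, with sqlite3 standing in for Oracle and made-up script names:

```python
import sqlite3

# Ordered, hand-written migration scripts, keyed by name.
MIGRATIONS = [
    ("001_add_status", "ALTER TABLE orders ADD COLUMN status TEXT"),
    ("002_add_index", "CREATE INDEX idx_orders_status ON orders(status)"),
]

def migrate(conn):
    """Apply only the scripts not yet recorded in applied_scripts."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS applied_scripts (name TEXT PRIMARY KEY)")
    done = {row[0] for row in conn.execute("SELECT name FROM applied_scripts")}
    applied = []
    for name, sql in MIGRATIONS:
        if name in done:
            continue  # already replayed on this database
        conn.execute(sql)
        conn.execute("INSERT INTO applied_scripts (name) VALUES (?)", (name,))
        applied.append(name)
    conn.commit()
    return applied

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY)")
first_run = migrate(conn)   # both scripts run
second_run = migrate(conn)  # nothing left to do
```

The same script can then be run against development, QA, and production databases in any state, and each only picks up what it is missing.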

This process is so simple, I can't imagine a need for an automatic tool.

Almost Anonymous
Monday, September 29, 2003

Almost Anonymous, you sound like a one-developer, one-project shop, and nothing is wrong with that, but try to imagine another case: 20 developers, projects 10 years in the making, 10 project versions, up to 1,000 tables, tens of thousands of columns, 5 testers, hundreds of production servers. Try to synchronize all versions with all versions and update all databases, and see how many sets of scripts you have to maintain and how many DBAs you have to hire.

This is the project I am working on and our in-house synchronization tool literally saves us millions every year.

19th floor
Monday, September 29, 2003

19th Floor makes a good point.

Our project only has 5 developers, ~100 tables, ~1,000 columns and ~500 stored procedures.  We have 10 production servers.

Until we automated the process, it was nearly a full-time job for one of our developers.  1/5 of our resources was way too much to devote to this task.  After one or two weeks of development we had a body back to do 'real work'.  Now it takes about one day per month to manage the process.

It's well worth the investment.

Monday, September 29, 2003

Sybase PowerDesigner has a visual model merge that shows any two (physical) data models -- obtained by connecting to and reverse-engineering your databases, say via ODBC -- side by side in two trees, with icons giving you clues as to what's different and how to reconcile it.

Try the 45-day eval from their website.

Monday, September 29, 2003

Given that your databases are Oracle, go to the Quest web site and download a copy of Toad.

The "Compare Schema" option on the DBA menu gives you the ability to compare and highlight the differences between two databases, and will automatically generate a correlation script if you wish.

It also has export and import capabilities which will help even if you decide to process the upgrades manually.

Monday, September 29, 2003

Canuck which tool do you use?

Tuesday, September 30, 2003

As I mentioned, we rolled our own.  We found a couple problems with the tools that just compared schemas:

[1]  Just because there is a difference between the dev and production databases doesn't necessarily tell you which change to promote.

[2]  No tool we tried successfully managed dependencies among stored procs (sp_A uses sp_B) or tables (for relationships).

[3]  For a while we tried having one person manage the promotion process.  It was futile having one person trying to figure out what changes to keep.
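The dependency problem in [2] amounts to ordering objects so that anything they reference is created first -- a topological sort. A minimal sketch of that ordering step (the object names are invented):

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Map each stored proc / table to the objects it depends on:
# sp_A calls sp_B, so sp_B must be created before sp_A, and so on.
deps = {
    "sp_A": {"sp_B", "orders"},
    "sp_B": {"orders"},
    "orders": {"customers"},   # foreign-key relationship
    "customers": set(),
}

# static_order() yields a creation order respecting every dependency.
creation_order = list(TopologicalSorter(deps).static_order())
```

The hard part in practice is building the `deps` map, which usually means parsing the procedure bodies or querying the database's own dependency views.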

So, we adopted some policies for the developers and then wrote a tool to enforce them.  Here were the rules:

1.  Each developer is completely responsible for their own modules (code + db).
2.  They have free rein on the dev database.  Do what you want.
3.  You are responsible for determining which db changes need to be promoted to production.
4.  If you break the production server, you don't go home until it's fixed.  No exceptions.

So, our solution was to create a tool that allowed a developer to check in a 'script component': basically, you copied in the SQL statement(s) required to update the production server.  Components could be modified or locked, depending on whether they had been released to production.  Finally, the tool would create a 'release' -- a script composed of all the script components.
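The 'script component' idea described above might be sketched like this -- a hedged outline, not the poster's actual tool, with invented class and field names:

```python
class ScriptComponent:
    """One developer-owned SQL change, locked once released to production."""

    def __init__(self, author, sql):
        self.author = author
        self.sql = sql
        self.released = False  # becomes True once part of a release

    def update(self, sql):
        if self.released:
            raise ValueError("component is locked: already released")
        self.sql = sql

def build_release(components):
    """Concatenate all unreleased components into one release script."""
    pending = [c for c in components if not c.released]
    script = ";\n".join(c.sql for c in pending)
    for c in pending:
        c.released = True  # lock them against further edits
    return script

comps = [
    ScriptComponent("alice", "ALTER TABLE orders ADD status VARCHAR2(20)"),
    ScriptComponent("bob", "CREATE OR REPLACE PROCEDURE sp_a AS BEGIN NULL; END"),
]
release = build_release(comps)
```

Locking released components is what keeps the promotion history trustworthy: a script that already ran on production can never silently change after the fact.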

We found that shifting the responsibility to each individual, a team effort if you will, was the most beneficial process change we made.

Tuesday, September 30, 2003

I had a different set of goals compared with canuck:

- developers/customers have to know nothing about the conversion process
- conversion must be re-entrant. If any part of the script fails for some reason it will log the error and continue, allowing the DBA to fix the potential conflict afterwards. A second run will not render the DB inconsistent.
- conversion must not drop RIs if not required to (this saves a lot of time on huge tables)
- conversion must know to iterate tables by following RI dependencies
- synchronization between ORACLE and MSSQL DBs (and possibly other backends)
- conversion engine must optimize the DDL scripts
- populate tables with default data
- must bring any application version released in the past to the last version without extra handling.
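The re-entrancy goal above -- log failures, keep going, and let a second run do no further damage -- might look like this in outline (sqlite3 standing in for Oracle; the statements are invented):

```python
import sqlite3

def run_reentrant(conn, statements):
    """Apply each DDL statement independently; log failures and continue.

    Guarded forms (IF NOT EXISTS, etc.) are used where the backend
    allows, so a second run skips work already done instead of failing.
    """
    errors = []
    for stmt in statements:
        try:
            conn.execute(stmt)
        except sqlite3.Error as exc:
            errors.append((stmt, str(exc)))  # for the DBA to review afterwards
    conn.commit()
    return errors

script = [
    "CREATE TABLE IF NOT EXISTS items (id INTEGER PRIMARY KEY)",
    "ALTER TABLE no_such_table ADD COLUMN x TEXT",  # fails, gets logged
    "CREATE INDEX IF NOT EXISTS idx_items ON items(id)",
]

conn = sqlite3.connect(":memory:")
first = run_reentrant(conn, script)
second = run_reentrant(conn, script)  # re-run: same logged error, no new damage
```

Each statement succeeds or is logged on its own, so one bad statement never leaves the rest of the conversion half-applied.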

This is the procedure we follow:
- test changes on our local databases
- when all problems are solved move the changes to a master DB. This will correspond with a code check-in
- the nightly build reads the master DB and generates an XML schema (which is in fact the DB state at that point in time)
- either the installation (on QA, production servers) or the developer (on local DBs) triggers a conversion
- the conversion runs a diff using the XML schema and the target DB schema and uses the internal rules (RI dependencies, optimizations, etc) to generate the optimized DDL
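The diff step in that last item can be sketched as comparing a serialized schema snapshot against the live database and emitting only the missing DDL. A minimal illustration with sqlite3; a plain dict stands in for the poster's XML schema, and the table names are invented:

```python
import sqlite3

# Desired state, as produced by the nightly build.
desired = {
    "customers": ["id", "name", "email"],
}

def diff_schema(conn, desired):
    """Emit CREATE/ALTER statements that bring the target DB up to date."""
    ddl = []
    for table, cols in desired.items():
        existing = [r[1] for r in conn.execute(f"PRAGMA table_info({table})")]
        if not existing:
            ddl.append(f"CREATE TABLE {table} ({', '.join(cols)})")
            continue
        for col in cols:
            if col not in existing:
                ddl.append(f"ALTER TABLE {table} ADD COLUMN {col}")
    return ddl

# Target database is one column behind the desired schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id, name)")
ddl = diff_schema(conn, desired)
```

A real version would also handle dropped columns, indexes, constraints, and the RI-ordering rules mentioned above, but the shape -- snapshot in, minimal DDL out -- is the same.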

The process is fully automated with the exception of updating the master DB, maybe because I don’t trust developers doing menial jobs.

19th floor
Tuesday, September 30, 2003
