
Should we have a separate database instance for each developer?

What is the best way for developing a database based application? We can have two approaches.

  1. One common database for all the developers.
  2. A separate database for each developer.

What are the pros and cons of each? And which one is the better way?

Edit: More than one developer is supposed to update the database, and we already have SQL Express 2005 on each developer machine.

Edit: Most of us are suggesting a common database. However, suppose one of the devs has modified both the code and the database schema. He has not committed the code changes, but the schema changes have gone to the common database. Won't that possibly break the other developers' code?


Both -

I like a single database that changes are tested on before going live, or before going to a 'formal' test environment. This is your developers' sanity check; it stays up to date with the live system and it makes sure they always consider each other's changes. The rule should be that changes don't go in here if they might break something else.

A database per developer is great (even essential) when more than one developer is making updates. It allows them all the development flexibility they want without breaking things for other developers.

The key is to have a process for moving database changes from development through to your live system, and stick to your process.
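One common way to make that "process for moving database changes" concrete is a set of numbered upgrade scripts plus a version table that records which scripts each environment has already run. This is only a rough sketch of the idea: it uses Python with SQLite as a stand-in for SQL Server, and the `schema_version` table and script-naming convention are invented for illustration, not taken from the thread.

```python
import sqlite3
import tempfile
from pathlib import Path

def apply_migrations(conn, script_dir):
    """Apply numbered upgrade scripts (001_*.sql, 002_*.sql, ...) that are
    not yet recorded in the schema_version tracking table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (name TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_version")}
    for script in sorted(Path(script_dir).glob("*.sql")):
        if script.name in applied:
            continue  # this environment already ran this script
        conn.executescript(script.read_text())
        conn.execute("INSERT INTO schema_version (name) VALUES (?)",
                     (script.name,))
        conn.commit()

# Demo: one migration, applied exactly once even if the runner re-runs.
scripts = Path(tempfile.mkdtemp())
(scripts / "001_create_users.sql").write_text(
    "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);"
)
conn = sqlite3.connect(":memory:")
apply_migrations(conn, scripts)
apply_migrations(conn, scripts)  # second run is a no-op
tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table' AND name='users'")]
applied_count = conn.execute("SELECT COUNT(*) FROM schema_version").fetchone()[0]
```

Because each environment tracks what it has applied, the same script set can be run against a developer copy, the shared test database, and live, in that order, which is exactly the "stick to your process" discipline described above.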


Shared database

  • Simpler
  • Fewer cases of "It works on my machine".
  • Forces integration
  • Issues are found quickly (fail fast)

Individual databases

  • Changes never affect other developers, but that isolation is also a bad thing for continuous integration

We use a shared development database and it works out nicely. Our schema rarely changes in a way that makes it backwards incompatible, but occasionally a design change will occur before we go live, and we simply ask the other developers to update.

We do have separate development application (web) servers, but they share the same database. Our developers do have the option to use their own database, as they know how to set this up, and will do that on occasion, but only temporarily. The norm, for us, is to share the database.


Thought I'd throw this out there, but why not let every developer host their own instance of SQL Server Developer on their desktops and then have a shared server for each of the other environments (development, QA, and prod)? I think even the basic MSDN that comes with Visual Studio Pro (if you opt for it) includes a license for SQL Server Developer.

The developer can work on their desktop without impacting the others and then you can have them move the code to the next shared environment as you see fit (at will, with daily/weekly builds, etc.).

EDIT: I should add that the desktop instance allows developers to do things that DBAs often restrict on shared environments. This includes database creation, backup/restore, profiler, etc. These things are not essential, but they allow the developer to be much more productive while reducing the demands they make on your DBAs.

The shared environment is completely necessary for testing - I would not recommend going from desktop to production. But you can add so much by allowing the developers to have 100% control over a given database environment (including isolation from others) with a relatively minor cost.


Depends on your development, testing and maintenance cycles. Also on the size and location of the development team (and of course organization). If you support several versions of the database you might need even more environments.

In the real world I have found the following approach rather satisfying:

  • single central database/application for testing purposes, gets all the changes by various developers periodically merged into it
  • local copies for development (so you are free to drop and reload the whole database)
  • upgrade scripts are maintained for any changes to schema, auxiliary and sample data sets

Here are some further points:

If two developers (or two teams) are working on changes that can affect each other, then they should complete their tasks independently and then integrate/merge and test. For this it is much better to have separate development environments (unless they have to work together, in which case I consider them part of the same team; they can still work on their own copies of the database and share them if necessary).

If they work on the changes that do not influence each other they could work on the main server. Or on their own local copies of the database.

So, developing on the local copy has all the benefits with no risk in a general case (when you support multiple versions of the system and maintain upgrade scripts anyway).

Still it is great if you can share test cases so ability to dump/restore the database easily and quickly is a big plus.
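The "dump/restore easily and quickly" point can be illustrated with a tiny sketch. This uses Python's `sqlite3` backup API purely as a stand-in for SQL Server's backup/restore (with SQL Server you would use `BACKUP DATABASE`/`RESTORE DATABASE` instead); the table and data are invented for the example.

```python
import sqlite3

# A developer's local database holding shared test cases.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE cases (id INTEGER, note TEXT)")
src.execute("INSERT INTO cases VALUES (1, 'shared test case')")
src.commit()

# Snapshot the whole database into a fresh copy a colleague could load.
dst = sqlite3.connect(":memory:")
src.backup(dst)

rows = list(dst.execute("SELECT id, note FROM cases"))
```

The cheaper this round-trip is, the more willing developers are to share realistic test data between their individual copies.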

EDIT:
All of the above assume that having a copy on the local machine of the whole system for testing purposes is feasible (size, performance, licenses, etc).


I would opt for solution #1 : One common database for all the developers.

Pros

  1. Less expensive for the infrastructure;
  2. Only one dump is required when it's time to refresh the development database;
  3. Everyone develops with the same data, so it closely represents the production environment;

Cons

  1. If one developer performs a bad operation, this could impact a larger amount of developers.

As for solution #2 : One independent database for each of the developers;

Pros

  1. This could be useful for new features developments, when development requires isolation;

Cons

  1. More expensive for the company (infrastructure, licences...);
  2. Multiplication of problems caused by overly isolated development environments (it works in the developer's environment, but is not integrated);
  3. Multiplication of dumps by the DBAs of the same copy from the production environment.

Considering the above, I would recommend, depending on your company size:

  1. One database for development;
  2. One database for testing the integration;
  3. One database for acceptance tests;
  4. One for new feature development that will perhaps require integration tests.

If your company doesn't require integration tests, then go with acceptance tests; this step is crucial before going to production.


One per developer plus a continuous integration and build server to run unit and integration tests. That gives you the best of both worlds.

Having all developers modify a single dev database quickly becomes less productive once the amount of database change reaches a certain level, because it forces a developer to deploy changes to the shared database before they are ready to check in, which means other parts of the code line may break unnecessarily.
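The "one per developer plus CI" setup works best when every test builds its own throwaway database, so nothing leaks between tests or between developers. A minimal sketch of that pattern, assuming an in-memory SQLite database as a stand-in for a per-developer SQL Server instance and an invented `orders` schema:

```python
import sqlite3

# The checked-in schema that CI and every developer build from.
SCHEMA = "CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL);"

def fresh_db():
    """Each test gets its own throwaway database, so no test (and no
    developer) can break another's data."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    return conn

def test_insert():
    conn = fresh_db()
    conn.execute("INSERT INTO orders (total) VALUES (9.99)")
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

def test_empty():
    conn = fresh_db()
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

count_after_insert = test_insert()
count_in_new_db = test_empty()  # unaffected by the other test
```

The CI server then runs the same tests against a database built from scratch on every build, which is where integration problems surface instead of on a shared dev server.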


Simple answer:

Have one development database, and if the developers want their own, they can just run their own instance on their own machines. Just be sure to test/publish on the shared.


We do both:

We use code generation where I'm at and our database is generated as well. So we have an instance on each developer's box where the database is generated. Then we use the scripts that are generated to apply the changes to a central test database. If that goes well we apply the changes to the production database during a release.

What's nice with this approach is that when our "source of truth" is checked in to source control, all the database changes are automatically distributed to the other developers when they rebase and regenerate. It works well for us.


The best way is a single database on the Test/QA server and one database (probably on the developer's local computer) for each developer (so 10 developers work with 10 + 1 databases).

This is the same approach as for general development: each developer has their own copy of the source code on their local machine.

Also, the multiple-database approach simplifies keeping the database schema in a version control system. We keep database creation scripts in SVN.

We are using the approach, described here: http://www.sqlaccessories.com/Howto/Version_Control.aspx


You might also want to look at Refactoring Databases. Aside from discussing database changes, it includes discussions on moving from development to production in a way that reduces risk.


Why on earth would you want a separate database for each developer? Have one common database for all; that way the table structure is consistent, and the SQL statements are as well.


The biggest problems with developers having their own databases are:

  • First, it is unlikely to be the size of the real production database (if you take all the databases we need to work with here, they would take up several hundred gigabytes of space; I don't have that available on my machine). This causes bad code to be written that will never work on a large database for performance reasons. SQL code should never be written against a data set significantly smaller than the one on prod.
  • Second, developers who use their own database create problems when they spend a long time developing something and then find out only after they merge with a real database that it affects something else. You find this stuff much faster when you share the environment, so in the end there is less wasted development time.
  • Third, developers working on related things need to know about the changes you are making, as those changes will affect their work.

When you know you are going to affect others, I think you tend to be more careful about what you do, which is a plus in my book.

Now, the shared database server should have what we call a scratch database: a place where people can create and test table changes. If they are doing something that might need to drop and recreate a table (which should be a rare case!), they can test the process first by copying the table to the scratch database and running their process there, then making the change to the real database once they are sure it works. We also often copy a backup of a table to the scratch database before testing a particular change, so we can easily recreate the old data if it goes bad.
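The backup-to-scratch step above can be sketched in a few lines. This is only an illustration: it uses SQLite's `ATTACH` to stand in for a second "scratch" database on the same SQL Server (where you would copy with `SELECT ... INTO scratch.dbo.orders_backup` instead), and the `orders` table is invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stands in for the shared dev database
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
conn.execute("INSERT INTO orders VALUES (1, 10.0)")
conn.commit()

# Attach a scratch database and copy the table there before a risky change.
conn.execute("ATTACH ':memory:' AS scratch")
conn.execute("CREATE TABLE scratch.orders_backup AS SELECT * FROM orders")

# A risky operation against the real table goes wrong...
conn.execute("DELETE FROM orders")

# ...so the old data is recreated from the scratch copy.
conn.execute("INSERT INTO orders SELECT * FROM scratch.orders_backup")
restored = list(conn.execute("SELECT id, total FROM orders"))
```

Keeping the backup copy in a separate scratch database, rather than in the live schema, means experiments never clutter or endanger the tables everyone shares.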

I see no advantages at all to using individual databases.
