I've seen many projects where the database schema is over-engineered up front because people are afraid it will be set in stone once they reach production. Of course, something is always missed anyway, and the schema ends up littered with columns and tables that are never used. This leads to awful workarounds like creating a modified second copy of a table and leaving the original in place but unused (or still used; who can tell without wasting time on deep analysis?).
We've gotten around this to a good extent by having each developer point at a local database, while the automated build process (CruiseControl) points at an integration database. The build process drops and re-creates the database for each build.
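The drop-and-re-create step could be sketched roughly like this, using SQLite as a stand-in for the real integration database (the schema and function names here are hypothetical, not from our actual build):

```python
# Sketch of the per-build "drop and re-create" step. SQLite stands in for
# the real integration database; the schema below is a made-up example.
import sqlite3

SCHEMA_SQL = """
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE orders (id INTEGER PRIMARY KEY,
                     customer_id INTEGER REFERENCES customers(id));
"""

def rebuild_database(conn):
    """Drop every table, then re-create the schema from the checked-in script."""
    cur = conn.cursor()
    cur.execute("SELECT name FROM sqlite_master WHERE type = 'table'")
    for (table,) in cur.fetchall():
        cur.execute(f'DROP TABLE "{table}"')
    cur.executescript(SCHEMA_SQL)
    conn.commit()

conn = sqlite3.connect(":memory:")
rebuild_database(conn)   # first build: nothing to drop, schema created
rebuild_database(conn)   # every later build starts from the same clean state
```

Because every build starts from the checked-in schema script, the integration database can never drift away from source control.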
When moving to a higher environment, the DBAs need to get involved. They keep version-controlled scripts for each modification to the database. This step is more painful because the DBAs cannot just drop the schema and start from scratch.
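The script-per-modification approach amounts to forward-only migrations: each script runs at most once, and a version table records how far a given database has progressed. A minimal sketch (table name, migration contents, and function name are all hypothetical):

```python
# Sketch of forward-only, version-controlled migrations. Each script runs at
# most once per database, tracked in a schema_version table. SQLite stands in
# for the real database; the migration list is a made-up example.
import sqlite3

MIGRATIONS = [
    (1, "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL)"),
    (2, "ALTER TABLE customers ADD COLUMN email TEXT"),
]

def migrate(conn):
    """Apply, in order, every migration newer than the recorded version."""
    cur = conn.cursor()
    cur.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (version INTEGER PRIMARY KEY)"
    )
    cur.execute("SELECT COALESCE(MAX(version), 0) FROM schema_version")
    current = cur.fetchone()[0]
    for version, sql in MIGRATIONS:
        if version > current:
            cur.execute(sql)
            cur.execute(
                "INSERT INTO schema_version (version) VALUES (?)", (version,)
            )
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)  # applies both migrations
migrate(conn)  # no-op: this database is already at the latest version
```

Run against a fresh database this brings it fully up to date; run against a production database it applies only the scripts that have not been seen yet, which is why the DBAs never have to drop the schema.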
Do you have any suggestions for a better solution?