- Databases are almost always the least fault-tolerant layer in an application.
- Databases are almost always harder to debug and troubleshoot than other layers in an application.
- Databases are built to read and write from disk efficiently and to apply set operations to groups of data. Branching logic, string manipulation, and the like are convenience tools only (see the sketch after this list).
- Enterprise databases are really expensive, and they are much harder to scale horizontally than the application tier.
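To illustrate that third point, here's a minimal T-SQL sketch; the `orders` table and its columns are invented for the example. The set-based statement is the kind of work the engine is optimized for, while the cursor version uses the branching "convenience tools" to do the same job row by row:

```sql
-- Set-based: one declarative statement, and the engine chooses the
-- access path. This is what databases are built for.
UPDATE orders
SET    status = 'late'
WHERE  status = 'open'
  AND  due_date < GETDATE();

-- Procedural: the same work done with a cursor and branching logic,
-- one row at a time. It runs, but it fights the engine instead of
-- using it.
DECLARE @order_id INT, @due_date DATETIME;

DECLARE order_cur CURSOR FOR
    SELECT order_id, due_date FROM orders WHERE status = 'open';

OPEN order_cur;
FETCH NEXT FROM order_cur INTO @order_id, @due_date;

WHILE @@FETCH_STATUS = 0
BEGIN
    IF @due_date < GETDATE()
        UPDATE orders SET status = 'late' WHERE order_id = @order_id;

    FETCH NEXT FROM order_cur INTO @order_id, @due_date;
END

CLOSE order_cur;
DEALLOCATE order_cur;
```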
These ideas have been beaten into my head over the past 4 years or so, and they make a lot of sense (or I'm just brainwashed).
The next point is a matter of maintainability. Say you've got a stored procedure that's about 150 lines long, contains some conditional logic, and joins across 7 or 8 tables in multiple branches and various sub-selects (a trimmed-down caricature follows below). Now you have to add a whole new set of functionality, and to do so you have to integrate data filtering that spans 9 or 10 new tables. Now let's add another real-world variable to the equation: turnover. What happens to the poor bastard who inherits this beast? To me, that maintenance burden is far more costly than whatever performance you give up by moving the complex joins and decision logic into application code.
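To make the shape of that beast concrete, here's a heavily trimmed, hypothetical caricature (every table, column, and procedure name is invented). Notice that each branch repeats most of the join logic, so every new filtering requirement has to be threaded through every branch:

```sql
CREATE PROCEDURE GetAccountActivity
    @account_id INT,
    @include_archive BIT
AS
BEGIN
    IF @include_archive = 1
        -- Branch 1: joins against the archive table.
        SELECT a.account_id, t.amount, c.region
        FROM   accounts a
        JOIN   transactions_archive t ON t.account_id  = a.account_id
        JOIN   customers c            ON c.customer_id = a.customer_id
        WHERE  a.account_id = @account_id;
    ELSE
        -- Branch 2: nearly identical, but against the live table.
        SELECT a.account_id, t.amount, c.region
        FROM   accounts a
        JOIN   transactions t ON t.account_id  = a.account_id
        JOIN   customers c    ON c.customer_id = a.customer_id
        WHERE  a.account_id = @account_id;

    -- ...now imagine 140 more lines of this across 7 or 8 tables,
    -- and a new requirement to filter against 9 or 10 more.
END
```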
Now don't get me wrong: I'm not anti-normalization. In fact, it should probably be the first thing considered in any data modeling. But normalization-for-the-sake-of-normalization can be a slippery slope in terms of an application's agility and maintainability. Embrace best practices, but question them as well. There are people out there who make a lot of money selling enterprise database software, so be careful about whose Kool-Aid you drink.