There's an old DBA expression: "Normalize till it hurts, denormalize till it works." In a large corporate environment with 187 tables normalized to roughly second normal form, I had the DBAs run an experiment. They were only able to normalize to fourth normal form; they really couldn't work out the fifth. The number of tables grew to 652, and not one SQL statement could be converted because of the complexity of the fourth-normal-form design. At third normal form we had 315 tables, and more than 30% of our SQL statements were too complex. The DBAs rewrote one of the applications, and the processing time tripled to produce the same report. Normalization is great, but it is often not practical. I fully agree with normalizing to the highest practical level, but sometimes that level is first or second normal form. Most of my projects landed somewhere just above second.

The same can be said for indexing fields in tables. When I was with IBM, we advocated using indexes, especially with DB2 when it was first released. Sales jumped, then died, as clients found processing time was 20 to 50 times that of IMS databases. It wasn't until the impact of many indexes became apparent that DB2/SQL started to receive performance improvements (better index handling, storage, etc.), but to this day, the more indexed fields you have, the slower your inserts and updates. That's not to say indexes are bad; they are great and improve performance for queries that actually use them. But they require thought and planning, just as normalization does.
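The index trade-off can be sketched with Python's sqlite3 module and an in-memory database (the table and index names here are invented for illustration). A query that filters on an indexed column can use the index, but every write must then maintain each index B-tree in addition to the table itself, which is where the write-side slowdown comes from:

```python
import sqlite3

# Hypothetical sales table with two single-column indexes,
# loosely echoing the many-indexes DB2 situation above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, "
             "region TEXT, product TEXT, amount REAL)")
conn.execute("CREATE INDEX idx_region ON sales(region)")
conn.execute("CREATE INDEX idx_product ON sales(product)")

# Reads benefit: the planner picks the index for this filter.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM sales WHERE region = 'EMEA'"
).fetchone()
print(plan[3])  # detail column, e.g. "SEARCH sales USING INDEX idx_region (region=?)"

# Writes pay: this single INSERT updates the table plus BOTH
# index B-trees -- add ten indexes and you do eleven writes per row.
conn.execute("INSERT INTO sales (region, product, amount) "
             "VALUES ('EMEA', 'widget', 1.00)")
```

Each additional index multiplies the maintenance work per row changed, which is exactly why "index everything" backfires.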
Add-on: data integrity... nasty thing. A normalized example: suppose I have a product called a widget selling for $1, and some clerk decides mid-year that the widget isn't worth selling and changes it to a "wonker" selling for $100. Fully normalized, every widget already sold becomes a wonker, and if we sold 10 before the change and 10 after, and the price is calculated from the product table (another normalization practice), the report shows 20 wonkers sold for $2,000 when we actually sold 10 wonkers for $1,000 and 10 widgets for $10. The execs would be all over you for the missing $990. In other words, data integrity failed because of normalization. In a flat file that would not happen. But in a flat file, if we wanted to rename widget to wonker and missed one record, we would report one widget sold, which is also wrong. Arguments can be made either way, but in reality data integrity is a major component of the application and its rules, audit trails, etc. Normalization, when done correctly, reduces the likelihood of errors and therefore enhances data integrity; normalization by itself does not provide data integrity.
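The widget/wonker scenario can be reproduced in a few lines with sqlite3 (the schema and numbers are illustrative, mirroring the example above, not taken from any real system). The normalized report derives the price by joining back to the product table, so the clerk's update silently rewrites sales history; snapshotting the name and price onto the sale row at sale time, a deliberate denormalization, is the usual fix:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, price REAL);
    CREATE TABLE sales (id INTEGER PRIMARY KEY,
                        product_id INTEGER REFERENCES products(id));
    INSERT INTO products VALUES (1, 'widget', 1.00);
""")
# 10 widgets sold at $1 each...
conn.executemany("INSERT INTO sales (product_id) VALUES (?)", [(1,)] * 10)
# ...then the clerk renames and reprices the product mid-year...
conn.execute("UPDATE products SET name = 'wonker', price = 100.00 WHERE id = 1")
# ...and 10 more units are sold at $100 each.
conn.executemany("INSERT INTO sales (product_id) VALUES (?)", [(1,)] * 10)

# Fully normalized report: price comes from the join, so every
# historical sale now reports the new name and the new price.
name, qty, revenue = conn.execute("""
    SELECT p.name, COUNT(*), SUM(p.price)
    FROM sales s JOIN products p ON p.id = s.product_id
""").fetchone()
print(name, qty, revenue)  # wonker 20 2000.0 -- not the $1,010 actually taken

# The fix: snapshot name and unit price onto the sale row at sale time.
conn.execute("CREATE TABLE sales2 (id INTEGER PRIMARY KEY, "
             "product_name TEXT, unit_price REAL)")
conn.executemany("INSERT INTO sales2 (product_name, unit_price) VALUES (?, ?)",
                 [("widget", 1.00)] * 10 + [("wonker", 100.00)] * 10)
total = conn.execute("SELECT SUM(unit_price) FROM sales2").fetchone()[0]
print(total)  # 1010.0 -- history survives the rename
```

The snapshot columns duplicate data on purpose: point-in-time facts like "what we charged" belong to the sale, not to the current product row.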