RainLover
A lazy programmer is one who is satisfied with something that works. They seek advice only when something goes wrong, and make improvements only if the user points out a flaw.
A good programmer seeks and takes advice from those who are somewhat experienced in their field. They weigh one opinion against another, then choose the one that best suits their situation. These methods have been tested and have withstood the rigours of time.
A truly good programmer asks questions. “Why” and “Why not” are two of their better forms of enquiry. This select group of programmers is not overly influenced by titles. They know the worth of their advisers and put aside those who promote themselves as gurus on the strength of the volume, rather than the quality, of their community involvement.
We read many things on the WWW: some true, some false, and some ambiguous. Much of it is limited in scope, and is simply the thoughts of others rehashed to suit the writer's intent.
One such subject is Normalization, invented by Edgar Frank Codd and first published in 1970. The world of computers and computing has improved a great deal since then, so it is now fair to say that Normalization has both advantages and disadvantages.
Advantages
- Avoids data-modification (insert/delete/update) anomalies, because each data item lives in exactly one place (see the sketch after this list)
- Conceptually cleaner, and easier to maintain and change as your needs change
- Fewer null values and less opportunity for inconsistency
- Increased storage efficiency
- Helps maximize the use of clustered indexes, arguably the most powerful and useful type of index available; as normalization separates data into more tables, more clustered indexes become available to speed up data access
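To make the anomaly point concrete, here is a minimal sketch using Python's built-in sqlite3 module. The tables and column names (orders_flat, customers, orders) are invented for illustration, not taken from any real schema. In the denormalized table the customer's phone number is repeated on every order row, so a partial update leaves the data contradicting itself; in the normalized design the number lives in one place.

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Denormalized: customer details are repeated on every order row.
cur.execute("CREATE TABLE orders_flat (order_id INTEGER, customer TEXT, phone TEXT)")
cur.executemany("INSERT INTO orders_flat VALUES (?, ?, ?)",
                [(1, "Smith", "555-0101"), (2, "Smith", "555-0101")])

# An UPDATE that misses a row leaves two different "current" phone
# numbers for the same customer -- the classic update anomaly.
cur.execute("UPDATE orders_flat SET phone = '555-0202' WHERE order_id = 1")
print(cur.execute(
    "SELECT DISTINCT phone FROM orders_flat WHERE customer = 'Smith'").fetchall())
# Two distinct values come back for one customer.

# Normalized: the phone number lives in exactly one place.
cur.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, "
            "name TEXT, phone TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
            "customer_id INTEGER REFERENCES customers)")
cur.execute("INSERT INTO customers VALUES (1, 'Smith', '555-0101')")
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 1), (2, 1)])

# One UPDATE now corrects every order at once; no anomaly is possible.
cur.execute("UPDATE customers SET phone = '555-0202' WHERE customer_id = 1")
```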
Disadvantages
- Requires more CPU, memory, and I/O to process, so heavily normalized data can reduce database performance
- Requires more joins to get the desired result, and a poorly written query can bring the database down (see the sketch after this list)
- Adds maintenance overhead: the higher the level of normalization, the greater the number of tables in the database
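The join cost is just as easy to see. Continuing the hypothetical sqlite3 sketch above (same connection, same invented tables): the denormalized table produces an order report from a single table, while the normalized design needs a join for the same report, and each additional normalized table in a real query adds another join like it.

```python
# Denormalized: one table, no join needed to list orders with phone numbers.
flat = cur.execute("SELECT order_id, customer, phone FROM orders_flat").fetchall()

# Normalized: the same report now requires a join. With many tables, a
# badly written join (e.g. a missing ON clause) can be very expensive.
joined = cur.execute("""
    SELECT o.order_id, c.name, c.phone
    FROM orders AS o
    JOIN customers AS c ON c.customer_id = o.customer_id
""").fetchall()
print(flat)
print(joined)
```

Neither sketch settles the trade-off by itself; it only shows where each cost and each protection comes from.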
So why should we blindly follow the rules of Normalization without question? Is it time we challenged the sacred cow?
I welcome your comments.