Building the Data Warehouse

Normalization/Denormalization


The output of the data model process is a series of tables, each of which contains keys and attributes. The normal output produces numerous tables, each with only a modicum of data. While there is nothing wrong, per se, with lots of little tables, there is a problem from a performance perspective. Consider the work the program has to do in order to interconnect the tables dynamically, as shown in Figure 3.26.


Figure 3.23 The data model allows the different iterations of development to be built in a cohesive manner.


Figure 3.25 When there is no data model, the iterations do not form a cohesive pattern. There is much overlap and lack of uniformity.


In Figure 3.26, a program goes into execution. First, one table is accessed, then another. To execute successfully, the program must jump around many tables. Each time the program jumps from one table to the next, I/O is consumed, in terms of both accessing the data and accessing the index to find the data. If only one or two programs had to pay the price of I/O, there would be no problem. But when all programs must pay a stiff price for I/O, performance in general suffers, and that is precisely what happens when many small tables, each containing a limited amount of data, are created as a physical design.
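The I/O arithmetic behind this argument can be sketched with a toy cost model. The figures below are illustrative assumptions, not measurements from the book: each table the program visits is charged one index access plus one data access, so a query that hops across n small tables pays 2n I/Os per record, versus 2 for the same data held in one denormalized table.

```python
# Toy cost model for the I/O penalty of many small tables.
# The per-table cost (one index I/O + one data I/O) and the table
# counts are illustrative assumptions, not figures from the text.

def io_per_record(tables_touched: int, ios_per_table: int = 2) -> int:
    """Each table visited costs one index I/O plus one data I/O."""
    return tables_touched * ios_per_table

# A program that must jump across eight little normalized tables...
normalized = io_per_record(tables_touched=8)    # 16 I/Os per record
# ...versus the same attributes merged into one physical table.
denormalized = io_per_record(tables_touched=1)  # 2 I/Os per record

records = 100_000
print(f"normalized:   {records * normalized:,} I/Os")
print(f"denormalized: {records * denormalized:,} I/Os")
```

Scaled across every program in the workload, this eightfold difference per record is the "stiff price" the text refers to, which is why physical design often merges the little tables the data model produces.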
