Denormalized tables within OLTP database

database-design, denormalization

My company has a product centered around a fairly normalized Microsoft SQL Server database. At this point we have several reports and a web application that need the data in a much flatter shape to display it appropriately to the user. Today we use views and stored procedures to provide the flattened data model needed by the reports and the web app.

There are currently performance problems with several of the views, though, due to the amount of aggregation and the number of joins being performed inside them. My thought was instead to provide denormalized tables optimized for the performance of both the reports and the web application. These denormalized tables would either be rebuilt at some interval or be maintained as updates happen to the underlying OLTP tables.
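A minimal sketch of the interval-refresh approach, assuming a hypothetical Customers/Orders schema and a reporting table named dbo.OrderSummary (all names here are illustrative, not from an actual schema); the procedure would typically be invoked from a SQL Server Agent job:

```sql
-- Hypothetical flattened reporting table, rebuilt on a schedule.
CREATE TABLE dbo.OrderSummary (
    CustomerId   int           NOT NULL,
    CustomerName nvarchar(100) NOT NULL,
    OrderCount   int           NOT NULL,
    TotalAmount  decimal(18,2) NOT NULL,
    LastOrderAt  datetime2     NOT NULL,
    CONSTRAINT PK_OrderSummary PRIMARY KEY (CustomerId)
);
GO

CREATE PROCEDURE dbo.RefreshOrderSummary
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;
        -- Full rebuild: simple and predictable, at the cost of
        -- rewriting every row on each run.
        TRUNCATE TABLE dbo.OrderSummary;
        INSERT INTO dbo.OrderSummary
            (CustomerId, CustomerName, OrderCount, TotalAmount, LastOrderAt)
        SELECT c.CustomerId, c.Name, COUNT(*), SUM(o.Amount), MAX(o.OrderedAt)
        FROM dbo.Customers AS c
        JOIN dbo.Orders    AS o ON o.CustomerId = c.CustomerId
        GROUP BY c.CustomerId, c.Name;
    COMMIT;
END;
```

A full truncate-and-reload is the easiest to reason about; an incremental MERGE keyed on a modified-date column trades that simplicity for shorter refresh windows.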

Is this a common thing to do? Am I asking for trouble, performance-wise, just keeping these tables up to date? What mechanisms are best for building this sort of table? Is there a better way to do this than denormalized tables?

Best Answer

What you are thinking about is relatively common, and pre-processing the data can indeed help the performance of reports and web pages. The 'cost' is that the pre-processing takes time, so your denormalized tables (you can think of them as a mini OLAP data warehouse) are never fully up to date.
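For the synchronous alternative the question mentions (tables "maintained as updates happen"), one mechanism SQL Server offers is an indexed view, which the engine keeps current on every write. A sketch against the same hypothetical schema as above, subject to SQL Server's indexed-view restrictions:

```sql
-- Sketch: an indexed view keeps this aggregate current on every write.
-- Indexed views require SCHEMABINDING, two-part table names, and
-- COUNT_BIG(*); MAX/MIN and outer joins are not allowed, and SUM()
-- must be over a non-nullable expression (assumed here for Amount).
CREATE VIEW dbo.vOrderTotals
WITH SCHEMABINDING
AS
SELECT o.CustomerId,
       COUNT_BIG(*)  AS OrderCount,
       SUM(o.Amount) AS TotalAmount
FROM dbo.Orders AS o
GROUP BY o.CustomerId;
GO

-- Creating the unique clustered index is what materializes the view.
CREATE UNIQUE CLUSTERED INDEX IX_vOrderTotals
    ON dbo.vOrderTotals (CustomerId);
```

The trade-off is write amplification on the OLTP side, so this suits a handful of hot aggregates rather than wide, report-shaped rows.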

I do not consider that a big problem, just one that your users need to understand (e.g. "I just finished the xyz transaction and it is not in my report").

Any reports that need the latest transaction will need to include at least some elements from the live data, instead of the pre-processed "OLAP" data.
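A hedged sketch of that hybrid pattern, assuming the refresh job records a watermark in a hypothetical dbo.RefreshLog table: serve the pre-processed rows, topped up with live rows newer than the watermark.

```sql
-- Hypothetical hybrid query: pre-processed totals plus any orders that
-- arrived after the last refresh (watermark written by the refresh job).
DECLARE @LastRefreshAt datetime2 =
    (SELECT LastRefreshAt FROM dbo.RefreshLog WHERE JobName = 'OrderSummary');

SELECT CustomerId, SUM(TotalAmount) AS TotalAmount
FROM (
    SELECT CustomerId, TotalAmount
    FROM dbo.OrderSummary                 -- state as of the last refresh
    UNION ALL
    SELECT o.CustomerId, SUM(o.Amount) AS TotalAmount
    FROM dbo.Orders AS o
    WHERE o.OrderedAt > @LastRefreshAt    -- only rows since the refresh
    GROUP BY o.CustomerId
) AS combined
GROUP BY CustomerId;
```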

Here is a generic link explaining the OLTP/OLAP relationship: http://datawarehouse4u.info/OLTP-vs-OLAP.html

The web page also explains some nuances of the OLTP/OLAP divide, but there is nothing really surprising in the description.