I have financial data coming in from one of our in-house systems in a flat tabular form with a lot of repeating values.
For each year and each snapshot there are around 800K rows of data. If I want to compare two years, and perhaps compare forecast to actuals as well, I could easily end up with 2-3 million rows of data, pushing the file size to 100-150 MB.
Loading the data into a Power Pivot data model reduces the size a fair bit, but I was wondering whether it could be even smaller if the data were in normalized form.
If yes (or even if not), is there a way to easily convert a flat table into a normalized relational database?
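To make the question concrete, here is roughly what I mean by "normalizing": splitting the repeating text columns out into small lookup tables and keeping a narrow fact table of keys and values. This is just a minimal pandas sketch with invented column names (Scenario, CostCenter, Account, Amount), not my real schema:

```python
import pandas as pd

# Hypothetical flat table; the real data has ~800K rows per snapshot
# with columns along these lines (names are made up for illustration).
flat = pd.DataFrame({
    "Year":       [2022, 2022, 2023, 2023],
    "Scenario":   ["Actual", "Forecast", "Actual", "Forecast"],
    "CostCenter": ["CC-100", "CC-100", "CC-200", "CC-200"],
    "Account":    ["Travel", "Travel", "Salaries", "Salaries"],
    "Amount":     [1200.0, 1100.0, 98000.0, 99500.0],
})

def extract_dimension(df, cols, key_name):
    """Pull the distinct combinations of `cols` into a lookup
    (dimension) table with a surrogate integer key, and return
    it along with the fact table rewritten to use that key."""
    dim = (df[cols].drop_duplicates()
                   .reset_index(drop=True)
                   .rename_axis(key_name)
                   .reset_index())
    fact = df.merge(dim, on=cols).drop(columns=cols)
    return dim, fact

# Peel off one dimension table per repeating attribute.
dim_scenario, fact = extract_dimension(flat, ["Scenario"], "ScenarioKey")
dim_costcenter, fact = extract_dimension(fact, ["CostCenter"], "CostCenterKey")
dim_account, fact = extract_dimension(fact, ["Account"], "AccountKey")

print(fact)         # narrow fact table: Year, Amount, and integer keys
print(dim_account)  # small lookup table of distinct accounts
```

In Power Pivot terms I assume this would mean loading the small lookup tables and the narrow fact table separately and relating them on the key columns, instead of loading one wide flat table. Is that worth doing, or does the engine's own compression already handle the repeating values?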