mike.magill
New Member
I have two CSV files: one is c.4 GB (19 million rows, 29 fields per row) and the other is c.9 GB (29 million rows, 20 fields per row). I have tried, unsuccessfully, to get the client to store this data in SQL Server, so I have to work with the data in this format for Power Query analysis.
I have Power BI Desktop with DAX Studio and know that I can, in theory, transform these files and export them as new, smaller CSV files (e.g. by removing unneeded fields), but my computer just can't cope with the file sizes. I have an i7 processor and 16 GB of RAM.
If I upgrade my laptop to a newer processor and, say, 64 GB of RAM, is that going to help? What else can I try? I'm currently seeing whether increasing the 'Maximum memory used per simultaneous evaluation' setting helps.
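
For reference, this is roughly the kind of query I'm building in the Advanced Editor to drop columns before export (the file path and column names below are placeholders, not my real ones):

let
    // Read the raw CSV; Power Query evaluates this lazily rather than loading the whole file up front
    Source = Csv.Document(File.Contents("C:\data\large_file.csv"), [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.Csv]),
    // Use the first row as column headers
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Keep only the fields I actually need and drop the rest
    Trimmed = Table.SelectColumns(Promoted, {"Field1", "Field2", "Field3"})
in
    Trimmed

The idea is to load only this trimmed table into the model and then use DAX Studio to export it as a smaller CSV, but the refresh itself is where the machine struggles.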