I have a large Excel model with data on 3 tabs (roughly 35 columns x 3,000 rows per table) and a series of reports on another 90 tabs that reference the table data using VLOOKUP formulas and perform various calculations, including manual updates to the values returned from the tables.
The file is approximately 10 MB in .xlsb format and takes about 5 minutes to open, probably due to recalculation. I'd like to reduce both the time it takes to open the file and the size of the file.
What is the most efficient way to structure the data (currently an unsorted export from another system) and reference it?
I've tested a few options using a subset (1 report plus the 3 data tabs) but haven't seen much of a difference; all of the subset files were approximately 730 KB (the formula patterns are sketched after this list):
1) VLOOKUP
2) INDEX/MATCH
3) Intersection (replacing the VLOOKUP formulas with named ranges for the tables and intersecting the ranges to return values)
4) DGET (not scalable, as each criterion needs a separate criteria range)
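For concreteness, the four lookup patterns look roughly like this. This is a simplified sketch with placeholder names: Data1 for a data sheet, the key in column A, the return value in column E, and the named ranges Key_12345 and Col_Amount are illustrative, not the actual layout.

    =VLOOKUP($A2, Data1!$A$2:$AI$3001, 5, FALSE)                    1) exact-match VLOOKUP
    =INDEX(Data1!$E$2:$E$3001, MATCH($A2, Data1!$A$2:$A$3001, 0))   2) INDEX/MATCH
    =Key_12345 Col_Amount                                           3) the space is Excel's intersection operator; Key_12345 names a row, Col_Amount a column
    =DGET(Data1!$A$1:$AI$3001, "Amount", $F$1:$F$2)                 4) the criteria range $F$1:$F$2 holds the field header and the key value

Note that all of these are exact-match forms, which don't require the data to be sorted.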