
Spreadsheet taking over 30 mins to recalculate.....

Suzanne Peel

New Member
Good Morning or Afternoon (depending on your hemisphere!)

I have been doing my very best to become "awesome" in Excel, but I sadly miss the mark.

I have a spreadsheet ~20MB in size. It has become almost impossible for me to do anything with it today (every now and then a miracle happens and it recalcs really quickly).

My most recent issue:

I selected 6 cells in one row and double-clicked the fill handle in the far bottom right-hand corner to copy down. Now, over 30 minutes later, it is still going at approximately 3 rows per second... and I have 5,200 rows.

I have searched for lots of different methods of fixing this; however, I am flummoxed!

I have checked that each sheet is the correct size, and I have played with swapping between manual and automatic calculation.

I have tried saving, shutting the machine down and restarting.

My sheet comprises data from an old pivot table, copied in with Paste Special (values), together with helper columns that I have created, set up as a table.

I have several home-made VBA functions (in my Personal.xlsm) that I use to get the financial year, month name, etc. So that I don't have to keep updating things if the data comes over in a different order, I use an HLOOKUP to give me the column number for my VLOOKUPs.

(the re-calc is up to 4000 lines....)

e.g. one formula is =if(hlookup("Suburb",R4C1:RC60,rows(R4C:RC),0)="",vlookup(hlookup("Suburb",R4C1:RC60,rows(R4C:RC),0),'Post Codes'!C6:C8,3,0))

The HLOOKUP gives me the suburb name and the VLOOKUP gives me the postcode or whatever I want.

FYI, I have done everything in the Chandoo forum suggestions (Fixing bloated file size and slow calculation in Excel ...) and on Debra Dalgleish's Contextures site, plus a few other bits and pieces.

I am using Office 2010 32bit software and the data comes from SAP.

I have also had issues getting SAP-created CSV files into Power Pivot, so I am feeling very despondent. I was hoping to move to Power Pivot to manage this, but if I can't keep this going just using pivots and formulas I think I will be in the dog-house!


Now at 4420 lines....

Anyway, I have to finish this report before I leave today, so any suggestions you offer will be used in next month's report.

Thanks for any help you can offer..

Suzy

4790 lines......
 
Hi Suzanne,
Experts around here may give you better advice; here is what I can see:

=if(hlookup("Suburb",R4C1:RC60,rows(R4C:RC),0)="",vlookup(hlookup("Suburb",R4C1:RC60,rows(R4C:RC),0),'Post Codes'!C6:C8,3,0))

I don't understand the R4C1 part, and especially the part in red (R4C:RC).
If it is a typo for ROWS(RC:RC), then you are passing an array of all 1,048,576 rows, which may be causing the problem.

Regards,
 
Hi,

Can we have a sample Excel file with around 100 rows of dummy data, so that we can take charge of diagnosing the issue?
 
Hi Suzanne,

Besides the above suggestions, you can try this, but first consider backing up your data.

If you think that your file size should be smaller, then try this:

1. Open the sheet and press F5; in the Go To dialog, choose Special and then Last cell. This will select the last used cell on the sheet.

2. From that cell, working upward and leftward, delete everything (entire rows and columns) between your data's last row and column and the last cell you selected through the F5 > Last cell option.

Save the file and see if the file size reduces.
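
If you would rather let a macro do the same cleanup, here is a rough VBA sketch of the idea (only a sketch - it works on the active sheet and assumes the sheet is not empty, so please back up first):

    Sub TrimUnusedRange()
        ' Rough sketch: delete everything below and to the right of the real
        ' data, then save, so Excel can shrink the used range.
        Dim ws As Worksheet
        Dim lastRow As Long, lastCol As Long
        Set ws = ActiveSheet
        ' Find the last row and column that actually contain data
        lastRow = ws.Cells.Find(What:="*", SearchOrder:=xlByRows, SearchDirection:=xlPrevious).Row
        lastCol = ws.Cells.Find(What:="*", SearchOrder:=xlByColumns, SearchDirection:=xlPrevious).Column
        ' Delete the unused rows and columns beyond the data
        If lastRow < ws.Rows.Count Then ws.Rows(lastRow + 1 & ":" & ws.Rows.Count).Delete
        If lastCol < ws.Columns.Count Then ws.Range(ws.Cells(1, lastCol + 1), ws.Cells(1, ws.Columns.Count)).EntireColumn.Delete
        ActiveWorkbook.Save
    End Sub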
 
Deepak,
Thank you, I will organise a desensitised file to attach.

Khalid,
ROWS(R4C:RC) was a trick I learnt in Chandoo's Excel course; R4C:RC selects the number of rows down from row 4 in a particular column. Maybe I have misunderstood - the formula works well, it just takes forever...
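(If I have it right, entered in row 10 of column A, ROWS(R4C:RC) works out to ROWS(A4:A10), which returns 7 - one count for each row down from row 4.)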

Faseeh,
I have already done this, but I will try again using Go To > Last cell. I used Ctrl+End, so maybe there is some difference.

Thanks all for your help!
 
Hi Suzanne,

1. Is your file size of 20 MB reasonable in your opinion? What was it when you first created it, and how does it increase in size as you go along?

2. The most fundamental way to reduce recalculation time is to use helper cells / rows / columns, which can be used to store intermediate results; when there is a recalc, a smaller or less compute-intensive formula is recalculated instead of a bigger, more compute-intensive one (see the sketch after these points).

3. Most times, the problem is not just with the formulae; even the basic layout of the workbook can be the problem. Is the formula chain properly done? A properly designed workbook structure and formulae will ensure that, as far as possible, all formulae in a particular tab refer to cells in that tab. Having formulae refer to cells all over the workbook means the dependency chains are longer and more complex, which leads to increased recalc times.

4. In the interest of efficiency, you need to decide whether the formulae in the workbook are being used appropriately; for instance, a formula-based solution to derive unique entries from a list with duplicates is bound to create problems once the list grows to thousands of entries; in such cases, a pivot table is a more appropriate solution. You can easily have a line of code to refresh the pivot table whenever the input data changes. A similar argument will apply to other common situations such as sorting.
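
To make point 2 concrete - this is only a sketch, based on the formula you posted - you could put the repeated HLOOKUP in a helper column of its own, e.g. =HLOOKUP("Suburb",R4C1:RC60,ROWS(R4C:RC),0), and then have both the IF test and the VLOOKUP refer to that helper cell; the HLOOKUP is then evaluated once per row instead of twice inside every formula that needs it.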

Narayan
 
Further to what Narayan suggests, volatile formulas could be the culprit. See my blog post at http://chandoo.org/wp/2014/03/03/handle-volatile-functions-like-they-are-dynamite/

But I suspect the problem is those UDFs. Mixing home-made UDFs and tables can be a recipe for disaster, as it turns out that changing a table's dimensions can trigger all UDFs to fire, as can deleting cells. Check out my posts on this at http://stackoverflow.com/questions/...fs-recalculating-when-unrelated-cells-deleted and my comment at https://fastexcel.wordpress.com/201...unctions-and-function-arguments/#comment-2955

You need to post more here about what your UDFs do - including the code - so that we can give you formula equivalents. Then your problem will probably go away.
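
To give a flavour of what I mean - and this is only a guess at what your financial year and month name UDFs do, assuming a July-June year labelled by the year it ends in - native equivalents could be as simple as =YEAR(A2)+(MONTH(A2)>=7) for the financial year and =TEXT(A2,"mmmm") for the month name, where A2 holds the date. Neither needs VBA, and neither is volatile, so they only recalculate when the date they point at changes.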

If you're interested in what makes for an efficient UDF, then check out Excel MVP Charles Williams's excellent (but advanced) series of blog posts on writing efficient UDFs. You can find them here:
http://fastexcel.wordpress.com/2011/05/25/writing-efficient-vba-udfs-part-1/
http://fastexcel.wordpress.com/2011/06/06/writing-efficient-vba-udfs-part-2/
http://fastexcel.wordpress.com/2011...vba-udfs-part-3-avoiding-the-vbe-refresh-bug/
http://fastexcel.wordpress.com/2011...rences-arrays-calculated-expressions-scalars/
http://fastexcel.wordpress.com/2011...-vba-udfs-part5-udf-array-formulas-go-faster/
http://fastexcel.wordpress.com/2011...a-udfs-part-6-faster-strings-and-byte-arrays/
http://fastexcel.wordpress.com/2011...a-udfs-part-7-udfs-calculated-multiple-times/
http://fastexcel.wordpress.com/2012...usly-calculated-value-from-the-calling-cells/
http://fastexcel.wordpress.com/2012/01/31/writing-efficient-udfs-part-9-an-example/
http://fastexcel.wordpress.com/2012...10-volatile-functions-and-function-arguments/
http://fastexcel.wordpress.com/2012...column-references-in-udfs-used-range-is-slow/
 
Thank you everyone for your assistance.

After just trying to delete 6 lines took an hour, I decided to cut my losses for the moment.

I saved the file as a binary file (I have no idea what this means) and I was able to finalise the report using cut and paste as well as some other bits and pieces.

I will be taking all this home on the weekend and attempting to create a sanitised version of the spreadsheet and review your ideas.

I apologise for not being able to get back to you all straight away but I will be doing so this weekend.

Suzy
 
FYI - when deleting items/rows, you can set calculation to manual and stop Excel from trying to recalculate all the volatile functions.

Your data size indicates that the likely culprits are volatile functions and UDFs, as others have stated.

For example, a 450k-row, 10-column data set with 3 added calculation columns and a summary sheet using SUMIFS, INDEX, MAX and MIN calculations usually takes me a few minutes to open (as it has semi-volatile functions), but only a few seconds to update once opened.
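
If you do the deletions from VBA, here is a minimal sketch of that pattern (only a sketch - put your own deletion code in the middle and restore whatever calculation setting you normally use):

    Sub DeleteRowsWithoutRecalc()
        ' Switch to manual calculation, do the deletions, then restore the
        ' previous setting and recalculate once at the end.
        Dim previousCalc As XlCalculation
        previousCalc = Application.Calculation
        Application.Calculation = xlCalculationManual
        Application.ScreenUpdating = False

        ' ... delete the rows you need here, e.g.:
        ' ActiveSheet.Rows("100:105").Delete

        Application.ScreenUpdating = True
        Application.Calculation = previousCalc
        Application.Calculate    ' one full recalculation at the end
    End Sub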
 
FYI - when deleting items/rows, you can set calculation to manual and stop Excel from trying to recalculate all the volatile functions.

Anything volatile would recalculate the moment you switch back to auto again, so this would only make a difference if you were going to do multiple things while calc is set to manual.
 