If you have ever had to simulate random outcomes in an Excel sheet, you might already have heard of the spreadsheet function rand(). This little function generates a random fraction between 0 and 1 every time it recalculates. So I usually write =int(rand()*12)+1 if I need a random whole number between 1 and 12. Of course, if you have the Analysis ToolPak installed like I do, you can use randbetween(1,12) to do the same.
Thus, to simulate a single die throw, you can use int(rand()*6)+1 (not round(rand()*5,0); see the update below for why round() is the wrong choice here).
So, what would you do if you need to simulate the face total when you throw 2 dice? int(rand()*11)+2?
Wrong
Why? Because a uniform random number between 2 and 12 (1 is not possible, as the minimum you can get when you throw two dice is 2) does not simulate two dice throws properly.
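You can see the gap by enumerating all 36 equally likely (die 1, die 2) pairs: a total of 7 turns up in 6 of them, while 2 and 12 turn up in only 1 each. A quick sketch, with Python standing in for the spreadsheet:

```python
from collections import Counter

# Enumerate all 36 equally likely outcomes of two fair dice
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

for total in range(2, 13):
    print(f"{total:2d}: {counts[total]}/36")

# A total of 7 occurs in 6/36 ways; 2 and 12 occur in 1/36 each.
# A single int(rand()*11)+2 would wrongly give every total 1/11.
```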
The CORRECT way to do this is to generate 2 individual random numbers, one per die, and add them up, like:
int(rand()*6) + 1 + int(rand()*6) + 1
Here is why this is the correct way to simulate dice throws using random number generator functions:
I ran each of these 2 formulas 2,500 times and plotted the distributions:
As you can see, the left plot of int(rand()*11)+2
shows that each of the 11 possibilities (2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12) is equally likely. But that is not what happens when you throw two dice: you see an awful lot more 5s, 6s, 7s and 8s than you see a perfect 12 or 2. And there is a reason for that: the distribution of two-dice totals peaks in the middle (a triangular, roughly bell-shaped curve), and when you use int(rand()*6) + int(rand()*6) + 2
the distribution takes on exactly that shape.
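The 2,500-trial experiment above is easy to replicate outside the spreadsheet. A minimal Python sketch, with random.random() standing in for rand() (the seed value is arbitrary, chosen only so the run repeats):

```python
import random
from collections import Counter

random.seed(42)  # fixed seed so the run is repeatable
TRIALS = 2500

# Formula 1: int(rand()*11)+2 -- one uniform pick from 2..12
flat = Counter(int(random.random() * 11) + 2 for _ in range(TRIALS))

# Formula 2: int(rand()*6) + int(rand()*6) + 2 -- sum of two dice
dice = Counter(int(random.random() * 6) + int(random.random() * 6) + 2
               for _ in range(TRIALS))

# Under the flat formula every total hovers around 2500/11 (~227);
# under the two-dice formula, 7 dominates and 2 and 12 are rare.
print("flat:", [flat[t] for t in range(2, 13)])
print("dice:", [dice[t] for t in range(2, 13)])
```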
Update: Thanks to Jon for pointing out that round() is not the right choice if you want random integers; you should use int() instead. See his explanation in the comments and the illustration here.
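Jon's point is worth spelling out: round(rand()*6,0) hands out the endpoints 0 and 6 only about half as often as the middle values, because only half a unit of the 0–6 range ([0, 0.5) and [5.5, 6]) rounds to each endpoint, while a full unit maps to each of 1 through 5. int(rand()*6) slices the range into six equal pieces. A small sketch of the bias (again Python standing in for the spreadsheet):

```python
import random
from collections import Counter

random.seed(1)
N = 60_000

# round(rand()*6): the endpoints 0 and 6 each claim only half a unit
# of the 0..6 range, so they show up roughly half as often as 1..5
rounded = Counter(round(random.random() * 6) for _ in range(N))

# int(rand()*6): six equal slices, each value appearing ~N/6 times
truncated = Counter(int(random.random() * 6) for _ in range(N))

print("round:", [rounded[v] for v in range(7)])    # ends underrepresented
print("int  :", [truncated[v] for v in range(6)])  # roughly uniform
```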
I have used this logic to simulate the Monopoly board game and show that it is not really that random.
More on games: Bingo / Housie ticket generator excel sheet
8 Responses to “Pivot Tables from large data-sets – 5 examples”
Do you have links to any sites that can provide free, large, test data sets. Both large in diversity and large in total number of rows.
Good question, Ron. I suggest checking out kaggle.com, data.world, or creating your own with randbetween(). You can also get a complex business data-set from the Microsoft Power BI website: the Contoso retail data.
Hi Chandoo,
I work with large data sets all the time (80-200MB files with 100Ks of rows and 20-40 columns) and I've taken a few steps to reduce the size (to 20-60MB) so they can be shared more easily and work more quickly. These steps include: creating custom calculations in the pivot instead of having additional data columns, deleting the data tab, and saving as an xlsb. I've even tried INDEX/MATCH instead of VLOOKUP--although I'm not sure that saved much. Are there any other tricks to further reduce the file size? thanks, Steve
Hi Steve,
Good tips on how to reduce the file size and / or processing time. Another thing I would definitely try is to use the Data Model to load the data rather than keep it in the file. You would:
1. connect to source data file thru Power Query
2. filter away any columns / rows that are not needed
3. load the data to model
4. make pivots from it
This would reduce the file size while providing all the answers you need.
Give it a try. See this video for some help - https://www.youtube.com/watch?v=5u7bpysO3FQ
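The four steps above can be sketched outside Excel too. Power Query itself uses the M language, but the load-filter-pivot shape of the workflow is the same; here is a plain-Python sketch of the idea, with made-up column names and inline rows standing in for the source file:

```python
from collections import defaultdict

# Step 1 (stand-in): rows as loaded from the source data file.
# Column names here are hypothetical, for illustration only.
rows = [
    {"Region": "East", "Product": "A", "Sales": 100, "Notes": "x"},
    {"Region": "East", "Product": "B", "Sales": 150, "Notes": "y"},
    {"Region": "West", "Product": "A", "Sales": 200, "Notes": "z"},
    {"Region": "West", "Product": "B", "Sales": 250, "Notes": "w"},
]

# Step 2: filter away columns that the pivot does not need
keep = ("Region", "Product", "Sales")
slim = [{k: r[k] for k in keep} for r in rows]

# Steps 3-4: load and aggregate into a pivot-style Region x Product table
pivot = defaultdict(int)
for r in slim:
    pivot[(r["Region"], r["Product"])] += r["Sales"]

print(pivot[("East", "A")], pivot[("West", "B")])
```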
Normally when Excel processes data it utilizes all four cores on a processor. Is it true that Excel drops to using only two cores when calculating tables? And similarly, if only two cores were present, would it drop to one for a table?
I ask because I have personally noticed that when I use tables, recalculation is much slower than if I had simply filtered the data. I like tables for obvious reasons when working with datasets. Is this true?
John:
I don't know if it is true that Excel Table processing only uses 2 threads/cores, but it is entirely possible. The program has to be explicitly written to handle multiple parallel threads. Excel Lists/Tables were added long ago, at a time when 2 cores was a reasonable upper limit. And it could be that there simply is no way to program table processing to use more than 2 threads at a time...
When I've got a large data set, I will set my Excel priority to High thru Task Manager to allow it to use more available processing. Never use RealTime priority or you're completely locked up until Excel finishes.
That is a good tip Jen...