In How Many Links are Too Many Links, O'Reilly Radar shows us this unfortunate bubble chart (click on the image to see a bigger version).
I say unfortunate for lack of a better word that doesn't sound harsh.
Just in case you are wondering what that chart is trying to say (which is perfectly fine): Nick Bilton, who constructed this chart, got curious, went to the top 98 websites in the world, and counted how many links each has on its home page. Then he used charting tools like Processing to create the bathing bubbles you see alongside.
The conclusion?
Too many bubbles can drown you. And also, top websites have lots and lots of links on their home pages.
But seriously, apart from looking really pretty, does this chart actually support that conclusion?
I think Nick and the O'Reilly Radar team could have done much better with a simpler and more fortunate chart selection.
A histogram of the number of links on popular home pages, like the one below, would have been very easy to read while still making the point.

The histogram shows some dummy data, but if you create two histograms, one for popular sites (ranked below 5000) and one for not-so-popular sites (ranked above 5000), you can easily make the point and save the bubbles for a warm bath.
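Here is a minimal matplotlib sketch of that two-histogram idea. All the numbers are dummy data generated for illustration; they are not Nick's actual link counts.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(42)

# Dummy link counts for the two groups (made-up distributions)
popular = rng.normal(250, 60, 500)   # sites ranked below 5000
others = rng.normal(120, 40, 500)    # sites ranked above 5000

plt.hist([popular, others], bins=20,
         label=["Popular sites (rank < 5000)", "Other sites (rank > 5000)"])
plt.xlabel("Number of links on home page")
plt.ylabel("Number of sites")
plt.legend()
plt.title("How many links do home pages have?")
plt.show()
```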
A better alternative is a scatter chart with site rank on one axis and the number of links on the home page on the other; that way, a conclusion like "top sites link more" can be established easily.
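Again a rough sketch with invented data, just to show the shape such a chart would take:

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)

# Dummy data: a lower rank number (more popular) loosely means more links
rank = rng.integers(1, 10_000, 300)
links = 300 - 0.02 * rank + rng.normal(0, 30, 300)

plt.scatter(rank, links, alpha=0.5)
plt.xlabel("Site rank (1 = most popular)")
plt.ylabel("Number of links on home page")
plt.title("Do top sites link more?")
plt.show()
```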

Even a bar chart with the number of links on each home page could have been better than umpteen bubbles.

You could easily add a bar for "avg. number of links on non-popular sites" to contrast the linking behaviour of large sites with that of small sites.
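A sketch of that bar chart with the contrast bar added. The site names and counts are invented:

```python
import matplotlib.pyplot as plt

# Invented link counts for a handful of popular sites
sites = ["Site A", "Site B", "Site C", "Site D", "Site E"]
links = [310, 280, 260, 240, 220]
avg_non_popular = 120  # made-up average for non-popular sites

plt.bar(sites + ["Avg. non-popular"], links + [avg_non_popular])
plt.ylabel("Number of links on home page")
plt.title("Links on popular home pages vs. the rest")
plt.show()
```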
But alas, we are treated to an unfortunate bubble chart that does nothing but look pretty (and ridiculously large).
What do you think? How many bubbles are too many?
Recommended Reading on Bubble Charts: Travel Site Search Patterns in Bubbles, Good Bubble Chart about the Bust, Olympic Medals per Country

8 Responses to “Pivot Tables from large data-sets – 5 examples”
Do you have links to any sites that provide free, large test data sets? Large both in diversity and in total number of rows.
Good question, Ron. I suggest checking out kaggle.com or data.world, or creating your own with RANDBETWEEN(). You can also get a complex business data set from the Microsoft Power BI website; it is the Contoso retail data.
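If you want to generate your own outside Excel, here is a minimal Python sketch that writes a large dummy data file. The file name, column names, and value ranges are all made up; adjust them to whatever shape you need.

```python
import csv
import random

# Write 500,000 rows of made-up sales data to a CSV file.
with open("dummy_sales.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["order_id", "region", "product", "quantity", "amount"])
    regions = ["North", "South", "East", "West"]
    products = ["Widget", "Gadget", "Gizmo"]
    for order_id in range(1, 500_001):
        writer.writerow([
            order_id,
            random.choice(regions),
            random.choice(products),
            random.randint(1, 20),             # quantity
            round(random.uniform(5, 500), 2),  # amount
        ])
```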
Hi Chandoo,
I work with large data sets all the time (80-200MB files with hundreds of thousands of rows and 20-40 columns), and I've taken a few steps to reduce the size (to 20-60MB) so the files can be shared more easily and work more quickly. These steps include: creating custom calculations in the pivot instead of having additional data columns, deleting the data tab, and saving as an .xlsb. I've even tried INDEX/MATCH instead of VLOOKUP, although I'm not sure that saved much. Are there any other tricks to further reduce the file size? Thanks, Steve
Hi Steve,
Good tips on how to reduce file size and/or processing time. Another thing I would definitely try is using the Data Model to load the data rather than keeping it in the file. You would:
1. connect to the source data file through Power Query
2. filter away any columns / rows that are not needed
3. load the data to the Data Model
4. make pivots from it
This would reduce the file size while providing all the answers you need.
Give it a try. See this video for some help - https://www.youtube.com/watch?v=5u7bpysO3FQ
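For anyone doing similar trimming outside Excel, the same filter-then-load idea can be sketched with pandas. This is only an analogue of the Power Query steps above, not the steps themselves, and the file and column names are invented:

```python
import pandas as pd

# Steps 1-2 analogue: read only the columns you need from the source file
# ("big_data.csv" and the column names are placeholders).
df = pd.read_csv("big_data.csv", usecols=["Region", "Product", "Sales"])

# Drop rows you don't need, e.g. keep a single region.
df = df[df["Region"] == "North"]

# Step 4 analogue: summarize the way a pivot table would.
pivot = df.pivot_table(index="Product", values="Sales", aggfunc="sum")
print(pivot)
```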
Normally when Excel processes data, it uses all four cores on a processor. Is it true that Excel drops to using only two cores when calculating tables? And similarly, if there were only two cores present, would it drop to one for a table?
I ask because I have personally noticed that when I use tables, the data is much slower than if I had filtered it. I like tables for obvious reasons when working with data sets. Is this true?
John:
I don't know if it is true that Excel Table processing only uses 2 threads/cores, but it is entirely possible. A program has to be explicitly written to handle multiple parallel threads, and Excel Lists/Tables were added long ago, at a time when 2 processors was a reasonable upper limit. It could also be that there is simply no way to program table processing to use more than 2 threads at a time...
When I've got a large data set, I will set Excel's priority to High through Task Manager to let it use more of the available processing power. Never use Realtime priority, or you will be completely locked up until Excel finishes.
That is a good tip, Jen...