It is not every day that a blog boasts of its 1000th post. After blogging for 1999 days, i.e. 5 years, 5 months and 24 days, this is finally my thousandth post.
While this is not a real milestone or anything, I want to use this post to say thanks.
Thanks to this blog, I have found passion for working with data.
Thanks to this blog, I am able to share my passion with all of you.
Thanks to this blog, I learn new things almost every day.
Thanks to this blog, I found such lovely audience.
Thanks to this blog, I got so many new friends and mentors.
Thanks to this blog, I am no longer worried about my finances.
Thanks to this blog, I have improved my writing skills.
Thanks to this blog, I have become a better individual.
Thanks to this blog, I wake up with a smile every day.
Thanks to this blog, I feel connected, compassionate, creative and content.
Thank you PHD.
The next 1000 posts
It may well take another 5 years before I reach the 2000-post milestone. But here is what I have in mind for PHD's future.
- Make more users awesome in Excel and charting. That means more tricks, hacks, formulas and charting tutorials.
- Encourage guest posts – If there is one thing I am sure about, it is that "I know very little". The blog certainly helps me learn new things every day, but there is so much to learn and so little time. Guest posts are a great way to pick up new ideas and techniques. You will see a lot more of them in the future.
- Have regular polls and contests – I am hoping to launch a poll or contest every month. It is no easy task, but I find the community interaction really good on these types of posts.
- Write a few more "series" posts – maybe about financial modeling, small business management, better charting, etc.
- Start a weekly newsletter – Our community has grown tremendously in the last 2 years. Keeping in touch with our members has become a difficult task. A newsletter (sent out once every week or two) can be an easy way to send updates and share ideas.
- Improve the forums – The PHD forums host an amazing number of discussions on a regular basis. My aim is to increase forum activity by encouraging more members to sign up and join the discussion.
- Have more videos and screencasts – It is far easier to learn by watching someone do it. So you will see more video and screencast-based tutorials in the future.
- Launch a few more e-books and products – I find that having my own products is the best and most effective way to monetize this blog. Over time I hope to get rid of the ads on the blog and rely only on revenue from my products.
- Review and recommend quality Excel and charting products – There are several high-quality Excel products – from folks like Peltier, Jorge, Stephen Few, Charley Kyd, etc. – and we could all benefit from understanding how these products work and how they can help us do more in less time.
- Start online Excel training classes – I have been making the videos and material for offering online Excel classes. The course should start sometime next year. It will be exciting to see how this takes off.
It is not a post until you get an Excel tip. As usual, we eat and drink Excel tips here. So here is one to make this a really useful post.

Format a number to be shown in thousands
To format a number in a cell so it is displayed in "thousands":
- Select the cell and hit CTRL+1 (or go to "Format Cells")
- From the "Number" tab, go to "Custom" to specify a custom number formatting code
- Specify the code as #,##0, " thousands"
- Bonus: use #,##0,, " millions" to show numbers in millions
- That is all.
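For the curious, each trailing comma in an Excel custom format code scales the displayed value down by 1,000. Here is a small Python sketch of that behavior (the function name is mine, not anything built into Excel, and it ignores Excel's other formatting options):

```python
def excel_scaled(value, commas=1, suffix=" thousands"):
    # Each trailing comma in an Excel custom format code divides the
    # displayed value by 1,000; the result is rounded, not truncated.
    scaled = round(value / (1000 ** commas))
    return f"{scaled:,}{suffix}"

print(excel_scaled(1234567))                  # 1,235 thousands
print(excel_scaled(1234567, 2, " millions"))  # 1 millions
```

Note that in Excel the underlying cell value stays unchanged; only the display is scaled, so your formulas keep working on the full number.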
Thank you once again. Without you, this milestone means nothing to me. 🙂

8 Responses to “Pivot Tables from large data-sets – 5 examples”
Do you have links to any sites that provide free, large test data sets? Both large in diversity and large in total number of rows.
Good question, Ron. I suggest checking out kaggle.com, data.world, or creating your own with RANDBETWEEN(). You can also get a complex business data set from the Microsoft Power BI website – the Contoso retail data.
Hi Chandoo,
I work with large data sets all the time (80-200MB files with 100Ks of rows and 20-40 columns) and I've taken a few steps to reduce the size (to 20-60MB) so they can be shared more easily and work more quickly. These steps include: creating custom calculations in the pivot instead of having additional data columns, deleting the data tab, and saving as an .xlsb. I've even tried INDEX/MATCH instead of VLOOKUP – although I'm not sure that saved much. Are there any other tricks to further reduce the file size? Thanks, Steve
Hi Steve,
Good tips on how to reduce the file size and/or processing time. Another thing I would definitely try is using the Data Model to load the data rather than keeping it in the file. The steps would be:
1. Connect to the source data file through Power Query
2. Filter away any columns / rows that are not needed
3. Load the data to the Data Model
4. Make pivots from it
This would reduce the file size while providing all the answers you need.
Give it a try. See this video for some help - https://www.youtube.com/watch?v=5u7bpysO3FQ
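The idea behind those steps – connect to the source, keep only the columns and rows you need, then pivot – can be sketched outside Excel as well. Here is a hedged pandas analogy (the sample rows and column names are made up for illustration; in practice you would read from your source file):

```python
import pandas as pd

# Hypothetical sample rows standing in for a large source file
# (step 1 would normally be a read from the source, e.g. read_csv).
rows = [
    {"Region": "East", "Product": "A", "Amount": 100},
    {"Region": "East", "Product": "B", "Amount": 50},
    {"Region": "West", "Product": "A", "Amount": 75},
]
df = pd.DataFrame(rows)

# Step 2-3: filter away rows/columns that are not needed
df = df[df["Amount"] > 0][["Region", "Product", "Amount"]]

# Step 4: build the pivot from the slimmed-down data
pivot = df.pivot_table(index="Region", columns="Product",
                       values="Amount", aggfunc="sum")
print(pivot)
```

The benefit is the same as with the Data Model: the workbook (or script) carries only the summary, not a full copy of the raw data.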
Normally when Excel processes data it utilizes all four cores on a processor. Is it true that Excel drops to only two cores when calculating tables? And similarly, if only two cores were present, would it drop to one when working in a table?
I ask because I have personally noticed that when I use tables, calculations are much slower than if I had simply filtered the data. I like tables for obvious reasons when working with data sets. Is this true?
John:
I don't know whether it is true that Excel table processing only uses 2 threads/cores, but it is entirely possible. A program has to be explicitly written to handle multiple parallel threads. Excel Lists/Tables were added long ago, at a time when 2 cores was a reasonable upper limit. And it could be that there simply is no way to program table processing to use more than 2 threads at a time...
When I've got a large data set, I set Excel's priority to High through Task Manager to let it use more of the available processing power. Never use Realtime priority, or you are completely locked up until Excel finishes.
That is a good tip Jen...