This post builds on an earlier discussion, How many hours did Johnny work? I recommend you read that post too.
Let's say you have two dates (with time) in cells A1 and A2, indicating the starting and ending timestamps of an activity, and you want to calculate how many working hours the task took. Further, let's assume:
- Start date is in A1 and End date is in A2
- Work day starts at 9 AM and ends at 6 PM
- Weekends are holidays
Now, if you were to calculate the total number of working hours between two given dates, the first step would be to understand the problem through, let's say, a diagram like this:
[Illustration: hours worked on the start day + full working days in between + hours worked on the end day]
We would write a formula like this:
=(18/24-MOD(A1,1)+MOD(A2,1)-9/24)*24 + (NETWORKDAYS(A1,A2)-2)*9
See the above illustration to understand this formula.
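To see how the pieces fit together, here is a quick worked example with made-up timestamps of my own (not from the original post): suppose A1 = 11-Jan-2010 11:00 AM (a Monday) and A2 = 14-Jan-2010 3:00 PM (a Thursday). Then NETWORKDAYS(A1,A2) is 4 and the formula works out as
=(18/24 - 11/24 + 15/24 - 9/24)*24 + (4-2)*9
= (7 + 6) + 18
= 31 working hours
which matches a manual count: 7 hours on Monday (11 AM to 6 PM), 9 hours each on Tuesday and Wednesday, and 6 hours on Thursday (9 AM to 3 PM).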
Now, while this formula is not terribly long or inefficient, it does feel complicated.
Maybe we can solve the problem in a different way?
Michael left an interesting answer to my initial question, how many hours did Johnny work?
Pedro took the formula further with his comment.
The approach behind their formulas is simple and truly out of the box.
Instead of calculating how many hours were worked, we calculate how many hours were not worked and subtract that from the total working hours. Simple!
See this illustration:
[Illustration: total working hours between the two dates minus hours not worked on the start and end days]
So the formula becomes:
Total working hours between the two dates – (hours not worked on the starting day + hours not worked on the ending day)
=NETWORKDAYS(A1,A2)*9 - (MOD(A1,1)-9/24 + 18/24 -MOD(A2,1))*24
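With the same made-up timestamps as in the earlier example (start Monday 11-Jan-2010 11:00 AM, end Thursday 14-Jan-2010 3:00 PM), the logic reads: 4 working days give 36 potential hours, from which we knock off the 2 hours not worked on Monday (9 AM to 11 AM) and the 3 hours not worked on Thursday (3 PM to 6 PM):
=4*9 - (11/24 - 9/24 + 18/24 - 15/24)*24
= 36 - 5
= 31 working hours, the same answer as before.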
After simplification, the formula becomes:
=NETWORKDAYS(A1,A2)*9 - (MOD(A1,1) -MOD(A2,1))*24 -9
=(NETWORKDAYS(A1,A2)-1)*9 +(MOD(A2,1)-MOD(A1,1))*24
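Again with the same assumed example values, the simplified version gives an identical result:
=(4-1)*9 + (15/24 - 11/24)*24
= 27 + 4
= 31 working hours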
Sixseven also posted an equally elegant formula that uses the TIME function instead of MOD() (with the start and end times in B3 and C3):
=(NETWORKDAYS(B3,C3)*9) - ((TIME(HOUR(B3),MINUTE(B3),SECOND(B3))-TIME(9,0,0))*24) - ((TIME(18,0,0)-TIME(HOUR(C3),MINUTE(C3),SECOND(C3)))*24)
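The two approaches are interchangeable here: TIME(HOUR(B3),MINUTE(B3),SECOND(B3)) rebuilds the time-of-day portion of B3 from its components, which (ignoring any sub-second fraction) is exactly what MOD(B3,1) returns. For instance, if B3 held 11-Jan-2010 11:00 AM (my example value again), both of these return 0.4583, i.e. 11:00 AM expressed as a fraction of a day:
=TIME(HOUR(B3),MINUTE(B3),SECOND(B3))
=MOD(B3,1)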
Download the solution Workbook and play with it
Click here to download the solution workbook and use it to understand the formulas better.
Thanks to Pedro, Michael, Sixseven & all of you
If someone asks me what the most valuable part of this site is, I would proudly say, “the comments”. Every day, we get dozens of insightful comments from around the world, teaching us various important techniques, tricks and ideas.
Case in point: the comments by Michael, Pedro and Sixseven on the “how many hours…” post taught me how to think outside the box and solve a tricky problem like this with an elegant, simple formula. Thank you very much Michael, Pedro, Sixseven and each and every one of you who comment. 🙂
Have a great weekend everyone.
PS: This weekend is my mom’s birthday, plus it is a minor festival in India. So I am going to eat sumptuously, party vigorously and relax carelessly. Next week is going to be big with the launch of Excel School 3.
PPS: While you are at it, you may want to sign up for Excel School already. The free lesson offer will vanish on Wednesday.

8 Responses to “Pivot Tables from large data-sets – 5 examples”
Do you have links to any sites that provide free, large test data sets? Both large in diversity and large in total number of rows.
Good question, Ron. I suggest checking out kaggle.com or data.world, or creating your own with RANDBETWEEN(). You can also get a complex business data-set from the Microsoft Power BI website. It is the Contoso retail data.
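If you go the RANDBETWEEN() route, here is a minimal sketch; the column layout and value ranges below are just my assumptions, so adjust them to taste. Put headers such as Date, Region and Amount in row 1, then fill each data row with formulas like:
=DATE(2015,1,1)+RANDBETWEEN(0,1095)  (a random date in a three-year window)
=CHOOSE(RANDBETWEEN(1,4),"North","South","East","West")  (a random region)
=RANDBETWEEN(100,10000)  (a random amount)
Copy the formulas down a few hundred thousand rows and then paste as values so the data stops recalculating every time you touch the sheet.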
Hi Chandoo,
I work with large data sets all the time (80-200 MB files with 100Ks of rows and 20-40 columns) and I've taken a few steps to reduce the size (to 20-60 MB) so they can be better shared and work more quickly. These steps include: creating custom calculations in the pivot instead of having additional data columns, deleting the data tab, and saving as an .xlsb. I've even tried INDEX/MATCH instead of VLOOKUP, although I'm not sure that saved much. Are there any other tricks to further reduce the file size? Thanks, Steve
Hi Steve,
Good tips on how to reduce the file size and / or processing time. Another thing I would definitely try is to use the Data Model to load the data rather than keep it in the file. You would:
1. connect to the source data file through Power Query
2. filter away any columns / rows that are not needed
3. load the data to the Data Model
4. make pivots from it
This would reduce the file size while providing all the answers you need.
Give it a try. See this video for some help - https://www.youtube.com/watch?v=5u7bpysO3FQ
Normally when Excel processes data it utilizes all four cores on a processor. Is it true that Excel drops to only using two cores when calculating tables? And similarly, if there were only two cores present, would it drop to one for a table?
I ask because I have personally noticed that when I use tables, calculation is much slower than if I had simply filtered the data. I like tables for obvious reasons when working with data sets. Is this true?
John:
I don't know if it is true that Excel Table processing only uses 2 threads/cores, but it is entirely possible. The program has to be enabled to handle multiple parallel threads, and Excel Lists/Tables were added long ago, at a time when 2 processors was a reasonable upper limit. It could be that there simply is no way to program table processing to use more than 2 threads at a time...
When I've got a large data set, I will set my Excel priority to High through Task Manager to allow it to use more of the available processing power. Never use Realtime priority, or you're completely locked up until Excel finishes.
That is a good tip, Jen...