
Subtotal with power query

Kamo

Member
Hi,

I want to produce the result shown in the attached file with Power Query or a pivot table.

Is it possible?

Thank you in advance for your help

Sincerely
 

Attachments

  • Subtotal PQ_.xlsx
    27.6 KB · Views: 14
Yes, but not easily. A simple pivot of the data is likely to give you what you want.
In the attached is a Power Query query at cell A29, based on the table at cell L9.
Bear in mind that I've tried to replicate the Subtotal operation, which means the results act only on a change in Week Nb. Like the Subtotal operation, it ignores all other columns and is very dependent on the order of the source table, so make sure the table is sorted the way you want it first.
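The Subtotal-style behaviour described above (grouping only on *changes* in Week Nb, so source order matters) can be sketched outside M as well. This is only a rough pandas illustration with made-up data; only the column names come from the thread:

```python
import pandas as pd

# Hypothetical sample with the thread's columns (Emp ID, Month, Week Nb, Value).
df = pd.DataFrame({
    "Emp ID":  ["E1", "E1", "E1", "E1"],
    "Month":   ["Jan", "Jan", "Jan", "Jan"],
    "Week Nb": [1, 1, 2, 2],
    "Value":   [10, 20, 30, 50],
})

# Like the Subtotal feature, act only on *changes* in Week Nb, so the
# source order matters: the run id increments whenever Week Nb changes.
run_id = (df["Week Nb"] != df["Week Nb"].shift()).cumsum()

pieces = []
for _, grp in df.groupby(run_id, sort=False):
    pieces.append(grp)
    wk = grp["Week Nb"].iloc[0]
    # Append an average row after each consecutive run, as Subtotal does.
    pieces.append(pd.DataFrame([{"Week Nb": f"{wk} Average",
                                 "Value": grp["Value"].mean()}]))
result = pd.concat(pieces, ignore_index=True)
```

If the same Week Nb reappears later in the table, it gets its own average row, which is exactly why the sort order of the source matters.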
 

Attachments

  • Chandoo46241Subtotal PQ_.xlsx
    37.6 KB · Views: 19
Hi p45cal :),

Thank you very much, p45cal, for your solution, which works well; I would not have succeeded alone.

I knew it would not be easy with Power Query, and I will try to understand each step of the M code.

Indeed, it is easier with a pivot table, but the presentation would not be exactly the same.

Thanks again to you

Have a nice day
 
Please try


Code:
let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    Grouped = Table.Group(Source, {"Emp ID", "Month", "Week Nb"}, {{"A", each Table.InsertRows(_,Table.RowCount(_),
        {[Emp ID=null,Month=null,Week Nb =Text.From(List.Average([Week Nb]))&" Average",Value=List.Average([Value])]})}}),
    Combine = Table.Combine(Grouped[A]),
    GTotal = Table.InsertRows(Combine,Table.RowCount(Combine),{[Emp ID=null,Month=null,Week Nb ="Grand Average",Value=List.Average(Source[Value])]})
in
    GTotal
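For readers more comfortable outside M, the query above can be paraphrased in pandas. This is only a sketch under assumed column names and hypothetical data, not the thread's actual file:

```python
import pandas as pd

# Hypothetical data shaped like Table1 in the thread.
df = pd.DataFrame({
    "Emp ID":  ["E1", "E1", "E2"],
    "Month":   ["Jan", "Jan", "Jan"],
    "Week Nb": [1, 1, 1],
    "Value":   [10, 20, 40],
})

pieces = []
# Grouped step: per (Emp ID, Month, Week Nb), keep the rows and append
# an extra row whose Week Nb reads "<n> Average".
for (emp, month, week), grp in df.groupby(["Emp ID", "Month", "Week Nb"], sort=False):
    pieces.append(grp)
    pieces.append(pd.DataFrame([{"Week Nb": f"{week} Average",
                                 "Value": grp["Value"].mean()}]))
# GTotal step: one "Grand Average" row over the whole source.
pieces.append(pd.DataFrame([{"Week Nb": "Grand Average",
                             "Value": df["Value"].mean()}]))
result = pd.concat(pieces, ignore_index=True)
```

The structure mirrors the M query: group, insert an average row per group, combine, then append a single grand-average row.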
 

Attachments

  • Subtotal PQ_.xlsx
    36.5 KB · Views: 28
I really wouldn't do this in PQ. While it is possible, it really defeats the purpose of using PQ.
Edit: Grammar: don't -> wouldn't.
Though it requires a slightly different column order and structure, it's best to do this sort of aggregation in a pivot table.

1. Add an Index column (this is needed to distinguish each line as a distinct item). This can be done at the source stage or in PQ; I'd recommend using PQ for ease of management.
2. Load to the data model and generate a pivot based on it.
3. Use tabular layout; add [Emp ID], [Month], [Week Nb] & [Index] to the row labels, and add average of Value as the value field. You can adjust the row label field order to suit your needs. Then add a subtotal to the [Week Nb] field.

Ex: Using [Week Nb] as top layer.
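As a rough cross-check of the steps above, the same layout can be approximated in pandas (hypothetical data; only the field names come from the post):

```python
import pandas as pd

# Hypothetical data; the Index column (step 1) makes each source row a
# distinct item in the row labels.
df = pd.DataFrame({
    "Emp ID":  ["E1", "E1", "E1"],
    "Month":   ["Jan", "Jan", "Jan"],
    "Week Nb": [1, 1, 2],
    "Value":   [10, 20, 30],
})
df["Index"] = range(len(df))

# Step 3: tabular layout with Emp ID / Month / Week Nb / Index as row
# labels and average of Value as the value field.
pivot = df.pivot_table(index=["Emp ID", "Month", "Week Nb", "Index"],
                       values="Value", aggfunc="mean")

# The subtotal added to the [Week Nb] field corresponds to averaging
# at the Week Nb level:
week_subtotals = df.groupby(["Emp ID", "Month", "Week Nb"])["Value"].mean()
```

Because every Index value is unique, the pivot's detail rows reproduce the source rows, and the Week Nb subtotal gives the averages the OP is after.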
 
Last edited:
@Excel Wizard, very neat indeed!
A suggested tweak: You've got the same result, but it's not quite the same as the subtotal operation because you've grouped by three fields instead of just Week Nb. So if the Emp ID and the Month were all the same and the Week Nb were as:
[screenshot of a Week Nb column containing runs of 4s, then 5s, then 4s again]
the OP's subtotal operation would get you the same averages and grouping, but your amazingly short M-code would bring the Week Nb's 4s all together.
A change from:
Code:
Grouped = Table.Group(Source, {"Emp ID", "Month", "Week Nb"}, {{"A", each Table.InsertRows(_,Table.RowCount(_),
        {[Emp ID=null,Month=null,Week Nb =Text.From(List.Average([Week Nb]))&" Average",Value=List.Average([Value])]})}}),
to:
Code:
Grouped = Table.Group(Source, {"Week Nb"}, {{"A", each Table.InsertRows(_,Table.RowCount(_),
        {[Emp ID=null,Month=null,Week Nb =Text.From(List.Average([Week Nb]))&" Average",Value=List.Average([Value])]})}},GroupKind.Local),
seems to do it.
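The difference GroupKind.Local makes can be illustrated in any language: default grouping pools equal keys wherever they appear, while local grouping only merges consecutive runs. A small pandas sketch with hypothetical values:

```python
import pandas as pd

# Hypothetical Week Nb column where 4 reappears after 5.
weeks = pd.Series([4, 4, 5, 5, 4])

# Default (global) grouping, like Table.Group without GroupKind.Local:
# all the 4s are pooled together.
global_sizes = weeks.groupby(weeks).size().to_dict()

# Local grouping, like GroupKind.Local: each consecutive run stays
# separate, matching the Subtotal operation's behaviour.
run_id = (weeks != weeks.shift()).cumsum()
local_sizes = weeks.groupby(run_id).size().tolist()
```

Here the global grouping sees two groups (all 4s, all 5s), while the local grouping sees three runs, which is what the Subtotal operation would produce.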
 
Last edited:
Hi Excel Wizard, Chihiro, p45cal,

@Excel Wizard , I also thank you for your help and this solution. Very interesting to have another approach, done with almost only one or two lines of code.

@p45cal , thanks for your advice and the reminder of the GroupKind.Local argument in the code.


@Chihiro, thanks to you also for the solution with Pivot Table.
You are right, this kind of table is better suited to a pivot table.
I followed your steps and managed to get the same result as yours.

Finally, the Index column doesn't bother me in the power pivot and it is a very good solution.
I think that with several tens of thousands of rows in the source, Power Query would be less efficient and much slower than a pivot table.
But I know there are ways to make the M code faster, for example with Table.Buffer, though I think it will always be slower than a pivot table.

I learned several approaches; thanks for your help.

Have a nice evening
 
Not that I would ever do this, but I just wanted to offer and share an all-UI solution.
  1. Load raw data
  2. Reference this query and create the subtotal averages (you could re-use part of the solution of @p45cal and avoid column duplication)
  3. Reference raw data again and create grand total average (you could re-use part of the solution of @p45cal and avoid column duplication)
  4. Append all queries, and sort.
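The four UI steps above translate naturally to an append pattern. A hedged pandas sketch of the same flow (made-up data, assumed column names):

```python
import pandas as pd

# Step 1: hypothetical raw data.
raw = pd.DataFrame({
    "Week Nb": [1, 1, 2],
    "Value":   [10.0, 20.0, 40.0],
})

# Step 2: a referenced query computing the subtotal averages per week.
week_avg = raw.groupby("Week Nb", as_index=False)["Value"].mean()
week_avg["Week Nb"] = week_avg["Week Nb"].astype(str) + " Average"

# Step 3: reference the raw data again for the grand total average.
grand = pd.DataFrame([{"Week Nb": "Grand Average",
                       "Value": raw["Value"].mean()}])

# Step 4: append all queries (sorting is left to taste).
combined = pd.concat([raw, week_avg, grand], ignore_index=True)
```

Each intermediate frame plays the role of one referenced query; the final concat is the append step.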
Always nice to learn from others over here. kudos to @p45cal (I can read that) and @Excel Wizard (I need to study that).
 

Attachments

  • Copy of Subtotal PQ_.xlsx
    38.4 KB · Views: 10
Hi GraH-Guido :)
Thank you, GraH, for your solution; I now have plenty of options.

It takes a bit longer going through 4 queries, but the advantage is that we don't need to manipulate M.

Otherwise, I tested all the solutions with a source of 36,000 rows; the first refresh takes a little while (10 sec) but after that it is fast (2 or 3 sec on average).

Thanks again to all of you

Have a nice day
 
In theory, referencing a query is like doing a single query; it is almost like referring to one of the previous steps.
I could write the code so it is one query, though I seriously doubt it would change the performance much. It is an area I need to dig into much more.
 
Hi p45cal, Chihiro, Excel Wizard, GraH - Guido :),

Sorry for my late reply,
I realised that my first file was not representative and I apologise for misdirecting you.

Here is a new file with 38,000 rows in which I have indicated the desired result.
I have included two layouts; I would be curious to see a solution for each of the two tables, if possible.

Thank you in advance for your help.
 

Attachments

  • Average.xlsx
    748 KB · Views: 8
Try


Code:
let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    GroupedWeek = Table.Group(Source, {"Emp ID", "Month","Week Nb"}, {"Value", each List.Sum([Value])}),
    GroupedMonth = Table.Group(GroupedWeek, {"Emp ID", "Month"}, {{"A", each Table.InsertRows(_,Table.RowCount(_),
        {[Emp ID=[Emp ID]{0},Month=Text.From([Month]{0})&" Total",Week Nb =null,Value=List.Sum([Value])],
        [Emp ID=[Emp ID]{0},Month=Text.From([Month]{0})&" Average",Week Nb =null,Value=List.Average([Value])]})}}),
    Combine = Table.Combine(GroupedMonth[A])
in
    Combine


Code:
let
    Source = Excel.CurrentWorkbook(){[Name="Table1"]}[Content],
    GroupedWeek = Table.Group(Source, {"Emp ID", "Month","Week Nb"}, {"Value", each List.Sum([Value])}),
    GroupedMonth = Table.Group(GroupedWeek, {"Emp ID", "Month"}, {{"A", each Table.InsertRows(Table.AddColumn(_,"Average",each null),Table.RowCount(_),
        {[Emp ID=[Emp ID]{0},Month=Text.From([Month]{0})&" Total",Week Nb =null,Value=List.Sum([Value]),Average=List.Average([Value])]})}}),
    Combine = Table.Combine(GroupedMonth[A]),
    Renamed = Table.RenameColumns(Combine,{{"Value", "Sum"}})
in
    Renamed
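To make the structure of the first query above easier to follow, here is a loose pandas paraphrase: sum to the week level first, then append a "<Month> Total" and "<Month> Average" row per month (hypothetical data; column names from the thread):

```python
import pandas as pd

# Hypothetical data shaped like the Average.xlsx table.
df = pd.DataFrame({
    "Emp ID":  ["E1"] * 4,
    "Month":   ["Jan"] * 4,
    "Week Nb": [1, 1, 2, 2],
    "Value":   [10.0, 20.0, 30.0, 40.0],
})

# GroupedWeek step: one summed row per Emp ID / Month / Week Nb.
weekly = df.groupby(["Emp ID", "Month", "Week Nb"], as_index=False)["Value"].sum()

# GroupedMonth step: after each month's weekly rows, append a Total row
# (sum of the weekly sums) and an Average row (their mean).
pieces = []
for (emp, month), grp in weekly.groupby(["Emp ID", "Month"], sort=False):
    pieces.append(grp)
    pieces.append(pd.DataFrame([
        {"Emp ID": emp, "Month": f"{month} Total",   "Value": grp["Value"].sum()},
        {"Emp ID": emp, "Month": f"{month} Average", "Value": grp["Value"].mean()},
    ]))
result = pd.concat(pieces, ignore_index=True)
```

The second query in the post works the same way but puts the average in its own column instead of its own row.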
 

Attachments

  • Average.xlsx
    757.3 KB · Views: 19
Hi Excel Wizard :),
Very impressive; congratulations on your mastery and this efficient solution.
I thought I was going to need a Power Pivot solution, but it's also perfect with Power Query.
Thanks a lot

Good evening everyone
 
I'd be curious to know a bit more about timings; could you attach the 36,000-row data?
I didn't test with a timer, but I couldn't discern an update time difference between the solutions in msgs #2 & #4 when working on the larger data source - quite unexpected.
 
Hi p45cal,

I just tested my 38,000-row file from #13 with the codes in #2 and #4, and this time it is very fast.
I didn't understand why the refresh was slow the first time the other day.

Sorry for my erroneous message #9

Have a nice evening
 