When we have lots of data, we try to summarize it by calculating the average. As they say, averages are mean; they do not give away much.
I want to share an interesting example from Amazon.com of how they provide more detail by combining an average with a distribution.
As you might know, Amazon shows a rating for each of the products they sell. Customers rate the products from 1 to 5 stars. When you visit a product page, you will see the average rating, with a small down-arrow next to it. When you click on it, Amazon shows you the break-up of that rating, so you have a better idea of how the ratings are split.
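That headline number, by the way, is just a weighted average of the star counts. As a quick illustration (the ranges and review counts below are made up, not Amazon's data), with star values 1–5 in A2:A6 and the number of reviews for each star in B2:B6:

    Average rating:   =SUMPRODUCT(A2:A6, B2:B6) / SUM(B2:B6)

With hypothetical counts of 10, 5, 8, 20 and 57 reviews, this returns about 4.1 stars, and column B itself is the break-up that Amazon reveals behind the down-arrow.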

Why show distributions?
Showing the distribution of values behind an average reveals important information about the data. We tend to use averages alone because they take very little time to compute and very little space to show. But adding the ability to show the distribution of values (on demand) is a powerful way to help end users understand the data better. [related: calculating frequency distributions in excel]
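As a quick sketch of what such an on-demand break-up needs (cell references here are illustrative): if the raw ratings sit in A2:A101, the headline metric and its distribution are only a handful of formulas:

    Average rating:  =AVERAGE(A2:A101)
    5-star count:    =COUNTIF(A2:A101, 5)
    4-star count:    =COUNTIF(A2:A101, 4)
    (repeat for 3, 2 and 1 stars, or use =FREQUENCY(A2:A101, {1;2;3;4}), which returns the five counts in one go)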
Below, I have shown a demo of how you can do this using Excel. Tomorrow, I will write a tutorial explaining the same.
On-demand Details in Excel Charts – a Demo:

What is your Opinion – Averages / Distributions / Both or Something else?
I prefer to use simple metrics like average, median, sum, count, min, or max at the high level in my dashboards and charts. But I always add extra detail by showing the distribution of values or a break-up by another parameter, so that my audience can understand the outputs better.
What about you? What metrics do you use at the high level, and how do you add detail? Please share your thoughts and techniques in the comments.
More Charting Principles:
There are tons of very good charting examples around us. Once in a while, I write about them on chandoo.org. I recommend reading the articles below if you make charts.
One Response to “SQL vs. Power Query – The Ultimate Comparison”
Enjoyed your SQL / Power Query podcast (A LOT). I've used SQL a little longer than Chandoo. Power Query not so much.
Today I still use SQL & VBA for my "go to" applications. While I don't pull billions of rows, I do pull millions. I agree with Chandoo about Power Query's (PQ) lack of performance. I've tried to benchmark PQ against SQL, and I find that a well-written SQL query will run much faster. As mentioned in the podcast, my conclusion is similar: SQL does the filtering on the server, while PQ pulls the data into the local computer and then filters it. I've heard about PQ query folding, but I still prefer SQL.
My typical Excel application will use SQL to pull data from an enterprise DB. I load the data into Structured Tables and/or Excel Power Pivot (especially if there's a lot of data).
I like to have a Control Worksheet to enter parameters, display error messages, and hold user buttons that execute VBA. I use VBA to build/edit parameters used in the SQL. Sometimes I use parameter-based SQL. Sometimes I create a custom SQL string in a hidden worksheet that I then pull into VBA code (these may build a string of comma-separated values that's used with a SQL IN clause). Another SQL trick I like is tagging my data with a YY-MM, YY-QTR, or YY-Week field constructed from a Transaction Date.
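The worksheet side of two of those tricks can be sketched with plain formulas (the ranges and formats below are my assumptions; the commenter builds the tags in SQL itself):

    CSV string for a SQL IN (...) list:  ="'" & TEXTJOIN("','", TRUE, A2:A50) & "'"
    YY-MM tag from a date in A2:         =TEXT(A2, "yy-mm")
    YY-QTR tag from the same date:       =YEAR(A2) & "-Q" & ROUNDUP(MONTH(A2)/3, 0)

The first formula turns a column of values into 'v1','v2','v3', ready to be spliced into WHERE x IN (...) by the VBA that assembles the SQL string.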
In an application, I like to create one or more dashboards that may contain hyperlinks that let the end user drill into the data. Sometimes the hyperlink points to a worksheet and sometimes to a supporting workbook. In some cases, I use a double-click VBA macro that pulls additional data and directs the user to a supplemental worksheet or pivot table.
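The hyperlink flavor of that drill-down needs no VBA at all; a HYPERLINK formula can jump to another worksheet or a supporting workbook (the sheet and file names here are hypothetical):

    Jump to a detail sheet:      =HYPERLINK("#Details!A1", "View detail")
    Open a supporting workbook:  =HYPERLINK("[C:\Reports\Detail.xlsx]Summary!A1", "Open detail file")

The double-click version does need a Worksheet_BeforeDoubleClick event handler, which is where the VBA comes in.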
In recent years I've liked dynamic array formulas & LAMBDA functions. I find this preferable to pivot tables and slicers. I like to use a LAMBDA in conjunction with a cube formula to pull data from a Power Pivot data model, e.g. a LAMBDA using a cube formula to aggregate accounting data by general ledger account and financial period. Rather than presenting the info in a pivot table, you can use this combination to easily build financial reports in a format that's familiar to accounting professionals.
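A minimal sketch of that LAMBDA-plus-cube-formula combination, assuming a Power Pivot model with a [Total Amount] measure and GL Account / Fiscal Period attributes (every name below is an assumption; define the LAMBDA under Formulas > Name Manager, say as GLAMT):

    =LAMBDA(acct, period,
        CUBEVALUE("ThisWorkbookDataModel",
            "[Measures].[Total Amount]",
            "[GL].[Account].[" & acct & "]",
            "[GL].[Fiscal Period].[" & period & "]"))

A report cell then just reads =GLAMT("4000", "2024-03"), so the layout can match whatever format the accountants already use.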
One thing that PQ does very well is consolidating data from separate files. In the old days this was always a pain.
I've found that using SQL can be very trying (even for someone with experience). It's largely an iterative process: start simple, then use XLOOKUP (in the old days, INDEX/MATCH). Once you get the relationships correct, you can use SQL joins to construct a well-behaved SQL statement.
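A typical first pass in that iteration, with made-up table and column names, is a lookup that confirms the keys actually match before you commit to a join:

    =XLOOKUP([@CustomerID], Customers[ID], Customers[Name], "no match")

Rows that return "no match" expose bad keys early, before they silently drop out of an INNER JOIN.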
Most professional enterprise systems offer a schema that's very valuable for constructing SQL statements. For any given enterprise system, there's often a community of users who will share SQL, e.g. MS Great Plains was a great source (but I haven't used it in years).
Hope this long reply has value - keep up the good work.