Recently, I saw this chart on The Economist's website.
It tries to depict how various cities rank on a livability index and how they compare to the previous ranking (2014 vs 2009).

As you can see, this chart is not the best way to visualize "Best places to live".
A few reasons why:
- The segregated views (blue, gray & red) make it hard to find a specific city or region
- The zig-zag lines look good, but they are incredibly hard to interpret
- Labels are scattered all over the place, making the data hard to read
- Some points have no labels (or ambiguous labels), leading to further confusion
After examining the chart long and hard, I got thinking.
It's no fun criticizing someone's work. Creating a better chart from this data, now that's awesome.
So I went looking for the raw data behind this graphical mess. Turns out, The Economist sells this data for a meager US $5,625.
Alas, I was saving my left kidney for something more prominent than a bunch of raw data in a workbook. Maybe if they had sparklines in the file…
So armed with the certainty that my kidney will stay with me, I now turned my attention to a similar data set.
I downloaded my website's visitor city data for the top 100 cities in September 2014 & September 2013 from Google Analytics.
And I could get it for exactly $0.00. Much better.
This data is structurally similar to The Economist's data.

Chart visualizing top 100 cities
Here is a chart I prepared from this data.

This chart (well, a glorified table) not only lets you see all the data at once, but also lets you focus on specific groups of cities (top % changes, new cities in the top 100, cities that dropped out, etc.) with ease.
Download top 100 cities visualization – Excel workbook
Click here to download this workbook. Examine the formulas & formatting settings to understand how this is made.
How is this visualization made?
Here is a video explaining how the workbook is constructed. [see it on our YouTube channel]
The key techniques used in this workbook are,
- SUMIFS and INDEX + MATCH formulas for looking up the data
- Sorting data by a particular column
- Conditional formatting to show % change arrows
- Form controls for user interactivity
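To make the formula logic concrete, here is a minimal sketch in Python that mimics what the INDEX + MATCH lookups and the conditional-formatting arrows do in the workbook. The city names and visit counts are made-up examples, not the actual Google Analytics data.

```python
# Hypothetical data: visits per city in each year (not the real figures).
data_2013 = {"Mumbai": 5200, "London": 4800, "New York": 4500}
data_2014 = {"Mumbai": 6100, "New York": 5300, "Chennai": 2100}

def index_match(table, key):
    """INDEX + MATCH equivalent: look up a city's visits.
    Returns 0 if the city is absent (like wrapping the formula in IFERROR)."""
    return table.get(key, 0)

# Mirrors the conditional-formatting rule that shows % change arrows.
for city in data_2014:
    old, new = index_match(data_2013, city), data_2014[city]
    if old == 0:
        label = "new"  # city entered the top 100 this year
    else:
        pct = (new - old) / old
        label = "up" if pct > 0 else ("down" if pct < 0 else "flat")
    print(city, label)
```

In the workbook itself these steps are Excel formulas and a conditional-formatting icon set; the sketch just shows the lookup-then-compare logic.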
Since the process of creating this visualization is similar to some earlier examples, I recommend you go through the posts below if you have difficulty understanding this workbook:
- Suicides vs. Murders – interactive Excel chart
- Gender Gap chart in Excel
- Visualizing world education rankings
- Analyzing survey results with panel charts
How would you visualize similar data?
Here is a fun thought experiment. How would you visualize such data? Please share your thoughts (or example workbooks) in the comments. I and the rest of our readers are eager to learn from you.

One Response to “SQL vs. Power Query – The Ultimate Comparison”
Enjoyed your SQL / Power Query podcast (A LOT). I've used SQL a little longer than Chandoo has. Power Query, not so much.
Today I still use SQL & VBA for my "go to" applications. While I don't pull billions of rows, I do pull millions. I agree with Chandoo about Power Query's (PQ) lack of performance. I've tried to benchmark PQ against SQL and I find that a well-written SQL query runs much faster. As mentioned in the podcast, my conclusion is similar: SQL does the filtering on the server, while PQ pulls the data into the local computer and then filters it. I've heard about PQ query folding, but I still prefer SQL.
My typical Excel application will use SQL to pull data from an enterprise DB. I load the data into structured tables and/or Excel Power Pivot (especially if there's a lot of data).
I like to have a Control Worksheet to enter parameters, display error messages and hold user buttons that execute VBA. I use VBA to build/edit parameters used in the SQL. Sometimes I use parameter-based SQL. Sometimes I create a custom SQL string in a hidden worksheet that I then pull into VBA code (these may build a string of comma-separated values that's used with a SQL IN clause). Another SQL trick I like is tagging my data with a YY-MM, YY-QTR, or YY-Week field constructed from a Transaction Date.
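The two tricks described above can be sketched briefly. This is in Python rather than VBA, and the table and column names are hypothetical, but the string-building logic is the same.

```python
from datetime import date

def in_clause(values):
    # Build the comma-separated list for "... WHERE City IN (...)",
    # escaping single quotes so the string stays valid SQL.
    return ", ".join("'" + v.replace("'", "''") + "'" for v in values)

def period_tags(d):
    # Tag a transaction date with YY-MM, YY-QTR and YY-Week fields.
    yy = d.strftime("%y")
    qtr = (d.month - 1) // 3 + 1
    week = d.isocalendar()[1]
    return {
        "YY-MM": f"{yy}-{d.month:02d}",
        "YY-QTR": f"{yy}-Q{qtr}",
        "YY-Week": f"{yy}-W{week:02d}",
    }

# Hypothetical table/column names, for illustration only.
sql = f"SELECT * FROM Sales WHERE City IN ({in_clause(['Wellington', 'Mumbai'])})"
print(sql)
print(period_tags(date(2014, 9, 15)))
```

In VBA the equivalent would be string concatenation into a SQL variable; the point is the same: build the IN list and period tags once, then reuse them across queries.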
In an application, I like to create dashboard(s) that may contain hyperlinks that allow the end user to drill into the data. Sometimes the hyperlink will point to a worksheet and sometimes to a supporting workbook. In some cases, I use a double-click VBA macro that pulls additional data and directs the user to a supplemental worksheet or pivot table.
In recent years I've come to like Dynamic Formulas & Lambda Functions. I find this preferable to pivot tables and slicers. I like to use a Lambda in conjunction with a cube formula to pull data from a Power Pivot data model, e.g. a Lambda using a cube formula to aggregate accounting data by general ledger account and financial period. Rather than present info in a Power Pivot table, you can use this combination to easily build financial reports in a format that's familiar to accounting professionals.
One thing that PQ does very well is consolidating data from separate files. In the old days this was always a pain.
I've found that using SQL can be very trying (even for someone with experience). It's largely an iterative process: start simple, then use XLOOKUP (in the old days, MATCH/INDEX). Once you get the relationships correct, you can use SQL joins to construct a well-behaved SQL statement.
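The iterative approach described above can be illustrated with a minimal sketch: once a key relationship is verified (the XLOOKUP step), it becomes a SQL join. This uses an in-memory SQLite database with made-up tables and columns, not the commenter's actual enterprise setup.

```python
import sqlite3

# Made-up tables standing in for an enterprise schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Orders (OrderID INTEGER, CustomerID INTEGER, Amount REAL);
    CREATE TABLE Customers (CustomerID INTEGER, Name TEXT);
    INSERT INTO Orders VALUES (1, 10, 250.0), (2, 20, 99.5);
    INSERT INTO Customers VALUES (10, 'Acme'), (20, 'Globex');
""")

# Once the CustomerID relationship is confirmed row-by-row,
# the lookup collapses into a single join + aggregate.
rows = conn.execute("""
    SELECT c.Name, SUM(o.Amount)
    FROM Orders o
    JOIN Customers c ON c.CustomerID = o.CustomerID
    GROUP BY c.Name
    ORDER BY c.Name
""").fetchall()
print(rows)
```

The server does the matching and aggregation in one pass, which is the performance point made earlier about SQL versus pulling rows locally.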
Most professional enterprise systems offer a schema that's very valuable for constructing SQL statements. For any given enterprise system there's often a community of users who will share SQL, e.g. MS Great Plains was a great source (but I haven't used it in years).
Hope this long reply has value - keep up the good work.