In the 19th session of the Chandoo.org podcast, let's talk about modeling best practices.

What is in this session?
I am very happy to interview my good friend, blogger, author, Excel trainer & businesswoman – Danielle Stein Fairhurst for this session. I first met Danielle when I went to Sydney, Australia in April 2012. Our friendship & collaboration have grown a lot over the last 2.5 years. She is a great speaker & trainer. This episode is loaded with her trademark style of commentary, explanation & tips for better modeling. I hope you will enjoy it.
In this podcast, you will learn:
- Introduction to Danielle & her work
- 6 Tips for Best Practice Modeling
  - Write consistent formulas
  - Avoid hard-coding
  - Smart referencing
  - Ditch the bad habits
  - Document assumptions
  - Format & label things
- Resources for learning more
Go ahead and listen to the show
Links & Resources mentioned in this session:
Download Example Workbook
Please download the example workbook Danielle created to understand these tips.
About Danielle
- Visit Plum Solutions: her website for more tips, tutorials & articles on modeling, data analysis & Excel
- Get Danielle’s book: it's great for both aspiring & working analysts (read my review)
- Join Danielle’s LinkedIn group to network with fellow financial modelers & professionals
Learn how to create financial models in Excel
On Modeling Best Practices
- 5 tips on modeling best practices
- BASE rule for keeping your models simple
- 12 rules for making better Excel models
- 10 tips for better workbooks
- Introduction to Spreadsheet Risk Management
Other topics relevant to the podcast:
Transcript of this session:
Download this podcast transcript [PDF].
What keeps your models sane & sexy?
I use all the tips recommended by Danielle. Apart from these, I also use ideas like named ranges, structured references, and separation of inputs & outputs to keep my models user friendly.
What about you? Do you apply the tips suggested by Danielle? What else do you use to make your models awesome? Please share your tips & ideas using comments.

One Response to “SQL vs. Power Query – The Ultimate Comparison”
Enjoyed your SQL / Power Query podcast (A LOT). I've used SQL a little longer than Chandoo. Power Query not so much.
Today I still use SQL & VBA for my "go to" applications. While I don't pull billions of rows, I do pull millions. I agree with Chandoo about Power Query's (PQ) lack of performance. I've tried to benchmark PQ against SQL and I find that a well-written SQL query will run much faster. As mentioned in the podcast, my similar conclusion is that SQL does the filtering on the server while PQ pulls the data onto the local computer and then filters it. I've heard about PQ query folding, but I still prefer SQL.
My typical Excel application will use SQL to pull data from an enterprise DB. I load the data into structured tables and/or Excel Power Pivot (especially if there's a lot of data).
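For illustration, here is a minimal VBA/ADO sketch of that pattern (and of the "filter on the server" point above). The connection string, table, columns and sheet name are placeholder assumptions, not the commenter's actual setup.

```vba
' Minimal sketch (late-bound ADO): pull a filtered result set into a staging sheet.
' The connection string, table, columns and sheet name are placeholder assumptions.
Sub PullTransactionData()
    Dim cn As Object, rs As Object, sql As String

    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=SQLOLEDB;Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;"

    ' Filter on the server so only the rows you need come across the wire.
    sql = "SELECT TranID, TranDate, Amount FROM dbo.Transactions WHERE TranDate >= '2024-01-01'"

    Set rs = CreateObject("ADODB.Recordset")
    rs.Open sql, cn

    ' Dump the recordset under the headers on a staging sheet (e.g. feeding a structured table).
    ThisWorkbook.Worksheets("Data").Range("A2").CopyFromRecordset rs

    rs.Close
    cn.Close
End Sub
```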
I like to have a Control worksheet to enter parameters, display error messages and hold user buttons that execute VBA. I use VBA to build/edit the parameters used in the SQL. Sometimes I use parameter-based SQL. Sometimes I create a custom SQL string in a hidden worksheet that I then pull into VBA code (these may build a string of comma-separated values that's used with a SQL IN clause). Another SQL trick I like is tagging my data with a YY-MM, YY-QTR, or YY-Week field constructed from a Transaction Date.
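A hedged VBA sketch of the comma-separated-values trick and the YY-MM tag; the "CustomerIDs" named range, table/column names and SQL Server FORMAT syntax are assumptions for illustration only.

```vba
' Sketch: build a SQL string from parameters on a Control sheet.
' The "CustomerIDs" named range, table and column names are illustrative assumptions.
Function BuildTransactionSql() As String
    Dim cell As Range, idList As String

    ' Collect the chosen IDs into a comma-separated list for a SQL IN (...) clause.
    For Each cell In ThisWorkbook.Worksheets("Control").Range("CustomerIDs").Cells
        If Len(cell.Value) > 0 Then idList = idList & "'" & cell.Value & "',"
    Next cell
    If Len(idList) > 0 Then idList = Left$(idList, Len(idList) - 1)   ' drop the trailing comma

    ' Tag each row with a YY-MM period derived from the transaction date (SQL Server FORMAT syntax).
    BuildTransactionSql = _
        "SELECT CustomerID, TranDate, Amount, " & _
        "FORMAT(TranDate, 'yy-MM') AS PeriodYM " & _
        "FROM dbo.Transactions " & _
        "WHERE CustomerID IN (" & idList & ")"
End Function
```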
In an application, I like to create dashboard(s) that may contain hyperlinks allowing the end user to drill into the data. Sometimes the hyperlink points to a worksheet and sometimes to a supporting workbook. In some cases, I use a double-click VBA macro that pulls additional data and directs the user to a supplemental worksheet or pivot table.
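A minimal sketch of the double-click drill-down idea, assuming a summary range on the dashboard sheet and a hypothetical helper that runs the detail query.

```vba
' Sketch: sheet-level event for drill-down. The range, sheet names and the
' RefreshDetailFor helper are hypothetical.
Private Sub Worksheet_BeforeDoubleClick(ByVal Target As Range, Cancel As Boolean)
    ' Only react to double-clicks inside the summary area of the dashboard.
    If Intersect(Target, Me.Range("B2:B50")) Is Nothing Then Exit Sub
    Cancel = True                    ' suppress the default in-cell edit

    RefreshDetailFor Target.Value    ' hypothetical helper: pulls the supporting data
    ThisWorkbook.Worksheets("Detail").Activate
End Sub
```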
In recent years I have come to like dynamic array formulas & LAMBDA functions. I find these preferable to pivot tables and slicers. I like to use a LAMBDA in conjunction with a cube formula to pull data from a Power Pivot data model, e.g. a LAMBDA using a cube formula to aggregate accounting data by general ledger account and financial period. Rather than present the info in a Power Pivot table, you can use this combination to easily build financial reports in a format that's familiar to accounting professionals.
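One way to set that up, sketched here with heavy assumptions: register the LAMBDA as a defined name from VBA and let it wrap CUBEVALUE. The measure, dimension and field names in the cube strings are assumed and must match the actual Power Pivot model; this is an illustration, not the commenter's recipe.

```vba
' Sketch: define a named LAMBDA that wraps CUBEVALUE against the workbook data model.
' "[Measures].[Total Amount]", "[GL].[Account]" and "[Calendar].[Period]" are assumed names.
Sub DefineGLAmountLambda()
    Dim f As String
    f = "=LAMBDA(acct, per, CUBEVALUE(""ThisWorkbookDataModel""," & _
        " ""[Measures].[Total Amount]""," & _
        " ""[GL].[Account].&["" & acct & ""]""," & _
        " ""[Calendar].[Period].&["" & per & ""]""))"
    ThisWorkbook.Names.Add Name:="GLAmount", RefersTo:=f
    ' A report cell can then use, for example:  =GLAmount("4100", "2024-03")
End Sub
```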
One thing that PQ does very well is consolidating data from separate files. In the old days this was always a pain.
I've found that using SQL can be very trying (even for someone with experience). It's largely an iterative process. Start simple, then use XLOOKUP (in the old days, MATCH/INDEX) to check how the tables relate. Once you get the relationships correct, you can use SQL joins to construct a well-behaved SQL statement.
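For illustration, the end state of that iterative process might look like the join below, once the worksheet lookups have confirmed how the tables relate; the table and column names are assumptions.

```vba
' Sketch: the relationship, verified first with XLOOKUP on the worksheet,
' expressed as a SQL join. Table and column names are illustrative.
Function JoinedOrdersSql() As String
    JoinedOrdersSql = _
        "SELECT c.CustomerName, o.OrderDate, o.Amount " & _
        "FROM dbo.Orders AS o " & _
        "INNER JOIN dbo.Customers AS c ON c.CustomerID = o.CustomerID " & _
        "WHERE o.OrderDate >= '2024-01-01'"
End Function
```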
Most professional enterprise systems offer a schema that's very valuable for constructing SQL statements. For any given enterprise system there's often a community of users who will share SQL, e.g. MS Great Plains was a great source (but I haven't used it in years).
Hope this long reply has value - keep up the good work.