Let’s say you made a chart to show actual and forecast values. By default, both sets of values appear in the same color. But we would like to set the forecast values apart by showing them in another color.
If you are a seasoned Excel user, you may be thinking, “Oh, that’s easy. I will just create 2 sets of data (one for actual and one for forecast), make a chart from them and apply separate colors.”
But here is a really simple way to get the same effect.
Use a semi-transparent box to mask the forecast values. The end result is shown below.

Here is how the trick works (a short VBA sketch to automate the steps follows the list):
- Create the chart from all the values.
- Draw a rectangle (box) shape on your spreadsheet.
- Fill it with white and remove the outline (set the outline color to "No line").
- Select the box, go to Fill > More Colors and set the transparency to 50%.
- Place the box on top of the chart and adjust its size and position so it overlaps the forecast data.
- Your forecast now appears in a different color!
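If you use this trick often, you can automate the box with a macro. Below is a minimal VBA sketch, assuming the chart is the first chart object on the active sheet and the forecast occupies roughly the right third of it; adjust the size and position for your own data.

Sub AddForecastMask()
    Dim ws As Worksheet
    Dim cht As ChartObject
    Dim box As Shape

    Set ws = ActiveSheet
    Set cht = ws.ChartObjects(1)    ' assumes the first (or only) chart on the sheet

    ' Draw a rectangle over the right third of the chart area
    Set box = ws.Shapes.AddShape(msoShapeRectangle, _
        cht.Left + cht.Width * 2 / 3, cht.Top, cht.Width / 3, cht.Height)

    With box
        .Fill.ForeColor.RGB = RGB(255, 255, 255)    ' white fill
        .Fill.Transparency = 0.5                    ' 50% transparent
        .Line.Visible = msoFalse                    ' no outline
    End With
End Sub

Run it once, then nudge the box so it lines up exactly with the first forecast point.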
See the demo below to understand the process:

Learn more about forecasting
If your work involves trend analysis & forecasting, check out the resources below:
- Introduction to trend analysis in Excel – podcast
- Doing trend analysis & forecasting in Excel – 3 part series
- How to highlight best months & weeks in charts
How do you highlight your forecasts?
My personal favorite is to use dotted lines to separate forecasts. This involves either using Excel’s chart trendline option or adding a dummy series through formulas to show the forecast line (a one-formula sketch is shown below). When I am in a hurry, I usually add a semi-transparent mask to set aside the forecast values.
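For example, with a hypothetical layout of dates in column A, values in column B and the first forecast date in a named cell ForecastStart, the dummy series can be as simple as:

=IF(A2>=ForecastStart, B2, NA())

Plot this helper column as a second series and format it with a dashed line; the NA() keeps it from drawing over the actual periods.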
What about you? How do you highlight forecast values in your charts? Please post your technique in the comments area.

One Response to “SQL vs. Power Query – The Ultimate Comparison”
Enjoyed your SQL / Power Query podcast (A LOT). I've used SQL a little longer than Chandoo. Power Query not so much.
Today I still use SQL & VBA for my "go to" applications. While I don't pull billions of rows, I do pull millions. I agree with Chandoo about Power Query's (PQ) lack of performance. I've tried to benchmark PQ against SQL and I find that a well-written SQL query will run much faster. As mentioned in the podcast, my conclusion is similar: SQL does the filtering on the server, while PQ pulls the data into the local computer and then filters it. I've heard about PQ query folding, but I still prefer SQL.
My typical Excel application will use SQL to pull data from an enterprise DB. I load the data into structured tables and/or Excel Power Pivot (especially if there's a lot of data).
I like to have a Control worksheet to enter parameters, display error messages and hold user buttons that execute VBA. I use VBA to build/edit the parameters used in the SQL. Sometimes I use parameter-based SQL. Sometimes I create a custom SQL string in a hidden worksheet that I then pull into VBA code (these may build a string of comma-separated values that's used with a SQL include). Another SQL trick I like is to tag my data with a YY-MM, YY-QTR or YY-Week field constructed from the Transaction Date.
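As a rough sketch of that pattern (every name here is hypothetical: server, database, table, fields and the parameter range; the "SQL include" is read as an IN clause, and the FORMAT call is SQL Server syntax):

Sub PullTaggedSales()
    Dim conn As Object, rs As Object
    Dim c As Range
    Dim idList As String, sql As String

    ' Build a comma-separated list of IDs from a parameter range on the Control sheet
    For Each c In Worksheets("Control").Range("CustomerIDs")
        If Len(c.Value) > 0 Then idList = idList & "'" & c.Value & "',"
    Next c
    If Len(idList) = 0 Then Exit Sub            ' nothing selected
    idList = Left(idList, Len(idList) - 1)      ' drop the trailing comma

    ' Tag each row with a YY-MM period built from the transaction date
    sql = "SELECT CustomerID, Amount, " & _
          "FORMAT(TransactionDate, 'yy-MM') AS PeriodYM " & _
          "FROM dbo.SalesTransactions " & _
          "WHERE CustomerID IN (" & idList & ")"

    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
              "Initial Catalog=MyDB;Integrated Security=SSPI;"
    Set rs = conn.Execute(sql)

    ' Drop the results onto a staging sheet (add headers as needed)
    Worksheets("Data").Range("A2").CopyFromRecordset rs

    rs.Close
    conn.Close
End Sub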
In an application, I like to create dashboards that may contain hyperlinks that allow the end user to drill into the data. Sometimes the hyperlink will point to a worksheet and sometimes to a supporting workbook. In some cases, I use a double-click VBA macro that will pull additional data and direct the user to a supplemental worksheet or pivot table.
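A stripped-down version of that double-click drill-through (the Summary range, the Detail sheet and the PullDetailFor helper are all made up) goes in the dashboard sheet's code module:

Private Sub Worksheet_BeforeDoubleClick(ByVal Target As Range, Cancel As Boolean)
    ' Only react to double-clicks inside the summary area of the dashboard
    If Not Intersect(Target, Me.Range("Summary")) Is Nothing Then
        Cancel = True                                       ' stop Excel from entering edit mode
        PullDetailFor Target.EntireRow.Cells(1, 1).Value    ' hypothetical macro that pulls the extra data
        Worksheets("Detail").Activate                       ' send the user to the supplemental sheet
    End If
End Sub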
In recent years I like dynamic array formulas & LAMBDA functions. I find these preferable to pivot tables and slicers. I like to use a LAMBDA in conjunction with a cube formula to pull data from a Power Pivot data model, e.g. a LAMBDA using a cube formula to aggregate accounting data by general ledger account and financial period. Rather than present the info in a pivot table, you can use this combination to easily build financial reports in a format that's familiar to accounting professionals.
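As an illustration (the measure, table and column names below are invented; your data model will differ), such a LAMBDA can be defined in Name Manager, say as GLAmount:

=LAMBDA(acct, period,
    CUBEVALUE("ThisWorkbookDataModel",
        "[Measures].[Total Amount]",
        "[GL].[Account].&[" & acct & "]",
        "[GL].[Period].&[" & period & "]"))

and then used in a report cell like =GLAmount("4000", "24-01") to return one figure per account and period.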
One thing that PQ does very well is consolidating data from separate files. In the old days this was always a pain.
I've found that using SQL can be very trying (even for someone with experience). It's largely an iterative process. Start simple, then use XLOOKUP (in the old days, MATCH/INDEX). Once you get the relationships correct, you can then use SQL joins to construct a well-behaved SQL statement.
Most professional enterprise systems offer a schema that's very valuable for constructing SQL statements. For any given enterprise system there's often a community of users that will share SQL, e.g. MS Great Plains was a great source (but I haven't used it in years).
Hope this long reply has value - keep up the good work.