Lenient lookup [Advanced Formula Trick]

We all know VLOOKUP (or INDEX+MATCH) as an indispensable tool in our Excel toolbox. But what if you want your lookups to be a little gentler, nicer and more relaxed?

Let’s say you want to look up the amount $330.50 against a list of payments. There is no exact match, but if we look 50 cents in either direction, we can find one. Here is a demo of what I mean.

[Demo: lenient (flexible) lookup]

Unfortunately, you can’t convince VLOOKUP to act nice.

Hey VLOOKUP, I know you are awesome and all, but can you cut me some slack here? 

VLOOKUP is tough, reliable and has a cold heart. Or is it?

In this post, let’s learn how to do lenient lookups.

Data for the problem

Let’s say you have a simple two-column table like this. Our table is uninspiringly named data.

[Screenshot: data for lenient lookup]

Lenient lookup – setting up the formula

Our input amount is in cell C3.

Let’s say that when looking up the amount, we want to follow this logic:

  1. If an exact match is found, return that
  2. Else, see if we can find anything within 50 cents on either side (you can change 50 to whatever you want)
  3. If nothing can be found, we want to return “Not found” or a similar message

Formulas to use:

For step 1, we can use good old INDEX+MATCH.

For step 2, we can use array-based INDEX+MATCH.

For step 3, we can use IFERROR.

Let’s put everything together.

Our lenient lookup formula (array):

=IFERROR(INDEX(data[Client], IFERROR(MATCH($C$3,data[Amount],0), MATCH(1,(data[Amount]>($C$3-0.5))*(data[Amount]<($C$3+0.5)),0))), "Not found")

How does it work?

Let’s work from the inside out.

MATCH($C$3,data[Amount],0): this part simply looks for C3 in the data[Amount] column and returns its position.

MATCH(1, (data[Amount]>($C$3-0.5))*(data[Amount]<($C$3+0.5)),0): this array formula looks for 1 (TRUE) in an array that flags the amounts in data[Amount] falling between C3-0.5 and C3+0.5.

The formula multiplies two Boolean arrays, which gives a bunch of 1s & 0s.

MATCH then picks up the first such amount.

Inner IFERROR(MATCH(…), MATCH(…)): This acts like a fail-safe switch. If there is no exact match (first one), then lenient match (second one) will be used.

Outer IFERROR(): If no match is found (exact or lenient), “Not found” is returned.

As this is an array formula, you need to press CTRL+Shift+Enter to get the result.
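
To make this concrete, here is a quick walk-through with made-up values (assume data[Amount] holds 120.00, 330.75 and 452.10, data[Client] holds Ann, Sam and Cathy, and C3 is 330.50; these numbers are just for illustration, not from the example workbook):

  • MATCH(330.50, {120.00;330.75;452.10}, 0) finds no exact match and returns #N/A
  • (data[Amount]>330.00)*(data[Amount]<331.00) evaluates to {0;1;0}, because only 330.75 falls inside the 50-cent window (the comparisons use > and <, so an amount exactly 50 cents away would not count)
  • MATCH(1, {0;1;0}, 0) returns 2, so the inner IFERROR hands 2 to INDEX
  • INDEX(data[Client], 2) returns Sam
  • If C3 were, say, 200.00, both MATCHes would fail and the outer IFERROR would give “Not found”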

Other lenient / almost lookup problems

There are a few more variations of this technique. Let’s review them.

Note: all of these are array formulas, so press CTRL+Shift+Enter.

Ignore decimal portion

We look up just the whole-number portion of the value to find a match.

Formula: =INDEX(data[Client], MATCH(G7, INT(data[Amount]),0))

Notes on how it works:

  • INT() turns the data[Amount] column into whole numbers.
  • We then look up the amount (G7) and return the matching client (see the worked example below).
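
With the same made-up amounts as before (120.00, 330.75 and 452.10) and G7 set to 330, INT(data[Amount]) evaluates to {120;330;452}, MATCH(330, {120;330;452}, 0) returns 2, and INDEX(data[Client], 2) returns the second client (Sam in our made-up list). Note that G7 is assumed to already be a whole number; if it could contain decimals too, wrap it as INT(G7) inside the MATCH.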

Amount is at least something, client name begins with S

Formula: =INDEX(data[Client], MATCH(1, (data[Amount]>=G8)*(LEFT(data[Client],1)="S"),0))

  • We use a different Boolean structure, with >= and LEFT(). The output is again a bunch of 1s & 0s.
  • MATCH finds the first row where the amount is at least G8 and the client name begins with S; INDEX returns that client (see the worked example below).
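
Continuing with the made-up data (amounts 120.00, 330.75 and 452.10, clients Ann, Sam and Cathy) and G8 set to 300: (data[Amount]>=300) evaluates to {FALSE;TRUE;TRUE}, (LEFT(data[Client],1)="S") evaluates to {FALSE;TRUE;FALSE}, and multiplying them gives {0;1;0}. MATCH(1, {0;1;0}, 0) returns 2 and INDEX(data[Client], 2) returns Sam. Keep in mind that the = comparison on text is case-insensitive in Excel, so a client called "sam" would also qualify.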

Closest Amount to input

This is interesting. We use MIN & ABS to find the amount closest to the input value (G10) and return that client’s name.

Formula: =INDEX(data[Client], MATCH(MIN(ABS(data[Amount]-G10)), ABS(data[Amount]-G10),0))

  • ABS(data[Amount]-G10) gives a bunch of absolute (positive) differences. The smallest of these belongs to the amount closest to G10.
  • MIN() finds that smallest difference
  • MATCH looks up the minimum value in ABS(data[Amount]-G10)
  • INDEX gives the corresponding client’s name (see the worked example below)
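
Once more with the made-up amounts (120.00, 330.75 and 452.10) and G10 set to 330.50: ABS(data[Amount]-G10) evaluates to {210.50;0.25;121.60}, MIN() of that array is 0.25, MATCH(0.25, {210.50;0.25;121.60}, 0) returns 2, and INDEX(data[Client], 2) returns Sam. If two amounts are tied for closest, MATCH simply picks the first one in the list.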

Download lenient lookup example workbook

Click here to download the example workbook. The file contains sample data, several examples of these techniques and additional learning resources. Give it a go.

More ways to lookup

Lookups are an essential part of any data analysis work you do in Excel. Pick up some nifty tricks from these links:

  • Looking up when data won't co-operate - Matrix lookup with a twist

Got a lookup tip to share?

Have some lookup stories to tell? I am listening. Please post them in the comments.
