Professional Resume or Data Visualization Fail? [poll]

Michael Anderson, a web designer, has posted this delicious-looking visual resume [full image].

While the resume looks stunning at a glance, a closer inspection reveals that you can't really draw any valuable conclusions about Michael's past experience and qualifications. Of course, if the purpose of this resume is to show that he is a fabulous designer, then it definitely achieves that. It has way better presentation than lots of professional resumes out there.

It uses some of the more flamboyant and often avoided chart types, like the area chart, 3D area chart and 3D donut area chart (oh dear God!).

Here are a few things that I think are wrong with this data visualization:

  • Inconsistent color: The colors don't convey any particular message, especially since they are repeated: the same color stands for coffee, layout design and sign-shop work experience. One of the primary rules of data visualization in dashboards is to use color consistently, e.g. one color for each product.
  • Poor choice of charts: While 3D charts look great, they are not the best at conveying real information. Instead of 3D area charts and 3D donut charts, a better choice would have been bar charts: they are simple, elegant and convey rich information very easily. Hey, you can make eye candy using bars too.
  • Irrelevant data: If I were someone planning to hire Michael, I would definitely be more interested in what great kickass stuff he has done (and I am sure he has done stuff like that, looking at this) than in how much coffee he drinks each day (and I still can't figure out how many cups that is, thanks to the weird chart selection).
  • Not showing the numbers: Anderson said in his post, "[T]his is just concept art, as there are almost no real metrics represented except for time," but I guess that explanation doesn't quite help. We all know that resumes work well when they talk numbers (made 500 XHTML-compatible pages in 50 hours, 25 magazine cover designs, 500k downloads for my icon library, etc.), and unfortunately Michael missed that completely. One can assume any number of things about his work at "the sign shop" or "Comor inc."

What are your thoughts on this data visualization? Awesome or awful?

Thanks to Manoj for sharing the link via e-mail.


8 Responses to “Pivot Tables from large data-sets – 5 examples”

  1. Ron S says:

    Do you have links to any sites that provide free, large test data sets, both large in diversity and large in total number of rows?

    • Chandoo says:

      Good question, Ron. I suggest checking out kaggle.com or data.world, or creating your own with RANDBETWEEN(). You can also get a complex business data set from the Microsoft Power BI website: it is the Contoso retail data.
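
      If you prefer the Power Query route instead of sheet formulas, here is a rough M sketch for generating a large random table (the row count, value ranges and column names below are just placeholder assumptions):

        let
            RowCount = 100000,
            // random whole numbers 1-500 for an "Amount" column
            Amounts  = List.Transform(List.Random(RowCount), each Number.RoundDown(_ * 500) + 1),
            // random dates within 2020 for an "Order Date" column
            Dates    = List.Transform(List.Random(RowCount), each Date.AddDays(#date(2020, 1, 1), Number.RoundDown(_ * 365))),
            // sequential IDs plus the two random columns, assembled into a table
            Source   = Table.FromColumns(
                           {List.Numbers(1, RowCount), Amounts, Dates},
                           {"ID", "Amount", "Order Date"})
        in
            Source

      Load that to a sheet or to the data model and you have a data set big enough to practice pivots on.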

  2. Steve J says:

    Hi Chandoo,
    I work with large data sets all the time (80-200 MB files with 100Ks of rows and 20-40 columns) and I've taken a few steps to reduce the size (to 20-60 MB) so they can be shared more easily and work more quickly. These steps include: creating custom calculations in the pivot instead of having additional data columns, deleting the data tab, and saving as an .xlsb. I've even tried INDEX/MATCH instead of VLOOKUP, although I'm not sure that saved much. Are there any other tricks to further reduce the file size? Thanks, Steve

    • Chandoo says:

      Hi Steve,

      Good tips on how to reduce the file size and/or processing time. Another thing I would definitely try is using the Data Model to hold the data rather than keeping it in the file. You would:
      1. connect to the source data file thru Power Query
      2. filter away any columns / rows that are not needed
      3. load the data to the data model
      4. make pivots from it

      This would reduce the file size while providing all the answers you need.

      Give it a try. See this video for some help - https://www.youtube.com/watch?v=5u7bpysO3FQ
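
      If it helps, here is a rough Power Query (M) sketch of steps 1 and 2 above (the file path and column names are placeholders, not your actual data):

        let
            // 1. connect to the source data file (a CSV source is assumed here)
            Source   = Csv.Document(File.Contents("C:\Data\Sales.csv"), [Delimiter = ",", Encoding = 65001]),
            Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
            // 2. keep only the columns you need and filter away unwanted rows
            Kept     = Table.SelectColumns(Promoted, {"Order Date", "Product", "Amount"}),
            Typed    = Table.TransformColumnTypes(Kept, {{"Order Date", type date}, {"Amount", type number}}),
            Filtered = Table.SelectRows(Typed, each [Amount] <> null)
        in
            // 3. + 4. use Close & Load To... > Only Create Connection + Add this data to the Data Model,
            //         then build your pivots from the data model
            Filtered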

  3. John Price says:

    Normally when Excel processes data it utilizes all four cores on a processor. Is it true that Excel reduces to only using two cores when calculating tables? And similarly, if there were only two cores present, would it reduce to one for a table?
    I ask because I have personally noticed that when I use tables, working with the data is much slower than if I had just filtered it. I like tables for obvious reasons when working with data sets. Is this true?

    • Ron MVP says:

      John:
      I don't know if it is true that Excel table processing only uses 2 threads/cores, but it is entirely possible. The program has to be enabled to handle multiple parallel threads. Excel Lists/Tables were added long ago, at a time when 2 processors was a reasonable upper limit. And it could be that there simply is no way to program table processing to use more than 2 threads at a time...

  4. Jen says:

    When I've got a large data set, I will set my Excel priority to High thru Task Manager to allow it to use more of the available processing power. Never use Realtime priority or you're completely locked up until Excel finishes.
