Podcast: On Calculations

Posted in Podcasts | 2 Comments

This week Wilson and I discussed calculations. It’s a big topic that we get asked about constantly by newer users trying to be truly effective with Tableau. Wait a minute, you might say, isn’t Tableau supposed to make things easy? Why do I have to know advanced math to do analysis? I thought this was for business users! Good point, disembodied typist. One reason calcs loom so large in the conversation around Tableau is misconception. Another is that calcs are used more for things like formatting and categorization than for hard math. Or, as my colleagues in London would say, maths. We also discussed a framework for learning calculations that focuses on the key concepts you need to understand when working with calcs in Tableau. I will probably write that up soon. For now, listen and enjoy the episode. -CS

The Subject-Oriented Data Mart

Posted in Uncategorized

In my last post I mentioned the “subject-oriented data mart” as a useful concept to understand when thinking about Tableau Server. The term isn’t new – the idea of a data mart has been around for decades, in the form of small databases, cubes, or even applications built for the exploration or analysis of a particular subset of an enterprise’s data. But I think the term takes on new meaning and importance when applied to Tableau. A subject-oriented data mart helps with the logistical challenges of structuring and exploring heterogeneous data, and it lets businesses put rules in place. That allows everyone to explore and make decisions based on data without the harsh limitations that keep people from real insight.

Common Challenges

You probably already know the challenge – business people turn to tools like Tableau because traditional reporting through BI platforms doesn’t answer their questions. Tableau offers the flexibility they are used to with tools like Excel, but business users often prefer it for its ease of use and its ability to make beautiful graphics. Technology teams like that Tableau ties back to their data warehouse, […]

Optimizing Extract Load Times

Posted in Tableau Tips and Tricks

In our most recent episode, I introduced a challenge that a customer of mine is currently wrestling with – how to optimize data performance in a self-reliant analysis environment. How can an organization empower its employees to answer any data-oriented question that pops into their head, without requiring them to deal with the difficult and often time-consuming processes around staging and loading data? Well, I have a few suggestions:

1. Use Tableau Data Server

Tableau Data Server in and of itself does not do a lot to help performance, but the workflows it enables are crucial to optimizing data load times. You may already know that Tableau Server can schedule data extracts to refresh automatically. Utilizing the Data Server allows those automatically refreshing extracts to feed multiple workbooks, for multiple users: all visualizations on the server that use the same data can point to the same extract. This has a lot of implications, but for the purpose of this conversation it stages the data everyone might want in an optimized format, so people don’t have to go create their own extracts. It also sharply reduces the load on the server […]

Tales From the Field

Posted in Podcasts

Wilson and I spend a lot of time off the air talking about what we’re working on. That was actually the inspiration for this podcast – the long, rambling conversations we often have (usually over drinks) about how things work at Tableau, and how things should be done. Wilson plans to follow up with a post about the work he’s doing tracking his sales pipeline. This is crucial stuff for anyone analyzing Salesforce data. I hope he does it, guys. I really do. Also, Wilson is mega lazy. He might not. I want to write a little more about optimizing data performance in Tableau; that should be dropping shortly after this post. Thanks for listening. -CS

Makeover Monday: Police Violence

Posted in Makeover Monday

In this week’s Makeover Monday challenge, Andy found a data set on mappingpoliceviolence.org. The article and its corresponding visualizations discuss police killings in America’s 60 largest police departments. It is an interesting article, and the charts presented aren’t poor by any means, but I think they can be made better by thinking about the story the article wants to tell. First, the visualizations compare police killing rates to a national average, but you have to search through the chart to find the bar that represents the average. A simple reference line would fix that. Second, the article calls out which police departments disproportionately kill black citizens. To do this, the same data is presented again in crosstab form, with an icon calling out cities where 100% of those killed were black. I decided that consolidating those points onto a single chart would make more sense. Finally, there is a comparison to violent crime rates in each city. This is an interesting point, but it’s muddled by a dual-axis chart with different axis ranges and unit types. A best practice for correlative analysis is a scatter plot. If your question is “how much does number A affect […]
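To make the scatter-plot idea concrete outside of Tableau, here is a minimal sketch in Python with matplotlib: one mark per department, violent crime rate on one axis and police killing rate on the other, so the relationship between the two measures is visible directly instead of being split across two axis ranges. The city names and rates below are made up for illustration – the real figures come from the mappingpoliceviolence.org data set.

```python
# A minimal scatter-plot sketch for correlative analysis.
# NOTE: the cities and rates here are hypothetical placeholder values.
import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

cities = ["City A", "City B", "City C", "City D"]
violent_crime_rate = [550, 1200, 800, 300]   # per 100k residents (made up)
police_killing_rate = [0.3, 0.9, 0.5, 0.2]   # per 100k residents (made up)

fig, ax = plt.subplots()
ax.scatter(violent_crime_rate, police_killing_rate)
# Label each mark so individual departments can be called out
for name, x, y in zip(cities, violent_crime_rate, police_killing_rate):
    ax.annotate(name, (x, y))
ax.set_xlabel("Violent crime rate (per 100k residents)")
ax.set_ylabel("Police killing rate (per 100k residents)")
ax.set_title("One mark per department")
fig.savefig("scatter.png")
```

The same layout takes about thirty seconds in Tableau – put one measure on Columns, the other on Rows, and the department on Detail – but the principle is identical: two quantities you suspect are related belong on two axes of the same chart, not on a shared axis with mismatched scales.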