Leverage Dave's Twitter Analytics


This post provides fast & easy steps for leveraging Dave Andrade's Twitter Analytics Dashboard to make it your own. There's a bit of R code, and a repeatable process to update your view with new data as often as you like.

"Stealing like an Artist" is a widely accepted practice within the Tableau Community. So much so, in fact, that the Tableau Public Blog has a post by Hanne Løvik with instructions for how to reverse engineer a dashboard:

How to Steal from the Best

Having just finished a bit of burglary myself, I figure the best way to repay the community is to publish my steps. Now you can steal a little bit from both of us.

Let's Make This Quick

1. Download Dave's Workbook

Use the fancy new toolbar on the recently re-designed Tableau Public, and pull Dave's dashboard down to your hard drive.

2. Grab Your Data

From Twitter Analytics, you could export all the available data at once. But our goal is to build a repeatable pipeline, so let's begin that process now: download one month at a time, each month to a separate CSV file.
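Each monthly export is a plain CSV with a header row. A quick sanity check in R confirms the `time` column the script below relies on is present. The file contents here are a tiny illustrative stand-in, not a real export, which carries many more columns:

```r
# Write a two-line stand-in for one monthly export, then read it back.
sample_csv <- tempfile(fileext = ".csv")
writeLines(c('"Tweet text","time","impressions"',
             '"hello world","2015-02-28 17:30 +0000","120"'),
           sample_csv)

one_month <- read.table(sample_csv, header = TRUE, sep = ",")
names(one_month)  # "Tweet.text" "time" "impressions"
```

Note that read.table turns the space in "Tweet text" into a dot; the script below reverses that with gsub when it prettifies the names.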

3. Row Bind These Files Together

Here's a screenshot of my directory structure:

You can see that I follow a similar process for incoming data from other sources as well: Google Analytics, SumAll, AddThis, etc.

My choice is to use R for the processing. You can use whatever tool you like. To simply leverage what I've done, set yourself up as follows:

  • ../data files/twitter-analytics
    • the monthly CSV exports
  • ../data files/_tidy
    • tidy output
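If you're starting from scratch, both directories can be created from R. The base path below is a placeholder; substitute wherever you keep your data files:

```r
# Create the raw-export and tidy-output directories used in this post.
# Substitute your own location for base_dir.
base_dir <- file.path(tempdir(), "data files")
dir.create(file.path(base_dir, "twitter-analytics"), recursive = TRUE)
dir.create(file.path(base_dir, "_tidy"))
```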

# Twitter Analytics
# Export one data file per month, using default file name with a datestamp
tidy_twitter <- function() {

# global variables
base_dir <<- "/path/to/your/data files/"
tidy_dir <<- paste(base_dir,"_tidy/", sep="")
## set your local time zone carefully, for use with format(as.POSIXct)
local_time_zone <<- "US/Pacific"

## locations & files
twit_dir <- paste(base_dir,"twitter-analytics", sep="") # raw files directory
twit_files <- list.files(path=twit_dir, all.files = FALSE,full.names=TRUE ) # raw files list

# read data
twit_data <-, lapply(twit_files, read.table, header=TRUE, sep=",")) # rbind raw files

# pretty names
names(twit_data) <- gsub("\\.", " ", names(twit_data)) # replace . with " "

# de-dupe (in case files were downloaded with overlapping dates)
twit_data <- unique(twit_data)

# time zone conversion
twit_data$LocalTime <- as.POSIXct(twit_data$time,tz="GMT")
twit_data$LocalTime <- format(twit_data$LocalTime, tz=local_time_zone,usetz=FALSE)   

## write csv
twit_outfile <- paste(tidy_dir,"twitter-analytics.csv",sep="")
write.csv(twit_data, file=twit_outfile, row.names=FALSE)
}
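The time zone step deserves a closer look: Twitter stamps each row in GMT, and format() re-renders that instant in your local zone. Here's a standalone check, using the same US/Pacific zone as the script:

```r
# A GMT timestamp rendered in US/Pacific (GMT-8 in late February)
gmt_time   <- as.POSIXct("2015-02-28 17:30:00", tz = "GMT")
local_time <- format(gmt_time, format = "%Y-%m-%d %H:%M", tz = "US/Pacific")
local_time  # "2015-02-28 09:30"
```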


4. Replace the Data Source

  1. Open Dave's workbook in Tableau
  2. Create a new data source from your combined CSV data
  3. Copy & Paste the DateTime calculation from Dave's data source into your own
  4. Replace Dave's original data source with your own
  5. Close the original data source
  6. Save the workbook

5. Now, as often as you like:

Download a new CSV, run your R script and analyze your tweets!

My script will row bind together all of the files in your data directory (regardless of their name or how many there are).

And there's a line of code to remove duplicate records, just in case you download multiple files with overlapping date ranges.
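Both behaviors are easy to verify on toy data: rbind() stacks two overlapping "monthly" data frames, and unique() collapses the duplicated row. The frames below are made up for illustration:

```r
# Two fake monthly exports sharing one identical row
feb <- data.frame(time = c("2015-02-27", "2015-02-28"), impressions = c(10, 20))
mar <- data.frame(time = c("2015-02-28", "2015-03-01"), impressions = c(20, 30))

combined <- rbind(feb, mar)   #, lapply(...)) generalizes this to N files
deduped  <- unique(combined)  # drops the duplicated 2015-02-28 row

nrow(combined)  # 4
nrow(deduped)   # 3
```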

Thanks, Dave. The dashboard is excellent!

"It's not where you take things from - it's where you take them to." | Perpetual Evolution



  1. "Twitter Analytics Dashboard", Dave Andrade, Tableau Public, February 10, 2015, !/vizhome/TwitterAnalyticsforDave/TwitterAnalytics
  2. "How to Steal from the Best", Hanne Løvik, Tableau Public Blog, January 21, 2015
  3. "What's New? Pretty Much Everything", Jason Gorfine, Tableau Public Blog, February 9, 2015
  4. "Replacing a Data Source", Tableau Knowledge Base, October 7, 2014
  5. "It's not where you take things from - it's where you take them to", Devita Villanueva, Perpetual Evolution, April 27, 2013