Channel: Online Behavior - Guide

Building Google Analytics Powered Widgets


There is a lot of useful and interesting data held in your Google Analytics account that could be used to drive content on your site and apps. For example, you might want to show your website visitors the most viewed products, the most read articles, or the best performing authors.

In this tutorial I provide a step-by-step guide showing how to create a Top Authors widget using the Google Analytics API, Google App Engine (Python) and Google Tag Manager. You can create a free Google App Engine account that should give you enough quota to build and use your widget. You can see the end result of this tutorial in the "Top Authors" widget on the right-hand side of this site.

There are 2 reasons we are using Google App Engine as a proxy instead of just calling the Google Analytics API directly:

  • Avoid exposing any sensitive information held in Google Analytics. E.g., instead of sharing raw pageviews we will calculate and share a percentage of the maximum pageviews.
  • There is a limit to the number of API calls that can be made, and with this method we only need to call the API once a day because we cache the results. Therefore we don't risk exceeding the API quota; and since the data is cached, the results also return a lot faster.
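The caching approach described in the second point can be sketched in a few lines of Python. This is a generic illustration only, with a plain dict standing in for App Engine's memcache and `fetch_fn` standing in for the actual Google Analytics API call:

```python
import time

# Toy stand-in for a cache service: maps key -> (value, expiry timestamp)
_cache = {}

def get_cached(key, fetch_fn, ttl_seconds=86400):
    """Return a cached value, calling fetch_fn at most once per TTL window."""
    entry = _cache.get(key)
    if entry and entry[1] > time.time():
        return entry[0]  # still fresh: no API call made
    value = fetch_fn()  # cache miss: hit the API once...
    _cache[key] = (value, time.time() + ttl_seconds)  # ...and remember the result
    return value
```

With a 24-hour TTL (86400 seconds), every request after the first is served from the cache, so the underlying API is queried at most once a day.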

The steps below will take you through all the way from creating your app engine project to adding the widget to your site using Google Tag Manager.

  1. Create a New Google Cloud Project
  2. Create Your Google App Engine App
  3. Enable the Google Analytics API
  4. Use Import.io To Scrape Extra Data
  5. Create the Top Authors API
  6. Serve the Widget using Google Tag Manager

1. Create a New Google Cloud Project

If you have not used Google Cloud before, sign up and create a new project at https://console.developers.google.com. For this tutorial you will be using the free version of App Engine, so you do not need to enable billing. Name the project and create a brand-friendly project ID, as this will become your appspot domain, e.g. yourbrandwidgets.appspot.com

Google Cloud Project

2. Create Your Google App Engine App

Download the Google App Engine SDK for Python and create a folder on your computer called yourbrandwidgets.

In the folder create a file called app.yaml and add the code below. This is the configuration file, and it is important that the application name matches the project ID created in the first step.

application: onlinebehaviorwidgets
version: 1
runtime: python27
api_version: 1
threadsafe: yes

handlers:
- url: .*
  script: main.app

libraries:
- name: jinja2
  version: "2.6"
- name: markupsafe
  version: "0.15"

In the folder create a file called main.py and add the following code:

from flask import Flask

app = Flask(__name__)
app.config['DEBUG'] = True

# Note: We don't need to call run() since our application is embedded within the App Engine WSGI application server.

@app.route('/')
def home():
    """Return a friendly HTTP greeting."""
    return 'Online Behavior Widgets'

@app.errorhandler(404)
def page_not_found(e):
    """Return a custom 404 error."""
    return 'Sorry, nothing at this URL.', 404

Create a file called appengine_config.py and add the following code:

"""'appengine_config' gets loaded when starting a new application instance."""
import sys
import os.path

# add 'lib' subdirectory to 'sys.path', so our 'main' module can load third-party libraries.

sys.path.insert(0, os.path.join(os.path.dirname(__file__), 'lib'))

Create a folder called lib in the main folder.

Download the file called google-api-python-client-gae-.zip from this page.

Unzip it and add the 4 folders it contains to the lib folder in your project.

Install the other required libraries for Flask by creating a file called requirements.txt with the following text:

# This requirements file lists all third-party dependencies for this project.
# Run 'pip install -r requirements.txt -t lib/' to install these dependencies in 'lib/' subdirectory.
# Note: The 'lib' directory is added to 'sys.path' by 'appengine_config.py'.
Flask>=0.10

Run pip install -r requirements.txt -t lib/ in the terminal to install these dependencies. You should now be ready to test locally. Using the Google App Engine Launcher, add the application as described in this tutorial.

Then, select the app as shown in the screenshot below and click Run; this will run the app locally and open a new tab in your current browser.

Run Widget Locally

If this works as expected you should be able to visit the site on your localhost at the port you set.

You are now ready to deploy this to the cloud. Click deploy and keep an eye on the logs to check that there are no errors.

If successful, you can test the app at yourbrandwidgets.appspot.com.

3. Enable the Google Analytics API

To use the Google Analytics API you will need to enable it for your project. Go to the API portal in the developer console under APIs & Auth and click on the Analytics API as shown in the screenshot below. Then, click on the Enable API button.

Enable Google Analytics API

Get the App Engine service account email (it will look something like yourbrandwidgets@appspot.gserviceaccount.com) under the Permissions tab, following the steps shown in the screenshot below. Then add that email to your Google Analytics account with Collaborate, Read & Analyze permission (learn more about User Permissions).

Google Analytics Permissions

4. Use Import.io To Scrape Extra Data

One issue we had while creating the widget in the sidebar of this site was that the author images and links are not stored in Google Analytics. We therefore have 2 options to overcome this.

Option 1

If you are using Google Tag Manager, create variables to capture the author image and author URL on each pageview and send them as custom dimensions.

Option 2 (the option we will use in this tutorial)

We used import.io to scrape the authors page and turn it into an API that we can use in app engine.

In order to see how this works, go to https://import.io, copy and paste this URL into the box and press "try it out". You should see the page scraped into a structured format that you can use by clicking on the Get API button, as shown below.

import.io API

As you can see, the API has a record for each author in a neat JSON format including the 3 pieces of data we needed. The author’s name is under "value", the author’s page link is under "picture_link" and the author’s image is under "picture_image". That really is magic.
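The reshaping we need from this JSON is small: turn the list of records into a dictionary keyed by author name, so lookups are easy later. A self-contained sketch, with made-up sample records in the field layout described above:

```python
import json

# Sample records in the shape import.io returns (illustrative values only)
sample_results = [
    {"value": "Jane Doe", "picture_link": "/authors/jane", "picture_image": "/img/jane.jpg"},
    {"value": "John Smith", "picture_link": "/authors/john", "picture_image": "/img/john.jpg"},
]

def index_authors(results):
    """Key each author record by name so lookups during the GA loop are O(1)."""
    return {
        row["value"]: {
            "picture_image": row["picture_image"],
            "picture_link": row["picture_link"],
        }
        for row in results
    }

authors = index_authors(sample_results)
print(json.dumps(authors["Jane Doe"]))
```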

We can now create a function in our code that calls the import.io API, extracts the 3 data points we need, caches them for 24 hours, and then returns the result. We can test this by exposing it at a URL. Update the main.py file with the code below; you will notice we have now included some new modules at the top.

import json
import pickle
import httplib2

from google.appengine.api import memcache
from google.appengine.api import urlfetch
from apiclient.discovery import build
from oauth2client.appengine import OAuth2Decorator
from google.appengine.ext import webapp
from google.appengine.ext.webapp.util import run_wsgi_app
from oauth2client.appengine import AppAssertionCredentials
from flask import Flask
from flask import request
from flask import Response

app = Flask(__name__)
app.config['DEBUG'] = True

# Note: We don't need to call run() since our application is embedded within the App Engine WSGI application server.

@app.route('/')
def hello():
    """Return a friendly HTTP greeting."""
    return 'Hello World!'

@app.route('/importioauthors.json')
def importio():
    authors = importioOnlineBehaviorAuthors()

    return json.dumps(authors)

@app.errorhandler(404)
def page_not_found(e):
    """Return a custom 404 error."""
    return 'Sorry, nothing at this URL.', 404

def importioOnlineBehaviorAuthors():

    ob_authors_check = memcache.get('importioOnlineBehaviorsAuthors')
    if ob_authors_check:
        # Reuse the cached result instead of fetching it again
        ob_authors_output = pickle.loads(ob_authors_check)
    else:
        importio_url = "https://api.import.io/store/data/6f4772f4-67ce-4f78-83f3-fa382e87c658/_query?input/webpage/url=http%3A%2F%2Fonline-behavior.com%2Fabout%2Fauthors&_user=ENTER-YOUR-USERID-HERE&_apikey=ENTER-YOUR-API-KEY-HERE"
        importio_url_result = urlfetch.fetch(importio_url)
        importio_result = json.loads(importio_url_result.content)
        importio_author_images = {}

        for row in importio_result['results']:
            name = row['value']
            importio_author_images[name] = {
                    'picture_image': row['picture_image'],
                    'picture_link': row['picture_link']
                    }

        ob_authors_output = importio_author_images

        memcache.set('importioOnlineBehaviorsAuthors', pickle.dumps(ob_authors_output), 86400)


    return ob_authors_output

You can run this locally or deploy to live, and then go to yourbrandwidgets.appspot.com/importioauthors.json to test that it is working.

5. Create the Top Authors API

The code shown below will authenticate and call the Google Analytics API using the App Engine service account email we added earlier. As you will see in the API request below, we are getting Unique Pageviews for the top 20 authors from the past 30 days. The code then stitches the import.io data to the Google Analytics data so that we have the author images and links ready to be used.

The results are cached for 24 hours so that the API is only called once a day for all users and returns the data in the callback function name we define when calling the URL.
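The JSONP mechanics are simple: the JSON payload is wrapped in a call to whatever function name the client passed in the callback parameter. A minimal sketch of that wrapping (names are illustrative):

```python
import json

def jsonp_response(callback, payload):
    """Wrap a JSON-serializable payload in the caller's callback function name."""
    return callback + '(' + json.dumps(payload) + ')'

# e.g. requesting ?callback=anyFunctionName wraps the data in anyFunctionName(...)
body = jsonp_response('anyFunctionName', [{'author': 'Jane'}])
```

The browser then executes the response as a script, which calls the named function with the data as its argument.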
Add the following code to your main.py file, above the @app.errorhandler(404) line.

@app.route('/topauthors.jsonp')
def topauthors():
    # Get the callback function name from the URL
    callback = request.args.get("callback")

    # Check if the data is stored in the cache (it resets after 24 hours)
    output_check = memcache.get('gaApiTopAuthors')

    # If yes, use the cached data in the response
    if output_check:
      output = pickle.loads(output_check)

    # If not, request the Google Analytics API
    else:

      # Authenticate and connect to the Google Analytics service
      credentials = AppAssertionCredentials(
      scope='https://www.googleapis.com/auth/analytics.readonly')
      http = credentials.authorize(httplib2.Http(memcache))
      analytics = build("analytics", "v3", http=http)

      # Set the Google Analytics View ID
      view_id = '32509579'

      # Set the report options
      result = analytics.data().ga().get(
        ids='ga:' + view_id,
        start_date='30daysAgo',
        end_date='yesterday',
        dimensions='ga:contentGroup2',
        metrics='ga:uniquePageviews',
        sort='-ga:uniquePageviews',
        filters='ga:contentGroup2!~Online Behavior|admin|(not set)|Miklos Matyas',
        max_results='20'
        ).execute()

      # Get the authors extra data
      authors = importioOnlineBehaviorAuthors()

      # Loop through the results from Google Analytics API and push into output only the data we want to share publicly
      output = []
      max_unique_pageviews = float(result['rows'][0][1])

      for row in result['rows']:
        author = row[0]
        unique_pageviews = float(row[1])
        perc_of_max = str(int(100*(unique_pageviews/max_unique_pageviews)))

        # Only push the author if their image and link exist in the import.io API
        if (author in authors):
            output.append({
              "author":author,
              "perc":perc_of_max,
              "image":authors[author]['picture_image'],
              "link":authors[author]['picture_link']
              })

      # Save the output in cache for 24 hours (60 seconds * 60 minutes * 24 hours)
      memcache.set('gaApiTopAuthors', pickle.dumps(output), 86400)

    # Create the response in the JSONP format
    jsonp_callback = callback+'('+json.dumps(output)+')'

    resp = Response(jsonp_callback, status=200, mimetype='application/json')
    resp.headers['Access-Control-Allow-Origin'] = '*'

    # Return the response
    return resp

You will not be able to test this locally as it accesses the Google Analytics API so you will have to deploy to App Engine to see the output.

If it is all working as expected, you should see the result by accessing the URL directly in the browser, e.g. http://yourbrandwidgets.appspot.com/topauthors.jsonp?callback=anyFunctionName

You can check for any errors in the developer console under Monitoring > Logs. Select App Engine and click the refresh icon on the right to see the latest logs for every time a URL is requested.

Developer Console Logs

6. Serve the Widget using Google Tag Manager

Using the API we just created, which returns the top authors data, we can add a Custom HTML tag to Google Tag Manager that loops through the results and (using a bit of HTML and CSS) outputs a nice-looking widget, complete with bar charts based on the percentage of the maximum Pageviews we calculated server-side.

You will want to design the widget first; a tip is to reuse as much of the website's current CSS styling as possible.

a) Add the widget code as a tag

Add the following code to Google Tag Manager as a Custom HTML tag.

<script>
// create the function that will be called with the API response
function topAuthorsCallback(data){
    // do something with the data that is returned

    // append any new CSS styling to the head tag
    $('head').append(
    '<style>' +
    '.gawidget-author { float: left; width: 100%; }' +
    '.gawidget-author-img { width: 40px; float: left; }' +
    '.gawidget-author-chart { display: inline-block; vertical-align: top; width: 85%; height: 40px; margin-bottom: 5px; }' +
    '.gawidget-author-bar { height: 60%; background: #62B6BA; }' +
    '.gawidget-author-name { height: 40%; padding: 2px 5px; color: #666;}' +
    '</style>' )

    // Create a new div for where the widget will be inserted
    $( '#block-block-18' ).before(
     '<div id="block-top-authors-0" class="clear-block block"><h2>Top Authors</h2></div>' );

    // Create a header for Social links for consistency
    $( '#block-top-authors-0' ).after(
     '<div id="block-social-0" class="clear-block block"><h2>Social Links</h2></div>' );

    // loop through the first 5 results to create the widget
    for (var i = 0; i < 5; i++){

        var authorName = data[i]['author'];
        var authorUrl = data[i]['link'];
        var authorPerc = data[i]['perc'];
        var authorImg = data[i]['image'];
        var authorPosition = i + 1;

        var html_output = '<div class="gawidget-author">' +
        '<a href="' + authorUrl + '">' +
        '<div class="gawidget-author-img">' +
        '<img src="' + authorImg + '" style="width: 100%;">' +
        '</div>' +
        '<div class="gawidget-author-chart"><div class="gawidget-author-bar" style="width: '+ authorPerc +'%;"></div>' +
        '<div class="gawidget-author-name">' + authorName + '</div>' +
        '</div></a></div>'

        $(html_output).hide().appendTo('#block-top-authors-0').fadeIn(2000)

    }

}

// The URL for the API on App Engine
var api_url = 'http://onlinebehaviorwidgets.appspot.com/topauthors.jsonp'
// The function created that will add the widget content to the site
var callback_function = 'topAuthorsCallback'
// Join the above to create the final URL
var url = api_url + '?callback=' + callback_function

// Call the jsonp API
$.ajax({
    "url": url,
    "crossDomain":true,
    "dataType": "jsonp"
});
</script>

b) Create a variable and trigger

In this example we will add the new widget right above the Google+ widget, so first we create a Custom JavaScript variable that returns true if the div element holding the Google+ widget exists and false if it does not, as shown below.

GTM Variable Trigger

c) Custom JS Variable - Google Plus Widget Exists

function(){
  if ($( '#block-block-18' ).length > 0){
    return true
  } else {
    return false
  }
}

d) Preview and Publish the widget

Set the tag to trigger on all pages where the div we are appending the widget to exists, as shown below.

Publish Widget

Save the tag and, before you publish, go into preview mode to test that the tag triggers as expected and the widget appears as you designed it. If you are happy it is all working, you can publish the tag and launch the widget to all your users.

Your Turn To Create A Widget

This is just one simple example of what is possible, and we would love to see what you create. How about sharing your top 10 products based on sales, or your top performing brands or categories? The possibilities are endless!


Measuring The Full Customer Journey


The idea of measuring the full customer journey has been around for quite a while, and we have seen solutions that partially solved this challenge, but within the boundaries of click-only traffic or paid-only channels.

Historically it has been difficult to integrate ad views without clicks, generic traffic channels (direct, organic, referral) and cross-device journeys within one holistic view. With the integration of Google Analytics Premium and the DoubleClick Campaign Manager (DCM) all of this is available NOW.

All touchpoints included in the channel path

Once you integrate Google Analytics Premium and DoubleClick Campaign Manager, take a closer look at the Google Analytics Multi-Channel Funnels (MCF) path analysis report and you will notice some special features, as seen in the screenshot below.

Multi-Channel-Funnels path analysis report

All touchpoints where users have viewed a display ad on the journey to conversion (even without clicking on it) are marked with the eye icon. Bear in mind that in this case we do not only measure classic display ad views, but also email newsletters that have been opened but not clicked. This leads to insights such as the one illustrated in the following path.

Measuring Ad Views

In this scenario the newsletter was opened, but obviously didn't attract enough attention to yield a click. Later on, a display ad supported the customer journey, which led to a reopening of the email, a click within the email and lastly a conversion.

Look around for articles on how to gain insights from attribution modeling leading to campaign and media optimization. Typical questions are:

  • How do my e-mail campaigns support my revenue?
  • Which other channels get support from email?
  • What other channels are necessary for email to perform well?
  • How do display views influence the path length?

Path length and time with and without view attribution

Having only partial insights (and data) from the customer journey leads to wrong assumptions and decisions. Here is a comparison of the path length for a specific goal, with and without taking view contacts into account.

Cumulated Conversions

Without taking the views into account, we would think that we need fewer interactions than we actually do. This assumption is made even worse when we look into the assists. In the screenshot below, where views are NOT included, we see 42 assisting clicks and an assisted conversion value of €2,285.

Path length with no views

When we look at the same data WITH attribution of the ad views, we see the ads assisting the goal 895 times - over 20 times more! And the assisted conversion value is €33,716, which is €31,431 higher than in the previous screenshot!

Path length including ad views

Without attributing the views, the display channel seems worthless. A wrong conclusion would be to decrease media spend for that channel and shift it to the "performance" channels. This could result in a loss of awareness in the upper funnel, which could lead to lower conversions and ROI: a fatal error for most businesses out there.

But what is the value of a view?

We often get asked: "But does a simple ad view have the same value as a click?" This is a valid question, as a click is a clear indication of interest, whereas a view is only the technical delivery of an ad to the user's browser; we cannot determine whether it has really been seen or whether it was of any interest.

This is why the Google Analytics attribution modeling engine offers us the option to customize its models in multiple ways. In this article I want to emphasize the adjustments regarding ad views:

As shown in the screenshot below, we can define that

  • in general a view should only be attributed 50% of the value of a click (0.5)
  • however, if a view is followed by a click within 10 minutes, it should be counted as 150% (1.5)
  • and if a click did not lead to conversion directly but to a visit with a higher user engagement (time on site > x:xx), it should be valued at yyy%
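As an illustration only (this is not the actual attribution engine, just the arithmetic of the first two rules; the engagement rule is omitted because its thresholds are left open above), the view weighting could be expressed as:

```python
def view_credit(followed_by_click, seconds_to_click):
    """Credit multiplier for an ad view, relative to a click worth 1.0."""
    if followed_by_click and seconds_to_click <= 10 * 60:
        return 1.5  # a view that prompts a click within 10 minutes outweighs a plain click
    return 0.5  # a view on its own earns half the credit of a click
```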

Create attribution model

Conclusion

As we all know, the future of marketing is data driven. Having only a fraction of the data leads to suboptimal and sometimes inaccurate decisions. The 360° integration of all channels (click AND view interactions) and successful cross-device measurement (online AND offline) are the keys to success. Google Analytics Premium and DoubleClick Campaign Manager offer a supreme solution for this, with the power of data in a well-known user interface, enabling businesses to gain the right insights for the best strategies.

This article is also available in German: Full customer Journey Analyse mit Analytics und DoubleClick


Running Semi-Automated Tasks on Google Analytics


Tired of manually clicking through the Google Analytics interface to retrieve data? Want to take advantage of the Google Management API, but the functions you need just aren't available? Well then, let me share with you my temporary solution and long-term vision for creating an easy interface for such tasks.

As a web analytics implementation consultant at Cardinal Path, I work extensively with Google Analytics and Google Tag Manager. The standard approach that I often take before attempting any form of tagging or code changes is to take a peek at the client's existing configuration and data. This initial audit checks for data integrity, overlooked resources, and implementation practices.

My role offers me the opportunity to work with enterprise-level clients with hundreds of views. This means that I may end up manually clicking tens of thousands of times with a high likelihood for repetition when making a global change whether for maintenance or to resolve an issue.

As a web programmer, I refuse to let the web take advantage of me and thus started my hunt for an automatic and robust solution for repetitive tasks within Google Analytics.

A Sample use case: Adding Annotations to multiple views

A recent client audit revealed that the Annotations feature wasn't being used consistently, if at all, during certain time periods. This is a powerful Google Analytics feature that allows anyone analyzing data within the reports to understand the reasons for possible spikes in the data.

That this feature was overlooked does not come as a total surprise for this particular client, given the sheer number of views being maintained. As a result, we would like to offer insights on how to carry out a possible solution, as well as how to approach similar tasks in the future.

Using Google Chrome Console

In the aforementioned use case, the client needs a way to start annotating and, often, the same annotations are required for multiple views / properties. A quick way to achieve this is to automate a series of manual tasks in succession in the browser console. In the case of Chrome, I rely on my handy-dandy Chrome inspector (shortcut: CTRL + SHIFT + J on Windows, or CMD + OPT + J on Mac).

Before you drop the code provided below into your Chrome console, ensure you are inside the Google Analytics reporting interface and find the starting view by searching for its property in the dropdown menu at the top (as highlighted in the green box in the screenshot below). This step ensures the code runs through the list of views following and including the selected view for that property (the code stops running once it finds a new property).

Google Analytics Semi Automated Tasks

If I select the view MyLabel - David Xue - liftyourspirit david.com in the screenshot above, the code will affect all the views down to the last view in this property, MyLabel - David Xue - Youtube.

Running the semi-automated Google Analytics tasks

You may run the code after pasting it into the console tab and pressing 'enter'. Caution: this will add annotations to your views, so be sure you are in a test property with test views, or just delete them manually afterwards (better yet, modify the code to delete across the property).

count = 0;
start = setInterval(function() {
    var current_view = $('li[aria-selected=true]');
    setTimeout(function() {
        setAnnotation();
        if (current_view.next()[0] == undefined) {
            console.log('Swept ' + count + ' views');
            clearInterval(start);
        }
    }, 2000);

    setTimeout(function() {
        getNextViewInReporting();
        count++;
    }, 6000);
}, 8000);


getNextViewInReporting = function() {
    $('._GAx5').click();
    var current_view = $('li[aria-selected=true]');
    current_view.next().click()
}

setAnnotation = function() {
    var view = $('.ID-accounts-summary-1').text()
    var date = 'Sep 13, 2015' //Enter the date
    var annotation = "A test annotation" //Enter the annotation
    var visibility = "shared" //Default is shared, other is private
    $('#AnnotationDrawer_wrapper').css('display', '');
    setTimeout(function() {
        $('a._GAkAb._GAlo')[0].click()
        $('input[name="date"]').val(date) //"Sep 17, 2015"
        $('._GATb._GACm').find('tbody textarea[name="text"]').val(annotation)
        $('._GATb._GACm').find('tbody textarea[name="text"]').click()
        if (visibility == 'private') {
            $('#AnnotationsDrawer_private_radiobutton').prop("checked", true)
        } else {
            $('#AnnotationsDrawer_public_radiobutton').prop("checked", true)
        }
    }, 1000)

    setTimeout(function() {
        $('._GATb._GACm').find('form').find('a._GAE._GADq b b b').click();
        console.log(count + ' view: ' + view + ' added annotation: ' + annotation)
    }, 1800);
}

The actions that follow mimic how we would manually create the annotation. On the scheduled eight-second interval, the code first ensures the drawer near the timeline is open, so the 'Create new annotation' element is exposed. Then we click this element to expand the form and fill it with the pre-set data from our code. Lastly, we click the save button, and the process repeats until the last view in the property.

Please note that the code works as of October 2015, but Google may change its HTML markup, so make sure to test the code before you use it.

Further Development

By modifying the code provided above you can expand the use cases to include the following (but the sky's the limit!):

  1. Check for campaign data
  2. Check for social network data
  3. Check and edit view configurations
  4. Check and edit most of what is provided in the Google Management API; the reason for duplicating the API's functionality this way is that the API has a quota limit, which you can reference here
  5. Check, create and edit calculated metrics

Since we can semi-automate so much without relying on the Google Management API, my next step would be to create a long term solution in the form of a plugin that will automate and provide a quick summary of the general audit we typically perform at Cardinal Path. Note that Cardinal Path provides a more in-depth and personalized audit with our current clients, so definitely reach out if you would like to learn more.


Bulk Analytics Configurations with Google Sheets


Google Analytics is known for its simple, turnkey approach to getting started with analyzing your traffic data. Just drop the snippet on your page and GA does the rest! Right?

...right?

Ok, fine. While there are plenty of Google Analytics users who manage just one property, a growing chunk of the analytics market is composed of large companies with complex account structures with dozens upon dozens of properties. Managing things like filters, channel groups, goals, and custom dimensions across all of these entities is far from trivial, and time-intensive at best.

Take custom dimensions, for instance. Imagine having to edit custom dimensions across, say, 20 properties. You'll be moving from property to property, drilling into each dimension's settings and waiting for the save to happen before moving on to the next property. In short, you'll be clicking around the Admin section of the interface for quite a while!

Custom Dimensions settings

As it happens, most of the people in these large companies with complex requirements are unable to get through the day without opening up a spreadsheet. In fact, more often than I'm comfortable admitting, people have told me that what they want isn't necessarily a robust tool; they just want a tool that will allow them to get the job done, and ideally it would be in a spreadsheet, because that's where they live.

"But spreadsheet add-ons aren't robust enough for enterprise software."

Me: "Yeah, well, that's just like - you know - your opinion, man."

Look, enterprise data is messy. You have to manage it in a way that is sustainable and flexible, but getting it done is better than not. And for better or worse, in large organizations, people live in spreadsheets. By providing tools that let people get their jobs done with as gradual a learning curve as possible, larger organizations will be better positioned to use powerful features of Google Analytics in a way that would otherwise be too cumbersome to consider.

So, for this growing group of users, wouldn't it be great if they could manage the configuration of their custom dimensions in a spreadsheet, copy and paste the configurations across multiple properties (maybe with some slight differences), and then upload the whole thing back into Google Analytics?

Custom Dimension spreadsheet

Google Analytics Management API and Custom Dimensions

The Google Analytics Management API can be used to manage many things, including common entities such as custom dimensions. The API can be accessed through a Google Sheet using Apps Scripts, and Google Sheets automatically handles authentication for Google APIs. This makes Sheets add-ons a convenient way to distribute functionality for important business processes such as listing and updating custom dimension information in the tabular form to which spreadsheet users everywhere are accustomed. In fact, the API was built with the expectation that users would develop their own ways of accessing and processing their data.
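As a sketch of the kind of processing involved, here is how a Management API v3 customDimensions list response could be flattened into spreadsheet rows. The response below is a hand-written sample in the documented shape, not live data:

```python
# Sample response in the shape returned by the Management API's
# management().customDimensions().list(accountId=..., webPropertyId=...) call
sample_response = {
    "items": [
        {"id": "ga:dimension1", "name": "User Type", "index": 1, "scope": "USER", "active": True},
        {"id": "ga:dimension2", "name": "Author", "index": 2, "scope": "HIT", "active": True},
    ]
}

HEADER = ["id", "name", "index", "scope", "active"]

def to_sheet_rows(response):
    """Flatten the API response into a header row plus one row per dimension."""
    rows = [HEADER]
    for item in response.get("items", []):
        rows.append([item.get(field) for field in HEADER])
    return rows
```

A real add-on would write these rows into a sheet range and reverse the process on upload, but the tabular shape is the heart of it.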

One of the configurations that the Management API enables you to manage from your own systems (as opposed to the Google Analytics interface) is the Custom Dimension feature. This is an important feature that allows you to add custom data to the information Google Analytics is automatically getting for you. For example, you can add a dimension to capture:

  • The type of users (silver, gold, platinum)
  • The level of engagement in the current session (maybe based on scroll percentage)
  • The name of the author on an article page

If you do not use this feature, take a look at these 5 questions.
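To make the custom dimension idea concrete, here is a hedged sketch of a Measurement Protocol pageview hit that attaches an article author as a custom dimension. This assumes the "author" dimension was created at index 1 (`cd1`) with hit scope; the tracking ID, client ID, and page path below are placeholders, and on a real site the value would normally be set in the on-page tracking code instead.

```python
from urllib.parse import urlencode

def pageview_with_author(tracking_id, client_id, page_path, author):
    """Build a Measurement Protocol pageview hit carrying the article
    author as custom dimension 1 (hypothetical index)."""
    params = {
        "v": "1",            # protocol version
        "tid": tracking_id,  # UA-XXXXX-YY
        "cid": client_id,    # anonymous client ID
        "t": "pageview",     # hit type
        "dp": page_path,     # document path
        "cd1": author,       # custom dimension at index 1 (assumed)
    }
    return "https://www.google-analytics.com/collect?" + urlencode(params)
```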

But when you're a marketing organization with limited engineering resources, who is going to write a robust tool to manage these entities at scale in a way that is easy to use and gets the job done?

Your friendly neighborhood Googler, that's who!

Working with Custom Dimensions in Google Sheets

With that in mind, I rolled up my sleeves and started working on an add-on that would help users manage their custom dimensions in a more robust and organized way, while using an interface they are comfortable with. What I came up with can be found in the add-on store: GA Management Magic

Below I provide a step-by-step guide on how to use the add-on to manage your custom dimensions, but if you are the video type of person, you can also see me using the add-on in the following screencast.

1. Install the GA Management Magic add-on

The add-on is available through the add-on store

2. Listing Custom Dimensions

To list custom dimensions from a property, run the List custom dimensions command from the add-on menu (see screenshot below). Enter the property ID from which to list custom dimension settings into the prompt.

Google Analytics configuration add-on

A new sheet will be added, formatted, and populated with the values from the property. You're welcome!

3. Updating Custom Dimensions

To update custom dimension settings within 1 or more properties, run the Update custom dimensions command from the add-on menu (see screenshot above). Enter the property IDs (separated by commas) of the properties that should be updated with the custom dimension settings in your sheet.

The properties listed in the sheet will be updated with these values. Neat, right?

If you have not named the range(s) as described above, the script will format a new sheet for you into which you can enter your custom dimension settings. It is also recommended that you not update blank values into the property as it may result in undesirable behavior.

The code for this add-on is available on GitHub. Feel free to grab, improve and share!
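The real add-on is written in Apps Script, but the transformation it performs is easy to sketch. The hypothetical Python helper below shows the shape of the `customDimension` resource bodies the Management API expects when updating from spreadsheet rows; column names and ordering are assumptions for illustration.

```python
def rows_to_custom_dimensions(rows):
    """Convert spreadsheet rows of (name, scope, active) into
    Management API customDimension resource bodies.

    Hypothetical sketch: the real add-on does this in Apps Script.
    Scope must be one of HIT, SESSION, USER, or PRODUCT.
    """
    dimensions = []
    for index, (name, scope, active) in enumerate(rows, start=1):
        dimensions.append({
            "index": index,
            "name": name,
            "scope": scope,
            "active": str(active).upper() == "TRUE",
        })
    return dimensions

# Each body would then be sent to the Management API, e.g. via
# analytics.management().customDimensions().update(...) in the
# google-api-python-client, once per property ID.
```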


Measuring Page Velocity with Google Analytics


This article was contributed by Bill Tripple, Senior Consultant of Digital Intelligence and Alex Clemmons, Manager of Analysis & Insights, Digital Intelligence, both from the award winning digital data analytics firm, Cardinal Path.

One of the most basic questions asked by marketers is: "How much content is being consumed on my website?" Traditionally, content consumption has been measured through the use of pages per session (total pageviews / total sessions). This metric has served as a simple barometer for content consumption for many years, but it has its limitations: namely that we can't drill down to easily see which page(s) on the website are actually driving additional content consumption vs. which pages are just the most popular.

Enter Page Velocity, a custom built metric within Google Analytics that allows us to drill deeper and understand which pages have the greatest influence in driving users deeper into a website. Through the use of page velocity we have the ability to see that Page A drives, on average, five additional page views, whereas Page B only drives three. Perhaps we optimize some of our landing pages to take elements of Page A and test them on Page B. Or divert media traffic from one page to another.

As you can imagine, this metric can be very useful for content-driven websites that depend on advertising dollars, as we can now look into which pages are driving the highest ROI (propelling users deeper into the website).

Defining Page Velocity

The basic principle of page velocity is as follows: Page Velocity = (number of pages seen after the current page is viewed / unique page views to the current page)

The following example measures page velocity from three sessions:

Page Velocity Calculation

Page Velocity Values

From here you can start to see how this comes together. Page A is tied with page B for the highest Page Velocity. Both were seen within two sessions and both drove a total of 9 additional page views. Page G is on the low end with a velocity of 0. It was seen within two sessions but did not drive any additional page views.
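The formula can be sketched in Python. This is a simplified model of the metric described above: each page's velocity is measured from its first view in a session (so loopbacks are not double-counted), which matches the "deepest page reached" behavior rather than any official GA calculation.

```python
def page_velocity(sessions):
    """Compute Page Velocity per page.

    sessions: list of page-path lists, one per session, in view order.
    Velocity = pages seen after the page's first view, summed across
    sessions, divided by the page's unique pageviews.
    """
    pages_after = {}   # page -> total pages viewed after it
    unique_views = {}  # page -> number of sessions it appeared in
    for pages in sessions:
        seen = set()
        for position, page in enumerate(pages):
            if page in seen:
                continue  # count only the first view per session
            seen.add(page)
            remaining = len(pages) - position - 1
            pages_after[page] = pages_after.get(page, 0) + remaining
            unique_views[page] = unique_views.get(page, 0) + 1
    return {page: pages_after[page] / unique_views[page] for page in pages_after}
```

For example, a single session A -> B -> C gives A a velocity of 2, B a velocity of 1, and C a velocity of 0.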

Of course you can also start to imagine where this metric has its flaws in that some pages will have a low Page Velocity by design (like the thank you page of a form, for example). Understanding the purpose of each page will be critical for successful analysis using this metric.

Using Page Value to Measure Page Velocity

The measurement of Page Velocity takes advantage of the Page Value metric within Google Analytics. Before diving into specifics, it's important to understand how Google Analytics evaluates Page Value. In Google's official documentation they provide an example where page B would receive a page value of $110 ((10+100) /1 session):

Google Analytics Page Value

To measure Page Velocity, we will need to send an Ecommerce transaction with an arbitrary value of $1 on every pageview so it receives credit for the future pageviews as well.

If you are already using Ecommerce, this will inflate your real Ecommerce metrics / reports, so we recommend that you create one view specific for page velocity, and filter out these Ecommerce transactions in your remaining views. Additionally, you'll want to filter legitimate Ecommerce transactions from your Page Velocity view.
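As a hedged illustration of the trick, here is what the $1-per-pageview hit might look like expressed as a Measurement Protocol transaction payload. On a real site this would typically be sent from the page's tracking code (e.g. the Ecommerce plugin) rather than built server-side, and the tracking ID, client ID, and path are placeholders.

```python
import uuid
from urllib.parse import urlencode

def velocity_transaction_payload(tracking_id, client_id, page_path):
    """Build a Measurement Protocol transaction hit with $1 revenue,
    sent once per pageview so Page Value accumulates as Page Velocity.

    Sketch only; illustrates the hit, not the recommended tagging.
    """
    params = {
        "v": "1",                # protocol version
        "tid": tracking_id,      # UA-XXXXX-YY
        "cid": client_id,        # anonymous client ID
        "t": "transaction",      # hit type
        "ti": uuid.uuid4().hex,  # unique transaction ID per pageview
        "tr": "1.00",            # arbitrary $1 revenue
        "dp": page_path,         # page the hit is associated with
    }
    return "https://www.google-analytics.com/collect?" + urlencode(params)
```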

Using this trick, we can now answer the question of which pages on your website are actually driving visitors the deepest. The next step is to segment your visitors, to further optimize existing and future content. Below you will find a pages report showing our Page Velocity metric via the Page Value column.

Page Velocity Report

Even though we're seeing a dollar value, this really represents the velocity. So the page in row 10 of the above screenshot drives an average of 5.27 additional pages.

Going a level deeper with segmentation

The next step is to segment your visitors, looking at page velocity by different source/mediums, by custom dimensions, etc. You can find two useful examples below.

1. Pages segmented by "Medium" with a secondary dimension

Segmenting Page Velocity

2. Pages segmented by "Landing Pages" with a secondary dimension

Landing Page Velocity alongside bounce rate can be very telling. For example, a page with a low bounce rate but also a low Page Velocity may be one to look deeper into.

Landing Pages Velocity

Additional use cases

At first glance, Page Velocity may seem to be useful only for content-heavy sites. But this is not the case. We have found that marketers across verticals have an appetite for this metric including in Ecommerce, Banking, Finance, Higher Education, Non-Profit and many others using it in various capacities. One major benefit of the metric is that it is completely customizable to the needs of the website and stakeholders utilizing it.

For example, on an Ecommerce site you may want to understand which products are driving additional research. Tailoring Page Velocity to only be captured on research focused pages (say a product detail page and a product gallery page) would allow us to tailor this metric to be something like Product Research Velocity.

About the Authors

Bill Tripple is a Senior Consultant of Digital Intelligence at Cardinal Path; his areas of expertise include analytics implementations for both Google and Adobe products, and he is convinced that he can track nearly anything. He is certified in Google Analytics, Adobe Analytics, and Eloqua. His current and past experience in the digital marketing and development industries has given him a competitive edge in relating to both marketers and developers. Learn more about his professional experience on LinkedIn.

Alex Clemmons is a Manager of Analysis & Insights, Digital Intelligence. He leads day-to-day operations across multiple clients at the award-winning digital data analytics firm, Cardinal Path. He has a passion for finding meaning in mountains of data and will jump at any opportunity to help clients apply their data to drive results. His areas of expertise include measurement strategy, testing, and deep-dive analysis focused on identifying areas for optimization. Learn more about his professional experience on LinkedIn.


Explaining Google Analytics to Your Boss


Opening Google Analytics for the first time can be overwhelming. Graphs and reports, never-ending menus, and configuration settings that you may or may not need to know about; it's all there waiting for you. Are you prepared to speak confidently about what you see in your Google Analytics?

Generally speaking, you'll find two main types of articles about Google Analytics: setup and reporting. Setting up the tracking on your website starts easily enough, but can quickly take on barnacles as you encounter challenges with your particular site, third-party vendors, and multiple systems, just to name a few. Reporting seems like it should be much simpler: everyone gets the same set of reports – yours just happen to contain data about your website, assuming proper setup.

In practice, reporting brings its own unique difficulties. Even if you didn't set up the tracking on your site, you still need to understand how the data is collected and processed to understand the data you've been tasked with interpreting.

This guide is meant to cover the basics about how Google Analytics works, what the numbers actually mean, and how you should begin to report on it. If you've used Google Analytics extensively, forgive me for the review, but I feel it's worth re-familiarizing yourself with the core concepts and definitions, if only to solidify your understanding.

Setting Up Our Google Analytics

Let's start at the very beginning. Inside the GA interface, you first create an Account. This will usually correspond with your company name. Inside the Account, you create separate Properties for each website that you own that you'd like to track.

Each website gets assigned its own Property ID, which is how GA will keep your data organized. This ID looks something like UA-XXXXX-YY. Think of the Property like an email inbox and the ID like the email address. You send data to this particular ID and GA collects it for you.

Underneath Properties are Views, which are different ways to view the data that was collected in the property. In your email inbox, you may sort emails into different folders or use tags to identify them. Similarly in GA, we can sort data into different Views so that we can easily look at a smaller section of data. All of the data lives at the Property level, Views are just different pre-sorted or pre-formatted ways to look at it. Here is an example from the Google Analytics Help Center.

Google Analytics Hierarchy

How Does Your Data Get To Google Analytics?

That's it for the interface! Let's talk about how you send data to the right place. GA automatically generates a little bit of JavaScript for you that you then need to place on every page of your website. How you add this is very specific to how your site was built; there are a number of ways to accomplish it, like plugins for popular platforms such as WordPress, and systems like Google Tag Manager that make it easier to add Google Analytics to your site.

Once the code is installed correctly (we won't cover that here), it will immediately start sending data to GA. Again, let's go back to the email address metaphor. The code automatically sends small pieces of data to GA to track what pages are loaded and information about the person and browser loading the page. These pieces of data are called hits and are sent to that unique UA-XXXXX-YY ID that is specific to your site. Here is how to find your tracking code.

This isn't, however, like a video where GA can see someone's mouse moving around on the screen. By default, information gets collected only when the page is loaded and the hit is sent to GA. This is important when we get to how metrics are defined below.

Potential Tracking Challenges

Let's be upfront about a few issues that may occur. Due to the nature of this type of tracking (called client-side tracking) these numbers will never be 100% complete. Google Analytics isn't all-knowing and all-seeing. There are a number of things that may affect data from ever being sent to GA, but these usually only impact a small percentage of your traffic.

Depending on how your particular users access the site, there are several reasons why the data may not be sent to GA. JavaScript is required to execute the GA code that tracks users accessing your site. This feature can be turned off in a particular browser by a person or the company that owns the device, though arguably many pages on the internet would flat out stop working without JavaScript.

For a typical implementation, GA also requires the ability to store first-party cookies on the person's device. First-party cookies are small pieces of data that are stored on the user's computer to help remember if they've been to the site before. This is how GA determines New vs Returning Users. Generally, first-party cookies are considered trustworthy - they can only be accessed by the page that set the cookie. If there's an issue with cookie storage, several metrics will be severely affected.

There are technologies out there to help block advertising and web tracking like Google Analytics. There are many motivations why someone would actively try to block Google Analytics, I've heard everything from privacy concerns to data usage. Unfortunately, there's not much you can do from a technical perspective. If someone blocks Google Analytics from loading properly, then they won't be included in your Google Analytics data. This goes for regular site traffic, as well as ecommerce and other important conversions you may have defined.

Then there's the implementation – it's amazing how often people accidentally install the same tracking more than once or make other small mistakes with big impacts. It should go without saying, but if Google Analytics isn't set up properly, the numbers you get back won't be accurate.

Types of Reports

There are four main categories of reports in Google Analytics and they answer very different questions.

Audience

These reports tell you about the users that accessed your website. More specifically, they tell you everything that can be gleaned just from the user arriving on your site: what type of device they're on, whether they've been here before, and where they're coming from geographically. Note that you won't see personal information here. (We're back to Google Analytics not being omnipotent!) There are also Demographics and Interests reports that can be turned on, but those use Google's best guesses and estimates for people's gender, age, interests, and affinities.

Acquisition

These reports help determine how someone arrived at your site. Without going into too much detail, this is determined by looking at what was the previous page someone was on before they arrived on your site. Users are placed into different channels like Organic, Paid Search, Social, and Referral.

Behavior

This section focuses on what users did on your website. By default, you get information about what pages people look at – which landing pages are most popular, how long people spend on specific pages, etc. With some setup, you can also see other actions that take place on your site like downloads and site search.

Conversions

This last group of reports is your chance to tell Google what's most important to you. You can define certain pages or actions that you hope visitors to your site will accomplish, and then be able to see how often those conversions occur. You can also track ecommerce purchases and related actions. This section requires configuration to make it specific to your site.

What Should You Report On?

So you've got your site tracking set up properly and you can see the data flowing into Google Analytics – now what?

It's important to lay out what you'll be tracking and why. Don't get hung up on specifics that are too small to matter. Rather, focus on comparing time periods and trying to identify places for additional research. If traffic went up from last month to this month, where did you see your most growth? Mobile vs Desktop? Were there specific types of content that did better or a particular channel that sent more traffic this month?

Monthly reporting is common but problematic. Keep in mind the complexities that come with months of different lengths. Most website traffic ebbs and flows with the work week, so it would make sense that if one month has more weekdays then it would have more traffic. Holidays throw a wrench into calculations and depending on your industry, so could world news or political events. Often these are the simplest explanations for differences between two months.

We talk often of Measurement Strategies and forward thinking. Try to anticipate decisions you could make with the right data and then create the reports to help influence future decision making. If you're a content or service website, knowing which pages are getting the most traffic can help influence future articles or business opportunities. Traffic performance can influence paid marketing campaigns, and with the right tracking, you can answer questions about how various channels are driving conversions that may influence budgeting discussions.

You can also use reporting to identify potential errors on your website. Hopefully, most errors are due to tracking code issues and easily corrected, but you should also be using reporting to monitor if your site is performing appropriately across browsers and devices. Large spikes or dips can identify areas for future research.

Common Google Analytics Metrics

When you see the term Metrics, think of any sort of number or count in Google Analytics. The most common metrics that we use to report on are listed below.

Users

First, it is important to understand that users do NOT tell us the number of PEOPLE that arrive on our website. In practice, users is a count of the number of unique devices that access our website. Even more specifically, a unique browser on a unique device. Remember those cookies we talked about earlier? Each set of cookies is a different user.

Think about your own digital life – how many computers/devices do you use during the day to access the internet? Work PC vs Home PC? Phone vs Laptop? Each different device counts as a different user. There are ways to get this number to more accurately represent people instead of devices, but that requires additional setup and a situation where you know someone's actual identity (for instance if they log in to your website.)

Typically, this number is higher than it should be. We have more users than actual people visiting the website and some tracking issues will artificially increase this number. People can clear their cookies and get new computers. It's still worth reporting on, but be clear when talking about this particular metric.

Sessions

A session is all of a user's activities on your site within a given time period. If I come to your website and view five pages, that is all grouped into my one session. Remember that GA doesn't have a live camera feed to watch someone browse your site, so there's really no way for it to know when a person leaves your site. It determines that a session is over after a user has been inactive for more than 30 minutes.
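The timeout rule can be sketched as a small sessionization routine. This is a simplified model: real GA also starts a new session at midnight and when the campaign or traffic source changes, which the sketch below ignores.

```python
THIRTY_MINUTES = 30 * 60  # session timeout, in seconds

def count_sessions(hit_timestamps):
    """Group one user's hit timestamps (seconds, ascending) into
    sessions, starting a new session after 30+ minutes of inactivity.
    """
    sessions = 0
    last = None
    for ts in hit_timestamps:
        if last is None or ts - last > THIRTY_MINUTES:
            sessions += 1  # first hit, or gap exceeded the timeout
        last = ts
    return sessions
```

So hits at 0s, 60s, and 2000s count as two sessions: the 1940-second gap exceeds the 30-minute timeout.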

Each session gets attributed back to a specific channel in the Acquisition reports, so if someone arrives on our site from Social media or a Google search, all of the activity in that particular session gets credited to that particular channel. If they come back from a different source (or after 30 minutes of inactivity) then a new session is started.

This is a great metric to track and report on. We clearly want to see more sessions coming to our site, and sessions is a great indicator of activity on the site.

Pageviews

This counts how many pages are viewed on the website, pretty easy, right? For general reporting, month over month, it's an OK metric to use to see ups and downs. Keep in mind what it's really measuring, though. If you have "hub" pages, like your homepage, which people branch off from and return to frequently, your pageview numbers will go up, but you haven't necessarily increased value from those extra pageviews.

Typically, this number is higher than it should be, because it includes multiple views of the same page, even during the same session. Use it as a benchmark month over month or year over year, but for more in-depth analysis of individual pages, use the Unique Pageviews metric.

Avg. Session Duration

This is one of the most misunderstood metrics. Ideally, Session Duration would be just that – how long, on average, users spent on our site. Instead, it reports how much time we measured users spending on our site. It may seem obvious, but it's worth making the distinction. Remember that, by default, data is only sent to GA on page load. Everything after that page load is a mystery until the user visits another page.

Typically, this number is lower than it should be. You know people are spending more time on your site, and you can take steps to make this number more accurate. Think of a digital image: the more data you have, the clearer the picture becomes. You can add events to track engagements like downloads or video plays, which will in turn give GA more data and make Avg. Session Duration a more accurate calculation.

Bounce Rate

This metric tells you the percentage of sessions on your site that completed only one action. Typically this means how many people arrived on your site and then left without doing anything else, or "bounced." This metric is extremely helpful for gauging the effectiveness of landing pages or of specific channels.

(Sidenote – I rarely give "good" and "bad" numbers for metrics, as every site and industry is unique, but if your bounce rate is very low, it can be a sign that you have a tracking issue. It'd be nice to think that your site has a 5% bounce rate, but most often that's not the case.)

Typically, this number is higher than it should be. Again, we have an issue that can be solved with more data. By default, the GA code tracks only when pages are loaded. If someone arrives on your page and leaves after 2 seconds, we want that to be counted as a bounce. If they stay for 10 minutes reading an article and then leave, they'll still be counted as a bounce, because GA has no idea they were there that long. Adding additional events will not only move your Avg. Session Duration closer to accurate, but will also help clarify your bounce rate.

Bounces are also included in the time-on-site calculations as zeros, which will really bring down your average.
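The definition above can be sketched in a few lines. This is a simplified model: it treats a session as a count of interaction hits (pageviews plus any events not flagged as non-interaction), and a bounce as a session with exactly one such hit, which is how adding events lowers the measured bounce rate.

```python
def bounce_rate(session_hit_counts):
    """Percentage of sessions with exactly one interaction hit.

    session_hit_counts: list with one interaction-hit count per session.
    """
    if not session_hit_counts:
        return 0.0
    bounces = sum(1 for hits in session_hit_counts if hits == 1)
    return 100.0 * bounces / len(session_hit_counts)
```

For example, four sessions with 1, 1, 3, and 5 interaction hits give a 50% bounce rate; tracking an extra scroll or video event in one of the single-hit sessions would drop it to 25%.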

Taking The Next Step

Reporting through Google Analytics can be a rewarding experience when paired with active decision-making based on those reports. It's important to know how your website is performing, but even more so, the insight gleaned from Google Analytics reports can influence strategy, affect budgets, guide development, and more.

There will always be ways to improve your data. You can collect more data, you can collect better data, and even after all of that, you can address data issues that make reporting more difficult.

Take the time to know what is being tracked and how the numbers are calculated to help clarify your own understanding of Google Analytics and how it can impact your website.

For continued education, there are a number of great options out there. Google's Analytics Academy is a great (free!) option for those who enjoy self-guided learning. Our company, as well as many others, offer in-person trainings around the world, covering everything from beginner-level reporting to advanced implementation and Google Tag Manager setup.

Of course, there's no better way to learn than to start doing. It takes a company-wide commitment to identify the decisions that can be helped with data from Google Analytics and then to put it into practice.


Funnel Analysis with Google Analytics Data in BigQuery


Conversion funnels are a basic concept in web analytics, and if you've worked with them enough, you may have gotten to a point where you want to perform a deeper analysis than your tools will allow.

"Which steps in my funnel are being skipped? What was going on in this funnel before I defined it? Which user-characteristics determine sets of segments across which progression through my funnel differs?" These questions can be answered using the solution described in this article. In particular, I'm going to talk about how to use BigQuery (BQ) to analyze Google Analytics (GA) page-hit data, though the principles could be applied to any page-hit data stored in a relational database.

The Google Analytics Funnel Visualization report (see below) makes certain abstractions and has certain limitations, and advanced users can benefit through the use of Google BigQuery (BQ) - an infrastructure-as-a-service offering which allows for SQL-like queries over massive datasets.

Funnel Analysis

In this article, we'll discuss the benefits of using BigQuery for funnel analysis as opposed to the Google Analytics user interface. In order to make the solution clear I will go over the basic structure of an SQL query for funnel analysis and explain how to use Funneler, a simple Windows application to automate query-writing. The source code of Funneler is also provided as a Python 3 script. Please note that in order to use the specific examples provided here you will need a Google Analytics Premium account linked to BigQuery (learn more about the BigQuery Export feature).

Funnel Analysis - Google Analytics UI vs. BigQuery

The solution I propose below works as follows: using a Windows application (or Python script), a BigQuery-dialect SQL query is generated which tracks user sessions through a set of web properties, optionally segmenting and/or filtering the sessions based on session characteristics. BigQuery's output is a table with two columns per funnel stage: one for session counts and one for exit counts.

Below is a list of the most significant differences between GA Funnel Visualization and the solution I will be discussing.

  1. Loopbacks: If a user goes from steps 1 -> 2 -> 1, GA will register two sessions: one which goes to step 1, one which goes to step 2, and an exit from step 2 to step 1. Our query will only count one session in the above instance: a session which goes from step 1 to step 2. Furthermore, since progress through the funnel is measured by the "deepest" page reached, the above scenario will not be distinguished from a session which simply goes from step 1 -> 2.
  2. Backfilling funnel steps: GA will backfill any skipped steps between the entrance and the exit. This solution will only register actual page-hits, so you'll get real numbers of page-hits.
  3. Historical Information: GA Funnels cannot show historical data on a new funnel, whereas this workflow can be used on any date range during which GA was tracking page-hits on the selected funnel-stage pages.
  4. Advanced Segmentation: GA Funnels don't support advanced segmentation, whereas with Group By clauses in BigQuery, you can segment the funnel on any column.
  5. Sampling: GA Funnel Visualization shows up to 50,000 unique paths, whereas BQ will contain all the page-hits that GA recorded, and allow you to query them all.

The Query

For Google Analytics data, the basis of a funnel query is a list of URLs or Regular Expressions (regex), each representing a stage in the conversion funnel.

If you have a pre-existing funnel in GA, follow the steps below to find your funnel settings:

  1. Go to Admin in GA
  2. Select the correct Account, Property, and View
  3. Go to Goals
  4. Select a Goal
  5. Click Goal Details

In this screen you will find a regex or URL for each step of the funnel. They may look like this: "/job/apply/".

The basic process of writing the query, given the list of regexes or URLs, is as follows:

1. Create a base-level subquery for each regex

For each row which has a regex-satisfying value in the URL column, pull out fullVisitorId and visitId (this works as a unique session ID), and the smallest hit-number. The smallest hit-number just serves as a non-null value which will be counted later. The result sets of these subqueries have one row per session.

SELECT
  fullVisitorId,
  visitId,
  MIN(hits.hitNumber) AS firstHit
FROM
  TABLE_DATE_RANGE([<id>.ga_sessions_], TIMESTAMP('YYYY-MM-DD'),
    TIMESTAMP('YYYY-MM-DD'))
WHERE
  REGEXP_MATCH(hits.page.pagePath, '<regex or URL>')
  AND totals.visits = 1
GROUP BY
  fullVisitorId,
  visitId

2. Join the first subquery to the second on session ID

Select session ID, hit-number from the first subquery, and hit-number from the second subquery. When we use full outer joins, we're saying sessions can enter the funnel at any step. To count sessions at each stage that have only hit a previous stage, use a left join.


SELECT
  s0.fullVisitorId,
  s0.visitId,
  s0.firstHit,
  s1.firstHit
FROM (
  # Begin Subquery #1 aka s0
  SELECT
    fullVisitorId,
    visitId,
    MIN(hits.hitNumber) AS firstHit
  FROM
    TABLE_DATE_RANGE([<id>.ga_sessions_], TIMESTAMP('2015-11-01'),
      TIMESTAMP('2015-11-04'))
  WHERE
    REGEXP_MATCH(hits.page.pagePath, '<regex or URL>')
    AND totals.visits = 1
  GROUP BY
    fullVisitorId,
    visitId) s0
  # End Subquery #1 aka s0
FULL OUTER JOIN EACH (
  # Begin Subquery #2 aka s1
  SELECT
    fullVisitorId,
    visitId,
    MIN(hits.hitNumber) AS firstHit
  FROM
    TABLE_DATE_RANGE([<id>.ga_sessions_], TIMESTAMP('2015-11-01'),
      TIMESTAMP('2015-11-04'))
  WHERE
    REGEXP_MATCH(hits.page.pagePath, '<regex or URL>')
    AND totals.visits = 1
  GROUP BY
    fullVisitorId,
    visitId) s1
  # End Subquery #2 aka s1
ON
  s0.fullVisitorId = s1.fullVisitorId
  AND s0.visitId = s1.visitId

3. Join the third subquery to the result of the above join on session ID

Select session ID, hit-number from the first subquery, hit-number from the second subquery, and hit-number from the third subquery.

4. Join the fourth subquery to the result of the above join on session ID

Select session ID, hit-number from the first subquery, hit-number from the second subquery, hit-number from the third subquery, and hit-number from the fourth subquery.

5. Continue until all subqueries are joined in this way

6. Aggregate results

Instead of a row for each session, we want one row with counts of non-null hit-numbers per funnel-step. Take the query so far, and wrap it with this:

SELECT
  COUNT(s0.firstHit) AS _job_details_,
  COUNT(s1.firstHit) AS _job_apply_
FROM (
  (query from 2. goes here if the funnel has two steps))

The query has a recursive structure, which means that we could use a recursive program to generate the query mechanically. This is a major advantage, because for longer funnels, the query can grow quite large (500+ lines for a 13-step funnel). By automating the process, we can save lots of development time. We'll now go over how to use Funneler to generate the query.
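A much-simplified sketch of that mechanical generation is shown below. This is not Funneler itself (which also handles segmenting and filtering): it only assembles one step-subquery per regex and chains the joins, but it illustrates why automating the string manipulation pays off for long funnels.

```python
def build_funnel_query(table, start, end, regexes):
    """Generate a legacy-SQL BigQuery funnel query from a regex list.

    Simplified sketch of what Funneler automates: one subquery per
    funnel step, full-outer-joined on session ID, then aggregated.
    """
    def step_subquery(regex):
        return (
            "SELECT fullVisitorId, visitId, MIN(hits.hitNumber) AS firstHit "
            "FROM TABLE_DATE_RANGE({t}, TIMESTAMP('{s}'), TIMESTAMP('{e}')) "
            "WHERE REGEXP_MATCH(hits.page.pagePath, '{r}') "
            "AND totals.visits = 1 "
            "GROUP BY fullVisitorId, visitId"
        ).format(t=table, s=start, e=end, r=regex)

    # Chain each step onto the running join, as in steps 2-5 above.
    joined = "({q}) s0".format(q=step_subquery(regexes[0]))
    for i, regex in enumerate(regexes[1:], start=1):
        joined = (
            "{left} FULL OUTER JOIN EACH ({q}) s{i} "
            "ON s0.fullVisitorId = s{i}.fullVisitorId "
            "AND s0.visitId = s{i}.visitId"
        ).format(left=joined, q=step_subquery(regex), i=i)

    # Wrap with the aggregation from step 6.
    counts = ", ".join(
        "COUNT(s{i}.firstHit) AS step_{i}".format(i=i)
        for i in range(len(regexes))
    )
    return "SELECT {c} FROM {j}".format(c=counts, j=joined)
```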

Funneler

Funneler is an executable Python script (no need to have Python installed) which, when fed a json containing a list of regexes or URLs, generates the SQL query in the BigQuery dialect to build that funnel. It manipulates and combines strings of SQL code recursively. It extends the functionality of the query described in section 2 and it allows for segmenting and filtering of sessions based on any column in the BigQuery table.

Funneler and funneler.py can be found on my Github page (https://github.com/douug).

The input to Funneler is a json document with the following name/value pairs:

  • Table name, with the following format: [(Dataset ID).ga_sessions_]
  • Start date: 'YYYY-MM-DD'
  • End date: 'YYYY-MM-DD'
  • List of regexes: one regex per funnel-step
  • Segmode: True for segmenting, False otherwise
  • Segment: The column to segment on
  • Filtermode: True for filtering, False otherwise
  • Filtercol: The column to filter on
  • Filterval: The value to filter on in the above-mentioned column

Here is an example of an input json:


{
  "table": "[123456789.ga_sessions_]",
  "start": "'2015-11-01'",
  "end": "'2015-11-04'",
  "regex_list": ["'/job/details/'",
        "'/job/apply/'",
        "'/job/apply/upload-resume/'",
        "'/job/apply/basic-profile/'",
        "'/job/apply/full-profile/'",
        "'/job/apply/(assessment/external|thank-you)'"],
  "segmode": "True",
  "segment": "device.deviceCategory",
  "filtermode": "False",
  "filtercol" : "hits.customDimensions.index",
  "filterval" : "23"
}

Please note the quoted quotes (e.g. in the elements of the value of the key "regex_list" above). These are included because after the json is ingested into a Python dictionary, the Python strings may contain SQL strings, which themselves require quotes. But, the value of the key "filterval" has no inside quotes because 23 is of type int in SQL and wouldn't be quoted.
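A tiny illustration of what the quoting buys you once the json is ingested into Python (made-up minimal config, not a full Funneler input):

```python
import json

# A pared-down, illustrative version of the input json above.
raw = '{"regex_list": ["\'/job/details/\'"], "filterval": "23"}'
cfg = json.loads(raw)

# The inner single quotes survive, so the Python string is already a
# quoted SQL string literal, ready to drop into the generated query:
print(cfg["regex_list"][0])   # '/job/details/'

# filterval carries no inner quotes because 23 is an int in SQL:
print(cfg["filterval"])       # 23
```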

To run Funneler, go to \dist_funneler\data. Open input.json and modify the contents, then go back to \dist_funneler and run funneler.exe. Three files should appear - std_error.log, std_out.log (which contains feedback about whether Segmode or Filtermode are engaged, and where the generated query can be found), and your query. Copy and paste your query into BigQuery. Try starting with a short funnel, as it may take a few tries to format the input correctly.

Alternatively, if you are running funneler.py, it can be executed from the command line with the following:

python funneler.py input.json

In this case, the contents of the above-mentioned std_error.log and std_out.log files will appear in-console. The query can then be copied into your BigQuery instance. The resulting table should have two columns per regex/funnel-step - one for hits, and one for exits - and a single row. If segmode is set to True, there will instead be a row per value in the segment column.

Hopefully these tools help you to quickly create complex queries and meet analysis objectives to perform deeper analysis of GA page-hit data.


DoubleClick for Publishers & Google Analytics Premium


Recently the Google Analytics team released two new integrations: DoubleClick for Publishers (DFP) for Google Analytics Premium and DoubleClick Ad Exchange (AdX) for all Google Analytics users. While these new integrations were not widely publicized, I believe they are major game changers: they effectively embrace Publishers as first-class citizens, providing a robust solution to measure and optimize ad-supported websites.

Up to July 2015, Google Analytics provided only one integration for publishers, with AdSense (did you read my book?), where they could analyze AdSense effectiveness and find insights to optimize results. If you were serving only AdSense ads on your site, the Google Analytics integration worked well and you were all set.

However, DoubleClick for Publishers (DFP) is a widely used solution to serve Direct Deals, AdSense and DoubleClick Ad Exchange (AdX). That's where the DFP integration enters the scene. Before this integration, only AdSense metrics were available, but since July 2015 you can report on Ad Exchange (for all Google Analytics users) and DFP (Google Analytics Premium only). This means that Publishers can now understand the intersection of their content and monetization strategies. It also means that a user who left the website through a click on a DFP or AdX unit, previously counted as a simple abandonment, is now reported as an ad click - a considerable improvement in both the accuracy and completeness of your Google Analytics reporting.

Below is a quick explanation of what those new integrations will bring to publishers when it comes to understanding and reporting new data.

Publisher Metrics

Publisher Metrics

Following the integration, you will have access to dozens of new metrics in Google Analytics, which can be seen in the interface or used in Custom Reports and the Segment Builder. The metrics mirror the ones you already see for AdSense, with equivalent new ones for Ad Exchange. Below is a list of the overall Publisher metrics and their official definitions.

  • Publisher Impressions: An ad impression is reported whenever an individual ad is displayed on your website (AdSense, AdX, DFP). For example, if a page with two ad units is viewed once, Google will report two impressions.
  • Publisher Coverage: Coverage is the percentage of ad requests that returned at least one ad. Generally, coverage can help you identify sites where your publisher account (AdSense, AdX, DFP) isn't able to provide targeted ads. (Ad Impressions / Total Ad Requests) * 100
  • Publisher Monetized Pageviews: Monetized Pageviews measures the total number of pageviews on your property that were shown with an ad from one of your linked publisher accounts (AdSense, AdX, DFP). Note - a single page can have multiple ad slots.
  • Publisher Impressions / Session: The ratio of linked publisher account (AdSense, AdX, DFP) ad impressions to Analytics sessions (Ad Impressions / Analytics Sessions).
  • Publisher Viewable Impressions %: The percentage of ad impressions that were viewable. An impression is considered a viewable impression when it has appeared within a user's browser and had the opportunity to be seen.
  • Publisher Click: The number of times ads from a linked publisher account (AdSense, AdX, DFP) were clicked on your site.
  • Publisher CTR: The percentage of pageviews that resulted in a click on a linked publisher account ad (AdSense, AdX, DFP).
  • Publisher Revenue: The total estimated revenue from all linked publisher account ads (AdSense, AdX, DFP).
  • Publisher Revenue / 1000 sessions: The total estimated revenue from all linked publisher accounts (AdSense, AdX, DFP) per 1000 Analytics sessions.
  • Publisher eCPM: The effective cost per thousand pageviews. It is your total estimated revenue from all linked publisher accounts (AdSense, AdX, DFP) per 1000 pageviews.
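Taken together, the derived metrics above reduce to simple ratios. A quick sketch with made-up figures (all numbers illustrative):

```python
# Illustrative raw figures for one day of a linked publisher account.
ad_requests = 10_000
ad_impressions = 9_000
sessions = 4_000
pageviews = 12_000
ad_clicks = 90
revenue = 150.0  # total estimated revenue

coverage = ad_impressions / ad_requests * 100        # 90.0 (% of requests filled)
impressions_per_session = ad_impressions / sessions  # 2.25
ctr = ad_clicks / pageviews * 100                    # 0.75 (% of pageviews clicked)
revenue_per_1k_sessions = revenue / sessions * 1000  # 37.5
ecpm = revenue / pageviews * 1000                    # 12.5 (revenue per 1000 pageviews)
```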

Those metrics are clearly a major improvement to the measurement capability of publishers, as they will now be able not only to see all those ad interactions from separate DFP networks in one centralized platform, but also to combine this information with other behavioral data that is already being collected by Google Analytics.

As mentioned, in addition to the metrics described above, you might have additional sets: AdSense, Ad Exchange, DFP, and DFP Backfill. Most publishers will not have all 4 sets; they will either have 1 (AdSense or AdX) or 2 (DFP and DFP Backfill [DFP Backfill includes AdSense and AdX served through that DFP network]).

In the next section I describe the difference between those groups and where each ad interaction would appear depending on the tags you are using. I will also go through the new default reports and reporting capabilities in general.

Publisher Reports

Publisher Reports

After you set up the integrations (and assuming that you have all of them, which is not likely), you will have access to quite a few default reports. You will still see similar options in the sidebar navigation, under the Publishers menu, which will read: Overview, Publisher Pages, Publisher Referrers. But for each of those three reports, you will now have 5 options in the report tabs (right above the chart):

  1. Total: sums up all the interactions below, i.e. for Publisher Impressions you would see all impressions including AdSense, DFP and AdX.
  2. AdSense: only AdSense served through the AdSense tag. Note that you do not need the AdSense tag if you are serving it through DFP, and if that is your case you will see AdSense data as DFP Backfill, not as AdSense.
  3. Ad Exchange: only AdX served through the Ad Exchange tag. Note that you do not need the AdX tag if you are serving it through DFP, and if that is your case you will see AdX data as DFP Backfill, not as AdX.
  4. DFP: only DFP metrics (excluding AdSense and AdX backfills) for directly sold campaigns and house ads served through the Google Publisher Tag.
  5. DFP Backfill: all the AdSense and AdX interactions (indirect / programmatic campaigns) when served through the Google Publisher Tag.

But to be honest, I am way more excited about the custom reports that we are now capable of building. For example, for Online Behavior, I wanted to check the total revenue I am getting from each of the authors contributing articles. So I used a Content Group I have been populating with the author name (which is publicly available in the article, hence not PII). Below is a screenshot of the custom report I built.

As you can see, the 5th line is Allison Hartsoe, who wrote two articles for Online Behavior. Even though she has significantly lower impressions and clicks, she produced pretty high revenue in this period. My conclusion: I should promote her articles heavily and reach out to her for another post :-)

Publisher Analytics insights

The options are really endless here: you could use any Content Group (e.g. interest, content type, page template), Custom Dimension (paid user, loyalty status, etc.), or any default dimension to measure your full publisher interactions. One especially interesting section to look at is the Interests report, which will show you the interests of users clicking on ads. This opens up a deep understanding of your customer segments and how they perform when it comes to publisher revenue.

This example illustrates the power of centralizing all your interactions from AdSense, Ad Exchange and DFP in one place. This allows publishers not only to have a deeper understanding of their revenue, but also to act on it.

Linking AdSense, Ad Exchange and DoubleClick For Publishers

In summary, if you are a Google Analytics standard user, you can link your account to AdSense (tutorial) or Ad Exchange (tutorial), but you will have to consider Google Analytics Premium if you are thinking about integrating DoubleClick for Publishers (learn more). If you are a Google Analytics Premium client and did not link your DFP account, do it today! You should contact your Account Manager (or Authorized Reseller) and ask them about it.

As I wrote in the beginning, I think this is a huge step for the Analytics industry when it comes to Publishers, which now have a robust and comprehensive measurement solution.


Understanding Google Analytics Goals & Funnels


In previous discussions, I've addressed the need for an analytical mindset, which involves understanding things and the interrelationships between them. Developing an analytical mindset increases and deepens your appreciation of complex systems, including the behavior of users of your website or app.

To promote analytical thinking, I recently posed a question on social media, which I believe to be an excellent place to pose sensible questions:

  • "What features of Google Analytics (GA) are the least understood?"
  • "Which ones appear to make the least sense?"
  • "What features do you want to understand better?"

Some of the responses I got were great:

  1. How do goals and funnels work?
  2. What does direct traffic really mean?
  3. Why does my own website appear in the list of referrals?
  4. What private data are Google Analytics legally allowed to collect?

In this article, I'll focus on the first question: "How do goals and funnels work?" (stay tuned if you're interested in answers to the others). My answer will also demonstrate how approaching a question from an analytical perspective helps you develop a more complete understanding than otherwise. This analytical approach involves breaking apparently complex issues down into constituent components that are simpler and therefore more easily understood. Paradoxically, understanding something at a fundamental level allows you to build up to an understanding of the thing as a whole.

Dissecting Google Analytics Goals

Let's start by setting goals in Google Analytics. Ultimately, you want to know when and how frequently users do what you want them to do on your site or app. Therefore, setting goals on GA involves obtaining meaningful measurements of these key outcomes, and the configuration of your Google Analytics setup must properly reflect these goals for you to get the information you need. Some examples of key outcomes are the following:

  • Make a Purchase
  • Sign up for a newsletter
  • Play a video
  • Download a pdf

These are all actions users can take on your site that deliver value to your business, and all are easily tracked using goals. Because they enable you to quantify the ROI of your site or app, these are likely to be derived from your business Key Performance Indicators (KPIs). As you'll see below, measuring these is an essential part of your GA setup.

Knowing why we need goals clarifies the importance of goal configuration in GA and explains its role as one of the most critical customisation actions you take in setting up GA. However, to get the most out of GA, you also need to understand how goals are used and how they work.

Setting up Goals

Google Analytics offers 20 goals per View, organized in 4 groups of 5. Depending on a user's rights with respect to a view, the user can perform a number of goal-related functions by accessing Goals as shown below. Edit rights on the view allow the user to add new goals, change existing goals, and enable/disable goals in the View Admin section whereas Read & Analyse rights permit only reading of goals (learn more about user permissions).

Goals setting

GA begins tracking from the moment you set a goal but isn't retrospective, that is, it doesn't track the time prior to your setting the goal (unless... well, keep reading to find out more). Once it is set, you can test a goal using the handy utility Verify this Goal link in the goal setup (screenshot below).

Goal verification

Furthermore, goal data are subject to non-standard processing, meaning that goal data become available only after up to 24 hours (estimated processing time) have elapsed, even in Google Analytics Premium.

Pre-requirements to understanding Goals

At this juncture, we need to discuss key topics that are fundamental to understanding goals:

  • Tracking: Regardless of goal type, goal metrics (completions, conversion rate, abandonment rate, and value) are derived from your existing GA tracking of pageviews and events; goals require no additional data capture - they are purely configurations.
  • Sessions: "a group of interactions that take place on your website within a given time frame" - there are other nuances to this definition, learn more.
  • Conversions: "the process of changing or causing something to change from one form to another." Huh?! (see Google definition below)

Conversion definition

Let's look at an example. Assume your goal involves signing up to receive a newsletter. The sign-up process begins when a visitor to your site who is not already a newsletter subscriber arrives. If the visitor decides to become a subscriber, he or she then finds, completes, and submits the sign-up form. Aren't we actually measuring the moment at which a user converts from not being a subscriber to being a subscriber?

In this case, the goal, which is to capture the moment at which this conversion occurs, defines the conditions within Google Analytics' data that represent a user having taken a meaningful action, that is, converting from nonsubscriber to subscriber.

Google Analytics Goal types and definitions

Goals are of four types:

  • Destination
  • Duration
  • Pages/Screens per Session
  • Event

You can set your own goal settings, use a preconfigured template, or download a setting from the Solutions Gallery.

Goal definitions vary by goal type, that is, by destination, duration, screens/pages per session, and event.

Destination

A conversion for a Destination goal is defined using a screen or pageview:

Page/Screen [equals] a value (e.g. /blog/dec/my-article)
Page/Screen [begins with] a value (e.g. /blog/dec)
Page/Screen [matches Regular Expression] a value (e.g. \/blog\/(nov|dec))
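The three match types behave differently, which is easy to get wrong. A hedged sketch of the logic (my own helper, not GA code):

```python
import re

def destination_converts(page, match_type, value):
    """Approximate the three Destination goal match types described above."""
    if match_type == "equals":
        return page == value
    if match_type == "begins with":
        return page.startswith(value)
    if match_type == "regex":
        # GA regex matching is a partial match anywhere in the page path.
        return re.search(value, page) is not None
    raise ValueError("unknown match type: " + match_type)

destination_converts("/blog/dec/my-article", "equals", "/blog/dec/my-article")  # True
destination_converts("/blog/dec/my-article", "begins with", "/blog/dec")        # True
destination_converts("/blog/nov/post", "regex", r"/blog/(nov|dec)")             # True
```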

Duration

As the name implies, a conversion for a Duration goal is defined using the Google Analytics session duration:

Session Duration [greater than] x hours y minutes and z seconds

Pages/Screens per session

A conversion for a Pages/Screens per session goal is defined using the number of screens or pageviews in a Google Analytics session:

Pages/Screens per session [greater than] a value (e.g. 10 pageviews)

Event

A conversion for an Event goal is defined using a Google Analytics event. At least one of the following conditions is needed, but all four or any combination can be used:

Event Category [equals] a value (e.g. Video)
Event Action [begins with] a value (e.g. Play)
Event Label [matches Regular Expression] value (e.g. Homepage Video)
Event Value [greater than, equals, or less than] a value (e.g. 20)
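As a sketch, here is how those four conditions combine when all are used at once (hypothetical helper; the field names are illustrative, not a GA API):

```python
import re

def event_converts(event):
    """All four Event goal conditions from the list above, ANDed together."""
    return (
        event.get("category") == "Video"
        and event.get("action", "").startswith("Play")
        and re.search("Homepage Video", event.get("label", "")) is not None
        and event.get("value", 0) > 20
    )

event_converts({"category": "Video", "action": "Play",
                "label": "Homepage Video", "value": 30})   # True
event_converts({"category": "Video", "action": "Pause",
                "label": "Homepage Video", "value": 30})   # False
```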

As you can see, destination and event goals use either pageviews or events to define the conditions that define a conversion, while the other two goal types clearly use session metrics (duration and pages/screens). In fact, destination and event goals are actually session based also. How so? What do you mean? you may be asking yourself... Excellent, the analytical and curious mind seeks clarity!

Events, Sessions & Users in Goal conversion rates

This subtle point requires crystal clear understanding on your part to grasp the tricky and, some would argue, flawed nuance of the conversion rate metric. First, let's assume I define a goal based on an event firing, playing a video for instance. Since a user can play a video multiple times during a session, each playing would represent an event and so a goal conversion, correct? Although that interpretation seems logical, it is not correct. To understand why, consider the nature of the conversion in this case:

  1. Before - The user hasn't seen the video
  2. After - The user has played and therefore seen the video

As you can see, the event occurs when the user converts from not having seen the video to having seen it, and so goal measurement is the number of sessions that include at least one video play. The key to understanding events is to remember that goal conversions are recorded per session, and so the conversion can occur only once within a session. The specific metric used here is conversion rate, which is calculated as follows:

Sessions with specific Goal completion/Total sessions
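The session-based counting can be made concrete with a tiny sketch (illustrative data): a session that fires the event twice still converts only once.

```python
# Each session maps to the events it fired (illustrative data).
sessions = {
    "session_1": ["video_play", "video_play"],  # two plays, still ONE conversion
    "session_2": [],
    "session_3": ["video_play"],
    "session_4": [],
}

# A session converts if it contains at least one qualifying event.
converting = sum(1 for events in sessions.values() if "video_play" in events)
conversion_rate = converting / len(sessions) * 100  # 2 of 4 sessions -> 50.0%
```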

Using session rather than user as the goal's container provokes intense disagreement. One side argues that, since the user and not the session has converted, conversions and conversion rates should be user based. The other side counters with an argument involving the personal area network (PAN) device issue. This side asks, can you consistently and accurately determine that the user is the same individual when multiple devices are included in the user's PAN? The answer is, not easily and not all the time, and therefore session is a more appropriate container for the goal than user.

Although Google Analytics typically uses sessions in goal tracking, it does provide a means to use goal completions by users to calculate conversion rate, the recently rolled out Calculated Metric functionality. The excellent articles linked below cover this subject thoroughly and in sufficient detail, and so I won't belabour the point here:

Dissecting Google Analytics Funnels

Our understanding of goals within Google Analytics is now significantly deeper than it was at the beginning of this article. Let's take another step down the rabbit hole and consider goals having multiple steps.

The funnel goal still uses a final destination page to determine when a conversion occurs. This goal type uses pages only; event sequences are not supported. The sequence of pages or screens is defined using the same constructs (equals, begins with, and regex) and, as shown in the screenshot below, each step is given a name to make the steps human readable and so aid in analysis and reporting.

Goal funnel

The screen above shows a standard ecommerce checkout funnel modelled with a goal funnel. Regular expression matches are employed, and the match method used for the destination page is also used in the goal funnel.

As one example, assume that this goal is a guest checkout in which users enter the minimal information required to complete the order but don't log in or create an account. Now consider the identical example, except that the users do log in to an account prior to checking out. Although the checkout steps differ, the goal destination remains the same, meaning that GA will record the same number of conversions for both. Hence, even though the steps are essentially different, the goal conversion rate for each will also be the same, whilst the funnel conversion rate will differ (because the steps in the goal funnel differ).

Where two divergent funnels are used to represent different outcomes, ideally the destination pages will also differ so that two distinct funnel conversions and funnel conversion rates will be delivered, one for each funnel.

Another possible funnel configuration involves starting point. Notice the example above stipulates that the first step in the funnel is Required. This is therefore known as a closed funnel, which allows users to enter the funnel only by first visiting a page matching the condition defined in step 1. An open funnel, where the Required flag is not set, allows users to enter the funnel at any stage of the process. You can use this setting to ensure that the tracking matches the functionality of your site. If users can enter the checkout, for example, at any stage, then the funnel needs to reflect this. If, on the other hand, users can only check out starting at the basket page, the funnel needs to be closed using the Required flag on the basket stage.

Importantly, for the same number of checkouts, open and closed funnels will show the same number of goal conversions and the same conversion rates, but their funnel conversions and funnel conversion rates will differ.

Goal Funnel vs. Goal Flow reports

Now you should have a firm grasp of the basic goal types as well as of the more complex goal funnel. When you begin asking questions about typical user behaviour and how to model it in a funnel, you add further intricacies and complexity to the funnel functionality. For instance, staying with the standard ecommerce checkout example, consider users who fill in their delivery details, proceed to the billing section, and then remember that the address they used is the wrong one. No biggie, they just navigate back one step, update, and carry on. But what effect does this detour have on the goal funnel?

The answer lies in the goal funnel report. There you'll see the user hitting the delivery goal step, progressing normally to the billing step, but then exiting back to the delivery goal step. This is called a loop-back and is more clearly visible in the goal flow report than in the goal funnel report.

What if you have an optional step in the checkout, for a coupon, for example? Although the goal will be configured to incorporate the coupon checkout step, users who skip the coupon step and go straight to the order confirmation will be subject to backfilling in the goal funnel report. You'll therefore see an entrance to the step prior to the coupon stage, progression to the coupon stage, and then the hit on the order confirmation stage. The goal flow report shows skipped steps with more clarity, with a separate flow shown for users who skipped a step.

The power of the goal flow report is not widely appreciated, and these simple examples illustrate how the goal flow report offers clearer reporting for funnels. Read this help center article in the GA docs to learn more about the differences.

Google Analytics Premium Custom Funnels

As mentioned earlier, traditional goal funnels are session based and only employ pageviews. However, your checkout funnel may employ pageviews and events to more finely tune tracking of user behaviour during this essential part of a transaction. You may want to track converting journeys across sessions and decide yourself whether to use sessions or users in your conversion calculation. Whilst the standard goal funnel is a poor fit for these requirements, Google Analytics Premium users have the wonderful Custom Funnel functionality at their disposal.

The screenshot below shows the wealth of powerful options that can be used to customise funnels:

Custom funnel options

Moreover, powerful rules concerning pages and events can further define funnel stages:

Custom funnel rules

Indeed, pretty much every standard reporting dimension is available to allow fine tuning of funnel stages:

Custom funnel dimensions

Deep joy ensues when the funnel is set up and you see it applied to data retrospectively, thereby allowing further fine tuning of the funnel for historical analysis:

Custom funnels

Custom funnels are still flagged as beta functionality, and so, whilst massive power is currently available, expect more in the future.


Demystifying Google Analytics Direct Traffic


A common question you might ask when going through your Google Analytics data is "Where do my users come from?" Answering it will help you understand how users find your website or app and which sources of traffic are working well (or not). Using this information wisely, you can quantify the value of your paid marketing campaigns, optimise Organic traffic, find out how well your newsletter emails are performing, and more!

You probably know your main traffic sources, but do you know how many possible traffic sources there are? Literally, how many sources do you think your site could have? Apart from the campaigns you've set up in AdWords or emails, have you ever considered what other traffic sources are available? There are more than you may think.

In fact, there are potentially hundreds if not thousands of traffic sources! This poses a major challenge for Google Analytics: making sense of where all your users came from and presenting accurate data in reports. Google Analytics always tries as hard as possible to give you accurate data about your traffic sources, but sometimes it's not so straightforward.

Enter the Direct source and (none) medium. You'd expect Direct traffic to represent your most loyal users, who know how to find your app or site by name. Right? Not always. There's more to Direct than you might initially think.

This article will cover exactly what Direct means, how to understand it, when it is correct or due to an error, and how to solve it.

How does Google Analytics calculate a traffic source?

When a user arrives on your site, how does GA decide whence they came? GA uses a fairly simple algorithm, which is described in the flow chart below. It is long, but keep on going!

Traffic Source processing flow chart

There are a few technical terms in this diagram so let's work through a series of use cases to better understand what they mean.

How does GA identify visits from Organic search?

Go to Google, search for ConversionWorks and click through to my homepage. How would GA know you came from a Google search? Thanks to the magic that enables the internet to run (HTTP), we can see in the Request Headers that the site I came from, the Referrer, was https://www.google.co.uk

HTTP Request

Okay, but looking at my GA data in the Acquisitions report (Source/Medium) I can see google / organic, google / cpc and a bunch of referrals from other Google properties. How does GA differentiate between a Google search visit, a click on an AdWords ad and the other referrals?

Being a Google product and knowing that Google knows a thing or two about search, GA knows how to recognise a visit from a search engine. You can see the list of known search engines and how they're identified using the Referrer value here.

How does GA identify visits from a paid ad click?

Okay, so GA can spot a referring search engine but what about paid traffic? When we talk about paid traffic we might mean AdWords or DoubleClick for example. If paid campaigns using these ad networks are setup using auto-tagging then GA will look for a query parameter on the URL when users land on the site. A user coming to ConversionWorks from a click on an AdWords ad will have a query string parameter named gclid appended on the URL like this:

http://www.conversionworks.co.uk/?gclid=gfjsgWSAR45jdxn32hjkh4324n

A DoubleClick ad click will have a query string parameter named dclid. GA can see these parameters (gclid and dclid) and can decide not only that this visitor came from a paid ad click but also which ad they clicked, which campaign, the cost of the click and so on. Pretty amazing and super valuable data!

If the user came from a Google property with no query string data and it wasn't a search property then the visit must be a referral.

How does GA identify visits from custom campaigns?

GA looks for some other standard query string parameters too - the utm parameters:

  • utm_source
  • utm_medium
  • utm_campaign
  • utm_content
  • utm_term

You can append these query string parameters to links in emails, social shares, non-Google ad networks or links in PDF documents. These very powerful parameters let you control the data in your acquisition reports. You can use the official Google URL Builder to help build your links.

Be careful though! Using these parameters on internal links on your website (from one page on your site to another) will artificially start a new session: if you change the campaign value mid-session, a new session begins, which you probably do not want. It's best to use these parameters only on links from external sources that you want to track explicitly.

What Direct means and when can it happen?

If GA can't determine that a user landed on your site from a recognised campaign, search or social source, and there's no existing campaign data, it's a new session, and no referral data is available, then you'll find yourself at the very bottom of the traffic source algorithm flow chart: it's a Direct visit.
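The flow chart's top-level checks can be sketched roughly like this. This is a heavy simplification: the real algorithm handles campaign persistence and a long list of known search engines, and the helper below is my own illustration, not GA code:

```python
from urllib.parse import urlparse, parse_qs

def classify_landing(url, referrer=None):
    """Very rough sketch of the attribution checks, in flow-chart order."""
    params = parse_qs(urlparse(url).query)
    if "gclid" in params:
        return ("google", "cpc")       # AdWords auto-tagging
    if "dclid" in params:
        return ("doubleclick", "cpm")  # DoubleClick auto-tagging (medium illustrative)
    if "utm_source" in params:
        return (params["utm_source"][0],
                params.get("utm_medium", ["(not set)"])[0])
    if referrer:                       # known search engines omitted here
        return (urlparse(referrer).netloc, "referral")
    return ("(direct)", "(none)")      # bottom of the flow chart

classify_landing("http://www.conversionworks.co.uk/?gclid=abc123")
# ('google', 'cpc')
classify_landing("http://example.com/", referrer="https://twitter.com/foo")
# ('twitter.com', 'referral')
classify_landing("http://example.com/")
# ('(direct)', '(none)')
```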

But is it really Direct?

Just because GA decided a user was Direct doesn't necessarily mean this is exactly what happened. The analytical mind will always question a data source. Is this data a true reflection of reality? How can I trust this data? How can I prove this is correct?

What grounds do we have to doubt the accuracy of the data? How do we reasonably question the data and go about calibrating it?

Questioning your data

Let's assume you have some expectations regarding the volume of traffic you'll get from 'owned' campaigns such as Email, AdWords, DoubleClick or Social. In addition to expected volume, you probably also know where users are going to land on the site. Now you need to check your data. Ask:

  • Are users landing on the pages you expect?
  • Are the landing pages being hit by the right campaigns?
  • Does the session volume look right?

Acquisition report gross error checking

This kind of "gross error checking" exercise is seldom performed, but it is essential if you're going to trust your data.

Compare your click data from AdWords with your GA data. Are you seeing the right number of sessions compared to clicks? The numbers may not match exactly, but anywhere within 5% is about right.
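The click-versus-session check is simple arithmetic; here is a sketch, where the 5% tolerance is the rule of thumb from the text and the numbers are invented.

```python
# A sketch of the gross-error check described above: the 5% tolerance
# is the rule of thumb from the text, the numbers are invented.
def discrepancy(clicks, sessions):
    """Percentage difference between ad clicks and GA sessions."""
    return abs(clicks - sessions) / clicks * 100

d = discrepancy(1000, 968)
print(f"{d:.1f}% discrepancy - {'OK' if d <= 5 else 'investigate'}")
```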

Check other source/medium combinations for your landing pages in GA. Seeing anything untoward or untrustworthy? If your campaign traffic volume is south of your expected value, this is where you might see more Direct traffic than you'd expect.

Is this a good or a bad thing? How do you spot and fix issues?

Users can always type a URL into their address bar or click a bookmark, of course, and this is genuine Direct traffic. However, there is also a chance Google Analytics may not be able to correctly attribute the user's session to a traffic source, in which case the session is flagged as Direct. Here are a few scenarios:

  • Clicking from a secure site that uses https to an insecure site that uses http
  • Clicks from apps
  • Untagged or incorrectly tagged links (most common)
  • Measurement protocol hit

If any of the scenarios listed above occurs, GA will flag the session as Direct, which is potentially misleading.

HTTPS to HTTP

This is the way the internet works. If you're on a secure site that uses https, part of the security is that when you click through to an insecure site using http, the insecure site is prevented from seeing where you came from - no referrer data is available to GA on the insecure site.

Secure sites like Google and Facebook are quite clever in that they do expose referrer information when you click through from their secure pages to insecure pages on other sites. We don't need to go into how they do this here, but the simplest solution is to run your own site on https. This is good for your users: they get peace of mind knowing their browsing experience is secure, and you have no worries about losing referral data. That's an easy trade. Talk to your engineers and get it done already!

Clicks from apps

If users click on links to your site from within an app, GA can't see which site they came from because they didn't come from a site at all. An app is not a page in a browser, and it won't necessarily send referrer information, which confounds GA and inflates your Direct traffic.

It's quite possible the clicks from apps are valuable. If you treat clicks from apps as a monetisable channel then you need to track these clicks properly.

Use utm tagging (also known as manual tagging) to decorate the links in the app with campaign data. If you've never done this, take a look at this handy resource provided by Google.

Untagged or incorrectly tagged links

This is a very similar scenario to the last one. Maybe you don't have links in apps, but if you have links in emails, PDF documents, Word documents or Excel spreadsheets, these don't open in a referring web page and might not send referral data for GA to latch on to. You need to use manual tagging again.

What if you are using manual tagging but made a small typo? You did test the links, right? They went through to the right page, but did you check the GA data for the correct source, medium and campaign values?

If you click this link, you'll end up on our homepage:

http://conversionworks.co.uk?utm_sorce=onlineBTest&utm_medium=onlineBTest&utm_campaign=onlineBTest

Looks okay? Can you spot the issue? utm_sorce is not a valid utm parameter. Double-check every campaign link you create or, better, use an automated solution. A good way to verify a link is to click it once and watch the real-time reports in GA to confirm the values come through correctly.
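To catch mistakes like this programmatically, here is a quick sketch of a pre-flight link checker; the function name is my own, and the URL reuses the broken example above.

```python
# A quick sketch of a pre-flight link checker that catches misspelled
# utm parameters (like utm_sorce above) before a campaign goes live.
from urllib.parse import urlparse, parse_qs

VALID_UTM = {"utm_source", "utm_medium", "utm_campaign",
             "utm_content", "utm_term"}

def check_campaign_link(url):
    """Return any utm-like parameters that GA will not recognise."""
    params = parse_qs(urlparse(url).query)
    return [p for p in params if p.startswith("utm_") and p not in VALID_UTM]

print(check_campaign_link(
    "http://conversionworks.co.uk?utm_sorce=onlineBTest&utm_medium=onlineBTest&utm_campaign=onlineBTest"))
# ['utm_sorce']
```

Running a check like this over every link in a campaign spreadsheet takes seconds and saves a month of misattributed Direct traffic.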

Measurement protocol hits

Have you heard of the Internet of things? Internet connected devices that can talk to other things on the internet: fridges, fitness trackers, cows... yes, even farm animals. None of these things are browsers but they can all potentially send data to Google Analytics using the Measurement Protocol. The Measurement Protocol is what makes Universal Analytics truly universal. It's a technique provided by Google Analytics for non-web browser technology to be measured using GA.

GA data sent via the Measurement Protocol might be flagged as Direct if it is not decorated with campaign information. You can check quite simply whether these hits come from things rather than users. Knowing that things are things and not browsers means we can use common dimensions in GA to spot real browsers. Real browsers automatically expose the screen resolution, the operating system and the Flash version being used, among others. These appear as dimensions in GA reports.
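As a rough sketch (not an official client library), a Universal Analytics Measurement Protocol event hit decorated with campaign fields might be assembled like this; the property ID, client ID and event values are placeholders.

```python
# Hedged sketch, not an official client: assembling a Universal Analytics
# Measurement Protocol event hit decorated with campaign fields
# (cs = source, cm = medium, cn = campaign) so it is not reported as
# Direct. The property ID, client ID and event values are placeholders.
from urllib.parse import urlencode

MP_ENDPOINT = "https://www.google-analytics.com/collect"

def build_mp_payload(tracking_id, client_id):
    return {
        "v": "1",             # Measurement Protocol version
        "tid": tracking_id,   # GA property, e.g. UA-XXXXX-Y
        "cid": client_id,     # anonymous client id
        "t": "event",         # hit type
        "ec": "fridge",       # event category: the "thing"
        "ea": "door_open",    # event action
        "cs": "iot-fleet",    # campaign source
        "cm": "device",       # campaign medium
        "cn": "fridge-pilot", # campaign name
    }

payload = urlencode(build_mp_payload("UA-XXXXX-Y", "555"))
# POST `payload` to MP_ENDPOINT (e.g. with urllib.request) to send the hit
```

Because the hit carries cs/cm/cn, the session lands in a named channel instead of Direct.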

So, for example, a property that was only populated with Measurement Protocol hits might show (not set) for all Measurement Protocol hits on the Operating System dimension. Similarly, you would see no Flash Version, no screen resolution or Screen Colours. These dimensions are all available to see in the Audience -> Technology -> Browsers & OS report.

See how adding a secondary dimension of Source / Medium helps us narrow down the data to check exactly what's going on? This is a useful technique to learn and use.

Conclusion

This article has shown how GA decides where a user came from. You've seen how this can work and how it can fail. Knowing these details, plan a review of your traffic source data. Do some gross error checking. Do some calibration. Check your data and build confidence in the numbers.

If you find any holes, you're better armed with explanations and fixes. You may find more value in certain channels and optimisation opportunities in others.

You're on your way to using data more wisely. Good!


The New Google Analytics Mobile App


Today the Google Analytics team is launching a completely revamped mobile app, which makes consuming your data easier, faster and more pleasant. The team worked very hard to make the app more robust too: you can do far more with it now, making it a great companion while waiting for a meeting to start, during your commute to or from work, or while waiting for your kids at school!

In this article I will go through the main functionality that will now be available to you, including the scorecards, dashboards and visualizations. But if you are the video-kind-of-person, Ajay Nainani (GA Product Manager) recorded a great walkthrough.

If you want to get the functionality discussed below, go get the app, or update your current version either at Google Play or at iTunes Store.

Interacting with scorecards

The first thing you will notice when opening the app is the new Material design. Even if you haven't read about it, you have probably seen many Material design cards, which are used in different Google products. Their introduction into the Google Analytics app brings consistency and beauty into the product.

In a business context, a scorecard is a record used to measure achievement or progress towards a particular goal, so that seems like a fitting name for the cards used in Google Analytics. And the cards are not just beautiful, they are extremely functional, providing a series of new capabilities. Below you will find a screenshot pointing to all the actions you can take inside a scorecard.

Google Analytics scorecard

  • Pick a metric - tap on any metric to update the chart. Sometimes there will be additional metrics that are not visible, you can swipe left to view all metrics available on any scorecard.
  • Pick a dimension - swipe the visualization to the left to uncover new dimensions and visualizations (see the dots below the visualization, they indicate how many dimensions are available for any card).
  • Focus on a data point - tap on a data point in the chart (e.g. a day, a country, a device type, etc.) to focus on it. This is available on all chart types.
  • Share the report with anyone - tap the share icon to use your device native sharing functionality. This will share a screenshot of the report in question through any apps you have available, and it automatically adds the date and View you are sharing from. This is a great way to spark a data conversation with your colleagues!
  • Save to dashboard - tap to add the scorecard to your dashboard, we will discuss this functionality later in this article.

Now that you are an expert in interacting with scorecards, let's review the data available through the app by reviewing the overview section, your "landing page".

Data Overview

One of the areas that was significantly improved is the Overview, it now provides you with a great bird's-eye view of your data. And the charts look so nice that you will spend hours in it, without even noticing :-)

Data overview

Here is the information you will find there:

  1. Real-Time - see how many users are on your site/app right now.
  2. Audience overview - learn how your users, sessions and new users are trending over time. You can also swipe the chart to compare your device usage, traffic channels, and countries.
  3. Behaviour overview - learn how users are behaving on your website by checking the following metrics: Avg. session duration, Bounce rate, Pages / Session, Number of sessions per user. You can swipe the dimensions to find the same break-down as in the card above.
  4. Goals overview - learn how your Goal conversion rates and absolute Goal completions are doing. The same dimensions are available to drill down into those metrics.
  5. Users by time of day - learn how traffic is spread across the week in your View. This is a great example of an insightful and beautiful visualization!
  6. Total events cards - you will find three cards showing your top Events by category, action and label. They will help you understand what kind of actions your users are taking (you need to have Event Tracking implemented for that).

Hopefully this will give you a good understanding of what you need to know at a glance. I won't go over each section, but if you click on the left sidebar you will find options to go deeper into Real-Time, Audience, Acquisition and Behaviour. You will notice that a lot of thought went into choosing the right metrics and dimensions for each of them.

Next, let's go through some dashboarding functionality to learn how to build a great chart repository with your most important dimensions and metrics.

Building a dashboard on your mobile app

As mentioned above, while browsing the app, you can tap any report you find useful to add it to your dashboards for future analyses. In addition, once you are in your dashboard, you can duplicate and customize those reports to get just what you want. You can also create new reports from scratch from a wide selection of metrics, dimensions and visualizations.

Mobile App dashboards

As you can see in the screenshot above, you have a few options when creating or editing a report:

  • Metric - choose one metric per report.
  • Dimension - choose one dimension per report.
  • Segment - This currently works only with Segments created on the Web interface, you can choose any existing Segment.
  • Visualization - depending on the metrics and dimensions you choose, you will be offered a set of visualizations that can be used to represent them.

Go on, build some reports!

And now, to close with the cherry on top, I would like to show some beautiful visualizations which, for me, make it all a pleasant journey!

Visualizations: Cherry on Top

Time of the day heatmap

I find this chart incredibly insightful; it represents a large amount of data in a simple way. For example, you could compare the times of day with the highest traffic against those with the highest conversion rate, and use that to optimize your paid traffic.

Google Analytics Heatmap

Comparison bar and column charts

The bar and column charts below are quite nice to the eye. But what I really like about them is the way comparison was implemented, very elegantly! While the current time period fills the whole bar, the comparison period is just a line, clear and clean.

Comparison Bar chart

Comparison Column chart

Pie chart (Good ol' boy!)

No, I won't really go into an essay on why you should or shouldn't use pie charts, but they will be there for you if you want them! Sometimes they might be useful, but use them responsibly...

Google Analytics Pie Chart

Now you know it! The new app is pretty cool, you should give it a try. Go get the app, or update your current version either at Google Play or iTunes Store. And happy analyzing!


Setting Up Google Analytics to Measure Content


The content you provide to your visitors can make or break your brand. In the end, the information you provide plays a huge role in whether visitors buy your products and services or not. That's why it is crucial to understand how your content performs and which steps you need to take to improve your content marketing results.

Google Analytics tracks several metrics by default: time on page, exit %, bounce rate, etc. But these metrics only scratch the surface of the insights your content reports can offer.

Configuring Google Analytics in the right way will help you answer the following questions:

  • Which content resonates best with your readers?
  • What landing pages drive the most conversions?
  • Which secondary actions on a specific page help you sell more services?
  • How to improve your future content marketing efforts?

In this article you will learn how to set up Google Analytics to make your content reports super actionable.

Step 1: Measure All Your Pages

This might sound like a no-brainer to you, but it's a crucial first step. Even in a very simple implementation, all pages get tagged almost automatically.

Three examples where things can go wrong:

  • "A page is not important and doesn't need to be tracked." I strongly recommend tagging every page on your website, even if you have a very good reason not to track some of them. Instead of leaving a page untagged, exclude it via a filter on your Views.
  • The website involved contains a few subdomains that are not tagged correctly. This often happens with Ecommerce carts or "mini-sites" that companies create for special promotions.
  • External applications, e.g. Leadpages, are used. Very often you need to manually insert the tracking code on these pages to make sure everything gets tracked.

To help with that, Google created Tag Assistant, a Chrome extension you can use to validate, diagnose, and troubleshoot your Analytics setup on each of your pages. Once you create a recording and detect a problem, you can correct the setup and check again whether your tags are firing properly. Here is a quote on how it can help you; learn more about it in the Help Center.

Recording a flow helps you validate that Analytics tracking on your site works as expected. Choose a critical set of steps to record. These steps typically include pages on your site. You can also include pages from other sites from which a user may initiate a visit to your site, such as google.com, or a third-party site with an ad linking to your site. For example, if you run an ecommerce site, run through the pages and steps required to select an item, place the order, and submit payment. If your site's purpose is lead generation, walk through the process of navigating to and signing up for your promotional newsletter.

Google Tag Assistant recording

Step 2: Exclude Your Own IP Address

Excluding sessions from your own company or any known third party is a must if you want to collect meaningful data. The larger this group is relative to your total traffic, the bigger the impact on your numbers.

I have dealt with large companies where everyone sets the company homepage as the starting page in their browser. This can heavily impact key metrics like conversion rate and bounce rate, among others.

Here is the filter you need to set up:

IP Address Filter

Learn more about exclusion options, it may be useful if you need to set up a filter for a range of IP addresses.

Step 3: Exclude Your Technical Query Parameters

I have come across thousands of duplicate pages in a lot of Google Analytics accounts. This is often caused by a large number of technical query parameters on the website involved.

Here is an example where you would want to see only one page, but Google Analytics would register three different pages when using a default implementation:

  • www.test.com/duplicate-page/
  • www.test.com/duplicate-page/?id=12356
  • www.test.com/duplicate-page/?id=15687

Two Ways to Fix it

Option 1: remove your query parameters in the view settings. Add them to the field shown below and separate each of them with a comma.

Exclude URL Query Parameters

Option 2: remove all your query parameters with a filter. Please be careful when choosing this option. You will remove all your query parameters if you apply the filter below.

Exclude all query parameters

Please note that filters don't work retroactively.

If you need a reminder on how filters work, read this comprehensive filters guide.
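For illustration, this sketch mimics what the "Exclude URL Query Parameters" view setting does to the three duplicate URLs above; the excluded parameter names are assumptions.

```python
# Illustrative sketch of what the "Exclude URL Query Parameters" view
# setting does. The excluded parameter names are assumptions.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

EXCLUDED = {"id", "sessionid", "ref"}

def normalise(page_url):
    """Drop excluded query parameters so duplicate pages collapse into one."""
    parts = urlparse(page_url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in EXCLUDED]
    return urlunparse(parts._replace(query=urlencode(kept)))

pages = ["www.test.com/duplicate-page/",
         "www.test.com/duplicate-page/?id=12356",
         "www.test.com/duplicate-page/?id=15687"]
print({normalise(p) for p in pages})  # a single page remains
```

The three URLs collapse into one page path, which is exactly what the view setting achieves in your reports.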

Step 4: Add Interactive Events for On-Page Actions

Google Analytics solely tracks pageviews by default. In a world full of on-page videos, buttons etc., you need to think about Event Tracking, which is the best way to measure on-page interactions in addition to measuring basic pageviews. And they can greatly influence your content metrics.

Here is an example, where the main call to action is a play button that would not be tracked by default on Google Analytics. In this case you would want to add an event to that button.

Event Tracking Video Example

You need to create a measurement plan where you list all your website interactions. Determine for each of those whether you want to measure it as an interactive event (impacting bounce rate) or non-interactive event (not impacting bounce rate). This will make the metrics in your content reports reflect what's going on on your website more accurately.

Step 5: Set up Goals and Goal Values

Two of my favourite reports in the content section of Google Analytics are the All Pages and Landing Pages reports. There are two metrics that don't provide insights from the start: Page Value (in the All Pages report) and Goal Value (in the Landing Pages report).

To populate them, you first need to set up Goals and assign Goal Values.

Here is how the reports will look once you have set this up correctly.

All Pages report

All Pages report

The last column represents the value of each of your pages in relation to your conversion goals. In this case page number six has the highest Page Value (€24.64). This report provides great insight into which pages are visited before a visitor converts. Funnel steps naturally get a lot of credit, so make sure to pay attention to pages outside of your funnel too.

It's important to understand how page value is calculated. And make sure not to add a goal value to your ecommerce thank you page if you have implemented ecommerce tracking on your website, otherwise you will measure things twice.
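As a toy illustration, Page Value is calculated roughly as the ecommerce revenue plus total goal value from sessions that viewed the page, divided by the page's unique pageviews. The figures below are invented, chosen so the result matches the 24.64 from the report above.

```python
# Toy illustration of the Page Value formula (roughly: ecommerce revenue
# plus total goal value from sessions that viewed the page, divided by
# the page's unique pageviews). All figures are invented, chosen so the
# result matches the 24.64 from the report above.
def page_value(ecommerce_revenue, total_goal_value, unique_pageviews):
    return (ecommerce_revenue + total_goal_value) / unique_pageviews

print(round(page_value(1000.0, 232.0, 50), 2))  # 24.64
```

You can see from the formula why double-counting happens if the ecommerce thank-you page also carries a goal value: the same revenue enters the numerator twice.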

Landing Pages report

Landing Pages report

The last column represents the value of each of your landing pages in relation to your conversion goals. In this example, landing page four generates a decent amount of traffic but no direct value. You can create an overview of the total goal value of all your goals, as I did in the example above.

If desired you can switch "All Goals" in the top right corner to one specific goal and determine the specific value of this goal in relation to your landing pages.

Step 6: Aggregate Your Most Important Metrics in a Custom Report

Do you find yourself switching a lot between different content reports? Or changing the settings all the time? Before losing yourself in the fancy world of automation and APIs, I recommend playing around with the custom reports module in Google Analytics. It is a really effective way to get a better feel for which metrics matter most and are most actionable, and for how you would like them displayed and aggregated.

Make sure you implement steps 1 to 5 before moving on to creating custom reports for your organization.

A quick example on how to do it:

  1. On top of your screen click on Customizations.
  2. Click on New Custom Report.
  3. Create your custom report.

As an example I have built a custom report for a content publisher who focuses on growing her subscriber list:

Landing Page Effectiveness Report

And the actual report looks like this:

Custom Report Landing Page Effectiveness

Now she can immediately see how each of her landing pages (mainly blog posts) perform in relation to different conversion goals. And these metrics are really important to her to map out her future content strategy.

Normally I would include a goal value metric as well, but she still needs to define the value of each of her conversions. That would make the report even more powerful.

In short, custom reports give you all the flexibility you need to present your content related metrics in the most actionable way.

Bonus

If you want to learn more about analyzing your content in aggregated clusters I recommend to check out my brand new guide to Google Analytics Content Groupings.

This is it from my side. I hope you have learned a lot of new things to better analyze and optimize your content performance. We'd very much appreciate a share if you enjoyed reading this post!


Google Analytics & Search Console: Deep Integration


Today the Google Analytics team is announcing a new, more robust, integration with Search Console. Exactly a year ago I published my Google Analytics Integrations book, where I discussed the old integration (still named Webmaster Tools), and I am really happy to see this major milestone! Although this reminds me I have to update the book :-/

I think this is a huge improvement for the SEO community and anyone trying to understand how to analyze and improve Google Organic traffic. So first of all, congratulations to Joan Arensman (GA Product Manager) and team for the launch of these new capabilities!

Note: If you still don't have your accounts linked, check this Help Center article.

In this article I will go over the highlights of the new integration and describe use cases on how the improved reports available can be used to analyze Search Engine Optimization and extract insights from the data.

Search Console + Google Analytics: Two Highlights

In my opinion, the most important change is that you will now be able to see Search Console metrics alongside Google Analytics behavioral metrics (pageviews/session, bounce rate) and, most importantly, conversion metrics (Goal and Ecommerce conversion rates).

In addition, I think it is important to reinforce that there have been no changes to how keywords are handled: they are still encrypted by the Search team and therefore will not be joined with GA data. The solution was to join the data at the landing page level, allowing you to do even more advanced analysis than the ones proposed in this Help Center article. Here is a quote:

Landing pages are a good signal for analyzing organic search traffic because each landing page has likely been created around a focus keyword, product, or theme. As a result, incoming keyword searches generally relate to the focus of the page.

Metrics, Dimensions and Reports available

Google Analytics Search Console data

So, what data will be available in the reports? In terms of metrics, there is nothing you haven't already seen. The organization in the tables is very similar to other reports, especially the AdWords Final URLs (available on properties where AdWords is linked to Google Analytics). You will find the usual Acquisition, Behavior, Conversions break-down, where the Search Console (SC) metrics are part of the Acquisition bundle. Here are the metrics:

  • Acquisition: Impressions (SC), Clicks (SC), CTR (SC), Avg. position (SC), Sessions (GA)
  • Behavior: Bounce rate (GA), Pages/session (GA)
  • Conversions: Goals / Ecommerce conversion rate (GA), Transactions (GA), Revenue (GA)

In terms of the reports available on the new integration there isn't a huge difference, apart from the Devices report, which provides a new way to segment your Google organic data. The major difference (as mentioned above) is the new capability available through those reports.

  • Landing Pages: Shows all metrics mentioned above, with each landing page as a single row. Click on a specific landing page to see the queries that led traffic to this page (note that for queries you will only see SC metrics).
  • Countries: Shows all metrics mentioned above, with each country as a single row. Click on a specific country to see the landing pages for traffic coming from this country. Click on a specific landing page to see, for this country, the queries that led traffic to this page (note that for queries you will only see SC metrics).
  • Devices: Shows all metrics mentioned above, with each Device Category as a single row. Click on a specific device to see the landing pages for traffic coming from this device. Click on a specific landing page to see, for this device, the queries that led traffic to this page (note that for queries you will only see SC metrics).
  • Queries: Shows each query as a single row, with only the SC metrics showing. No drill-down is available.

Acting on Search Console data

Data is only valuable when it drives action, and the power of these new integration capabilities is that they make Search Console and GA data more actionable. Below I go through two use cases mentioned in the launch post, showing how to find these kinds of insights.

1. High CTR + Low site engagement

A high CTR (click-through rate) means that when users see your snippets on a search result page, a high % of them end up clicking through to your website. However, a low site engagement means that once they land on your website they are not doing much, which is not good. The way forward in this case would be to improve your landing pages and make sure they work well for your organic traffic.

In the example below we can see two landing pages and their associated metrics. The second one has a high CTR (11.42%), but a very low Ecommerce conversion rate (0.73%).

Search Console data analysis

Let's focus on the second page for a moment: high CTR + low engagement (in this case Ecommerce). In order to get more value out of this landing page, we need to focus on the website, making sure it delivers a good experience to users. The best way to do this is through testing, either using one of Google's tools (Optimize 360 or Content Experiments) or other testing tools. Make sure to create your test only for users coming from Google organic search to understand how to improve your performance.

2. Low CTR + High site engagement

At the opposite extreme from the second page discussed above are landing pages with a low CTR but high site engagement (the first page). This scenario calls for SEO optimization rather than website optimization: the page is not bringing in enough users but is very successful at engaging them.

It is clear that investing in increasing search CTR for the first page could be highly profitable. While I am not an SEO and have no advanced expertise in the area, I believe that customizing search snippets could be a good place to start optimizing CTR from the search results pages.

Closing Thoughts

Adding context to data is what makes it useful and actionable. With this new integration you have a whole new set of data that provides context to your SEO analysis, enabling more data-driven SEO. Back when I wrote my book, I said the following about the AdWords Integration, I am very happy I can say the same about the Search Console Integration now!

Linking Google Analytics to AdWords and to Search Console is essential to professionals using those tools. It allows marketers and website owners to go beyond success and failure, to understand not only which campaigns are failing, but also what happens to users who do not purchase (or complete any other goal) during their sessions. This information is critical to optimize campaign performance by shedding light on which campaigns are failing as a result of suboptimal targeting, poorly designed landing pages, or poor ads.


Creative Uses to 5 Google Analytics Features


If someone were to pull open a toolbox and pick up a hammer without knowing what it is, their simplified description of it would probably be "a tool used to pound in nails." That simplification does the hammer little justice given the range of utility it actually has; it can be used for a lot more than just pounding nails.

Many features of Google Analytics, like the hammer, have a range of applications outside of their initial intended purposes. We are aware of what most of these features are for, but let's showcase some other ways to use them.

1. Inline filter to verify destination goals

When you create a destination URL goal, you're essentially flagging a pageview on a particular page, or set of pages, as a conversion. When building more complex goals that convert on multiple pages, or creating a funnel step that can span multiple pages, you'll need to start using regular expressions to make it work. Let's say, for example, you have five contact forms on five different pages, but all of them represent the same conversion: they all go to the same sales team and it doesn't matter which form the user fills out.

Regular expressions can get complicated and you're one mistake away from false positive conversions or inaccurate data. To top it off, you can't see historically how this goal would have converted. Luckily, there is an entire report in GA that will tell you which pages are getting pageviews. In GA's reporting section navigate to Behavior > Site Content > All Pages. Using the inline filter's advanced options, paste in whatever value you gave your destination goal URL.

Destination Goals
Advanced Filter

Recreate your goal configuration and match type in the inline filter and the resulting output will be all of the pages that would match that goal. Now you can create a goal that will be 100% accurate on day one!
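You can also dry-run your regular expression against a list of known page paths offline before saving the goal. A hypothetical sketch, where the pattern and paths are invented:

```python
# Hypothetical dry run: test a destination-goal regex against known
# page paths before saving the goal. Pattern and paths are invented.
import re

# Five contact "thank you" pages that should all count as one conversion
goal_pattern = re.compile(r"^/contact(-sales|-support|-press|-partners)?/thanks/$")

pages = ["/contact/thanks/", "/contact-sales/thanks/",
         "/contact-us/", "/blog/contact-tips/"]

matches = [p for p in pages if goal_pattern.match(p)]
print(matches)  # only the real conversion pages survive
```

Feeding it the page list exported from the All Pages report tells you exactly which historical pageviews your goal would have caught.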

2. Debug your implementation in real-time

Normally it takes 24 hours for data to be fully committed to the Google Analytics reporting database. If you're lucky enough to be on Google Analytics 360, the freshness window is four hours. For some purposes that's just fine, but let's say you've got a big marketing push going out tomorrow and you've been asking for weeks for development to put some tracking on your landing page. Development finally put it in and now they're on the phone asking, "Is this good? Is this what you wanted?" 24 hours won't cut it; you need to know now whether this page change happened and whether it's collecting what you want.

This is where real-time reports really shine. There are tools to debug the website and ensure it's sending to GA, but real-time reports will verify everything is being received loud and clear. The trouble starts when you want to verify this on an active part of your site. Clicking around to trigger the tracking you want may send the data, but imagine if there are a ton of other active users on the site.

Real Time

How can you be certain your actions, or sequence of actions, triggered what you wanted? First, isolate yourself in the real-time reports by making yourself completely unique as a user and filtering by it. To do so, navigate to the Real Time > Content report. Open a new browser tab to the site and page you're debugging and add a unique query string parameter to the URL. It can be anything, so long as you're the only user with it. Then, in the report, use the inline filter to filter by that unique value.

Cardinal Path website
Google Analytics Real Time

When you do so, you'll see a little blue bubble pop up at the top of the real-time reports showing what you're filtering by. The kicker is that as long as you stay in the real-time reports, that filter will persist. You can navigate to any part of the reports, from events and content to goals and anything else, and verify the data you're sending from the page. You can debug your implementation with no development experience needed.
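If you debug often, you can script the unique-URL trick. The helper below is a small sketch; the `ga_debug` parameter name is invented for this example, and any parameter works as long as you're the only user carrying it:

```python
import uuid
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def debug_url(url):
    """Append a unique query-string parameter so your pageview is the
    only one carrying it in the Real Time > Content report."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)          # keep any existing parameters
    query.append(("ga_debug", uuid.uuid4().hex[:8]))  # hypothetical name
    return urlunparse(parts._replace(query=urlencode(query)))

# Open the printed URL in a new tab, then inline-filter by "ga_debug".
print(debug_url("https://www.example.com/pricing"))
```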

3. Create engagement goals with custom segments

One of the more overlooked types of goals or conversions is user engagement. Maybe you have a website where there's nothing for sale, there are no forms to fill out, and there's no purpose beyond brand awareness. You can still measure success with an engagement goal: find out how much of your site your users are soaking in by creating a duration or pages/screens per session goal.

For this you'll need to set a threshold value for either time on site or number of pages viewed. The number you come up with will be unique to you. There is no industry standard here, because the length and content of a single page differs from page to page and from domain to domain. As the resident analytics ninja, it's your job to take the guesswork out of the data, so let's come up with a number tailored to your site.

Creating an engagement goal that focuses on pages per session essentially leverages the page depth metric. It asks: how many unique pageviews should be considered a conversion? There are different levels of engagement that denote varying levels of success. I would say the cream of the crop would be the top 10% of my sessions; I would love for everyone who visits the website to behave like them.

To find and define this cream of the crop, choose a 1-3 month period of time to benchmark against. That's your baseline or "control" for this test. In any report, at the top, choose Add Segment > New Segment > Conditions. Create a single condition that filters by "page depth greater than." Now all you need to do is keep changing the value you feed it until the percentage meter on the right closely matches the share of engaged sessions you'd like.

Google Analytics segment

Looks like my number is 2! Jump back over to the goal configuration and create a goal with a threshold of 2 pageviews per session, and you will have an engagement goal tailored to your website and business objectives. You can save the segment for future analysis if you'd like, but what we were really after was that percentage output in the advanced segment interface.
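The trial-and-error with the percentage meter can also be approximated in code if you export page depth per session. This is a sketch with invented sample data; it walks the threshold upward until at most the top fraction of sessions exceed it, mirroring what the segment interface does for you:

```python
# Hypothetical page-depth values for 20 sessions (one number per session).
page_depths = [1, 1, 1, 2, 2, 1, 3, 1, 2, 1, 4, 1, 2, 1, 1, 5, 2, 1, 3, 1]

def engagement_threshold(depths, top_fraction=0.10):
    """Smallest 'page depth greater than' value capturing at most the
    top `top_fraction` of sessions."""
    total = len(depths)
    threshold = 0
    # Raise the bar until the share of sessions above it is small enough.
    while sum(1 for d in depths if d > threshold) / total > top_fraction:
        threshold += 1
    return threshold

print(engagement_threshold(page_depths))
```

With this sample data the answer is 3, so the goal would be "greater than 3 pages per session"; your own data will give a different number, which is exactly the point.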

4. Intelligence alerts for technical problems

Depending on the size of your company or client, you may find yourself wearing a lot of hats around the office. Nothing induces pandemonium faster than the website going down. You got hired to be the marketing person and started using analytics to measure your work. You put analytics on the page, so now you're considered the website developer. Since you're the website developer, of course that now means you're in charge of the website hosting. There's no budget to fill any of these roles with someone who has the right background, let alone pay for the tools needed for upkeep.

An extremely simple alert you can set up in GA will tell you when data stops coming in, typically because the website went down. If GA isn't collecting data, then something fell off the rails. In GA, navigate to the admin panel, select the view that collects all of your data, click the Custom Alerts sublink, and create a new alert.

Intelligence alerts

The following configuration will send you a quick text message. The lowest period you can set is "day", which works well over weekends and for websites you're not on every single day. You can swoop in and save the day without having to beg for server monitoring services or software.
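GA's built-in alert covers this, but if you'd rather run the check yourself (say, from a daily cron job), a minimal sketch could look like the following. The fetch function is a stub you would replace with a real Reporting API call, and the addresses are hypothetical:

```python
from email.message import EmailMessage

def fetch_sessions_yesterday():
    """Stub: replace with a call to the Google Analytics Reporting API
    (e.g. via the google-api-python-client library)."""
    return 0  # pretend the site stopped collecting data

def alert_if_no_data(threshold=1):
    """Build an alert email when yesterday's sessions fall below threshold."""
    sessions = fetch_sessions_yesterday()
    if sessions < threshold:
        msg = EmailMessage()
        msg["Subject"] = "GA alert: no sessions recorded yesterday"
        msg["From"] = "alerts@example.com"          # hypothetical sender
        msg["To"] = "5551234567@txt.example.com"    # email-to-SMS gateway
        msg.set_content("Google Analytics recorded %d sessions." % sessions)
        # import smtplib; smtplib.SMTP("localhost").send_message(msg)
        return msg["Subject"]
    return None

print(alert_if_no_data())
```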

5. Hyperlinks in annotations

On any report with a timeline you have the ability to add annotations. These are basically little Post-it notes you can stick to the timeline to let others know about anything of particular importance for that day. It's a common analysis request from clients to explain peaks or valleys in the trend line, and rarely is it a simple explanation. These annotations are perfect for flagging that peak or valley as important to the team, or even to your future self; however, rarely is the 160-character limit enough to do your analysis justice.

In a concept stolen right from good internet marketing, these annotations are a perfect place to drop a shortened URL to an internal report. Creating a report on Dropbox, Google Drive, or your favorite document repository gives you all the login barriers you need to keep the report safe, and a shortened link to that file will fit in the annotation.

Want to learn more tips and tricks for Google Analytics? Download our free ebook, Google Analytics Tricks for Conversion Rate Optimization and check out our online courses.


Google Data Studio: A Step-By-Step Guide

Google Data Studio

This week the Google Analytics team released some very exciting news: Google Data Studio will be available to everyone! There will still be the robust Data Studio 360 for enterprises, but a standard version will be available to everyone to create beautiful and insightful visualizations.

This is great news for all data professionals and enthusiasts, as until now there haven't been any first-class data platforms that can be used to access, transform, visualize, collaborate on, and share data at scale, for free!

In this article I will discuss how Data Studio relates to other Google Analytics products (specifically the Google Analytics 360 Suite) and then I will go on to show how to use Data Studio (DS). There are a few areas I will focus on: how to access / transform / manage your data and how to visualize / collaborate / share it. (I know, it's a lot to focus on, but stay with me!)

Data Studio and the Google Analytics platform

You probably heard about the Google Analytics 360 Suite, a platform that will help you evaluate the full customer journey and drive results. The Suite is comprised of 6 products, as schematized below.

Google Analytics 360 Suite

Here is the mission of each of the Suite products:

  • Tag Manager 360 - Data Collection: Get more data and less hassle with powerful APIs and partnerships.
  • Analytics 360 - Digital Analytics: Gain new insight with a total view of the customer experience.
  • Attribution 360 - Marketing Analytics: Discover the true value of all your marketing channels.
  • Optimize 360 - Testing and Personalization: Test and deliver more personal experiences on your site.
  • Audience Center 360 - Audience Analytics: Match the right people with the right message.
  • Data Studio 360 - Data Analysis and Visualization (cherry on top!): Build beautiful and shareable reports, with all your data in one place.

If you are using the free products, you will have access to the current versions of Google Analytics, Tag Manager, and the new Data Studio!

While the first 5 products are about collecting, analyzing and acting on the data, DS is about letting the data speak, uncovering insights through visual exploration. It is a great tool to craft a data story that can resonate with all its consumers.

Google Data Studio Overview

Let's start with what you can do on Data Studio. I like the diagram below, from my colleague Nick Mihailovski, Lead Product Manager for Data Studio. I think it summarizes well a common analysis workflow and the tool capabilities.

Data Analysis workflow

Connect - The first thing you have to do when working with data is making sure you have it! Once you do, check whether any preparation is required (e.g. calculated fields, different formatting, cleaning up) in order to make the data useful.

Visualize - Once the data is ready to go, you will open your canvas and start connecting the dots, beautifying the charts and making sure they tell an insightful story. The cool thing is that now you can collaborate and work across cities or continents in the same way that you already do with Google Docs, Sheets and Slides. Analysts of the world, unite!

Share - And since you are having so much fun and finding so many insights, why not share them with your colleagues? I am sure they will appreciate it! Even though sharing is now as simple as a click of a button, remember that data is a serious business; make sure you think it through before you share.

Now that you know what you can do with Data Studio, let's do it, login at https://datastudio.google.com. You will see something similar to the following page.

Data Studio interface

The interface is pretty straightforward. You can choose an account (if you have multiple) in the top right corner; and you can see all, shared, or trashed Reports (the default page) and Data Sources. Easy peasy.

Let's dive in and look at some cool stuff available through DS. I will start with Data Sources, the interface used to connect your data. Following that, I will discuss the Reports interface, where you will let the artist and businessperson in you go wild :-)

Data Sources: Access, Transform, and Manage

Now to the data… click on Data Sources in the left “sidebar" (see screenshot above). Maybe you will already have some Data Sources in there, maybe you won't. In any case, you will see a “+" sign in the bottom-right corner of your page, click on that to create a new Data Source.

The first choice you have to make is where you are getting the data from: Google Analytics, BigQuery, Sheets, etc. Once you click on one of them, choose among the accounts you have access to and click the “Connect" button. You will get to a screen similar to the following.

Data Sources

  1. Create a calculated field: you can use this to create new metrics based on a formula that transforms one or more existing metrics. There are dozens of operators available, here is a reference list.
  2. Field type: choose the formatting and the type of your metric. Here are the top level types, each has a bunch of options: Numeric, Text, Date & Time, Boolean, Geo.
  3. Field aggregation: choose the aggregation that should be used for your metric. For example, if your metric is a ratio such as Conversion Rate, you should use Average, if it is an absolute value such as sessions, you should use Sum.
  4. Create a report: let the fun begin!
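As a hypothetical illustration of #1 above (the field name and metrics are just an example), a calculated field can derive a new metric from existing ones with a simple formula:

```
Conversion Rate = Goal Completions / Sessions
```

Following #3 above, since this is a ratio you would then set its aggregation to Average rather than Sum.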

Before we dive into the Reports, I would like to focus for a moment on the beauty of how Data Sources work. It is not just the fact that you can bring in data from other systems that counts, it is also how you can use them. One of the DS capabilities I like most is that you can use Data Sources at three different levels:

  • Report level: The highest level component in the Data Studio inheritance chain. By attaching a Data Source to a Report you will be able to use it across all pages; it is possible to have multiple Sources attached to a Report, but you will choose one as the default, in case a Data Source is not set in the Page or Chart level.
  • Page level: A component of a Report. By setting a Data Source to a Page, you can make it the default to that specific page, even if another Data Source is set as the default in the Report level.
  • Chart level: A graphical representation of data within a Page, the lowest level component in the inheritance chain. The flexibility to set Data Sources to specific Charts has a great advantage when building dashboards for multiple websites, countries, business units or departments.

Now it is time to wear your oldest Marvel (DC is OK too) T-shirt and roll up your sleeves. Let's discuss some visualization capabilities and best practices to start filling your canvas.

Reports: Visualize and Collaborate

After you create a Data Source, you will be given the option to create a Report right from there (see #4 in the screenshot above). But more often than not you will log in to your Data Studio account and create a Report right from the overview page. You will find the “+" on the bottom-right corner, click on it.

The best way to start understanding the Reports interface is by reviewing the excellent map published in the Data Studio Help Center (you will find yourself having this Help Center for breakfast, lunch and dinner, it is an awesome resource!) The descriptions following the chart are a simplified version from the link above.

Reports interface

  1. Click on one of the chart tools to draw a chart in your report
  2. Your canvas, enjoy!
  3. Configure data, settings and styles for any selected component.
  4. Click to switch between edit and view mode.
  5. Click to share this report.
  6. Add text to your report.
  7. Add an image to your report.
  8. Draw a shape in your report.
  9. Add a date range control to your report or a filter control.
  10. Undo and Redo.
  11. Mouseover to see data status and click to update the cache.
  12. Switch between Report pages, organize / add / remove pages from the report.
  13. Back to Homepage.
  14. Click to change the Report name.
  15. Shows who is viewing or editing the report.
  16. Click to manage your Google profile.
  17. Click to send us feedback (do it!)

Wow, that's a lot of fun, and we've barely started! Before we dive into best practices, I would like to reinforce #15 above. Once you create a Report you can share it with your colleagues to harness the collective knowledge of your company. This is a really big thing!

Below is an example where I am collaborating with my colleagues Tahir and Lizzie on a Report (they are really skilled data analysts!) As you can see, I am editing the top line chart while Lizzie (pink) is editing the donut chart and Tahir (turquoise) is editing the map. The cool thing is that you can actually see all changes in real time.

Data collaboration

Reports Best Practices

Since there are so many options available, and so many options within those options, I can't possibly go over everything; instead I will discuss some best practices I believe are critical to any Report. Data Visualization is an extremely rich world, and I will not discuss fonts, colors, shapes, and charts in depth; since so many smart people have already written about that, I am taking it as a given!

“In many ways, visualization is like cooking. You are the chef, and datasets, geometry, and color are your ingredients. A skilled chef, who knows the process of how to prepare and combine ingredients and plate the cooked food, is likely to prepare a delicious meal. A less skilled cook, who heads to the local freezer section to see what microwave dinners look good, might nuke a less savory meal. Of course, some microwave dinners taste good, but there are a lot that taste bad."

Nathan Yau, Data Points: Visualization That Means Something

I am going to use the Page below as an example of 5 best practices I find can be applied to the majority of Reports, but YMMV.

Reports best practices

1. Filter controls give power to the users

Filter controls are like coffee with chocolate: they will drive your users forward and offer a rich analysis experience. If you choose the right filters and design them well, analysis will be easier and more effective; there is nothing more frustrating than doing an analysis and being limited by the lack of filters.

“So how should I design my filters?" Said no one ever. But I am glad you asked now! :-)

I always invest some time in understanding the "foundational" dimensions of the Report and of each Page separately. Ideally, you would want a set of constant filters across all your pages so that users feel more comfortable looking through the data, but this is not always possible… at least try to keep the same look and feel and some of the same filters. In the screenshot above, I used the same set of 5 filters across multiple Pages (see also #2 below).

In terms of design, I like the “Expandable" option, IMHO it looks nicer, but for short lists it might be useful to have the standard filter. Here is a great video describing this feature in detail.

2. Headers and page dividers are great for organization and consistency

To reinforce the point above on having a consistent experience across pages, I think a header can be very effective in a multi-Page Report. Not only does it bring a consistent experience to users, it also quickly informs them what data is available on a specific Page. The header doesn't need to be too complex; a full-width strip with a light background (as above) will do. It can also include important messages to users, links to other resources, or even a date picker (if you run out of space).

Edward Tufte described the importance of consistency in design in terms of the "economy of perception" that results:

"(...) once viewers decode and comprehend the design for one slice of data, they have familiar access to data in all the other slices. As our eye moves from one image to the next, this constancy of design allows viewers to focus on changes in information rather than changes in graphical composition. A steady canvas makes for a clearer picture."

Edward Tufte, Envisioning Information

Page dividers can also be highly effective at separating different types of content. For example, if you are showing data for 4 different business units on a Page, you might consider page dividers to make the separation clear. But again, make sure they are consistent across pages.

3. Chart diversity makes the report more engaging

When it comes to the charts themselves, diversity is a very positive factor; a Report containing only tables or only bar charts is a bit boring to look at for too long. Having different chart types makes the analysis more interesting. Of course you should still use line charts for trends, bar charts for group comparisons, and tables where exact values matter, but try to include some diversity in each Report… even if it means using a Pie Chart!

The Chart Chooser is a good resource on how to choose the type of visualization you need for your data.

4. Color styling helps guide the eye

Do not overuse color! Nothing states that more succinctly than Tufte's Data-Ink Ratio:

"A large share of ink on a graphic should present data-information, the ink changing as the data change. Data-ink is the non-erasable core of a graphic, the non-redundant ink arranged in response to variation in the numbers represented."

Edward Tufte, The Visual Display of Quantitative Information

In the example above, I used color to indicate which charts are most important and show the most interesting insights, making the top half of the page colorful and the bottom half shades of grey. I think this helps direct the eye. I also used only 2 colors, which is usually not enough; then again, personally I find more than 5 colors hard to read.

5. The Report purpose informs the design

In addition to the above, remember that the purpose of a visualization should drive the conception and creation of your Reports. You must think about the purpose as a whole: what are your users looking for, and how can you convey it in the best possible way? Stephen Few summarizes this very clearly:

"When you use tables and charts to discover the message in the data, you are performing analysis. When you use them to track information about operational performance, such as the speed or quality of manufacturing, you are engaged in monitoring. When you use them to prepare for the future, such as in budgeting, you are planning. When you use them to pass on to others a message about a business, however, your purpose is communication, no matter what the content. All of these are important uses of tables and graphs, but the process that you engage in and the design principles that you follow differ for each."

Stephen Few, Show Me the Numbers: Designing Tables and Graphs to Enlighten

Sharing is caring: in moderation!

As you will probably notice, Data Studio uses the Google Drive sharing model, which you are hopefully acquainted with. It is important to note that when you share a Report or Data Source with a person, access is granted regardless of whether that person has access to the underlying data in Google Analytics, Sheets, BigQuery, etc. This means it is extremely important to make sure the data can be shared with that person.

Sharing data is great, but only when the right people have the right access to the right data.

Let's look at an example.

Data sharing

In the sharing settings above, you will notice that three people have access to the report in question: I am the owner, Tahir can edit, and Lizzie can view the Report. You will also notice, in the first checkbox at the end of the settings, that even though Tahir can edit, he will not be able to add new people. Also note that I can disable the options to download, print, and copy for commenters and viewers (in this case Lizzie).

Closing Thoughts

Wow.

I have been playing with Data Studio for a while now, and as you can see I am pretty excited about it. I think this is a great opportunity to take the Analytics community one step higher, improving data reporting and visualization standards. Through Data Studio we will be able to broaden the horizons of our community, bringing more and more people into the data world. If you got this far, I am guessing you are quite excited about Data Studio and the industry in general.

Data Visualization reading suggestions

Happy visualizing! ;-)


Empowering Google Analytics with Google Data Studio


You might have heard a few weeks ago that the Google Analytics team launched Data Studio (DS), a robust platform for reporting and visualising data. If you missed that piece of news, take a look at this step-by-step guide - it's a great starting point.

DS can be used to visualise a number of different Data Sources, including AdWords, BigQuery, Google Sheets, and others. Whilst all of these bring considerable benefits, in this article I will take a deeper look into how Data Studio can empower your Google Analytics data and focus on how to do the following:

  • Bring multiple GA accounts together in one place
  • Highlight data for non GA users
  • Customise and brand your report to your own style guidelines

Connecting your data

The first thing you'll need to do when visualising GA data in Data Studio is to create a new Data Source. Below are the steps you should follow:

  1. Navigate to the Data Source section of Data Studio
  2. Press the + button (bottom right corner of the screen)
  3. Select the Google Analytics connector
  4. Opt for the account that you're interested in visualising - this will display all the properties that sit within the account
  5. Select a property - this will display all views that sit within that property
  6. Select a view
  7. Press "Connect to data"

Completing step 7 will display the data schema of your GA data (see screenshot below). You'll recognise all the GA dimensions and metrics that you know and love from the GA interface, and you should also see familiar custom dimensions and custom metrics that you may have created within your GA property.

Connecting Google Analytics to Data Studio

One last thing to do to keep your dashboarding nice and tidy - give your Data Source a name. In this instance, this is my Android app view, so I'm going to label this accordingly.

Congratulations, you've now successfully created a GA Data Source!

Now seems as good a time as any to note what I think is one of the best things about using Data Studio (and this is especially useful for those of you who are GA 360 users): the GA connector in Data Studio provides you with the same level of processing and sampling as you would see in the GA interface. So for those of you who have come up against lower sampling thresholds when using the API to visualise data, this will be huge news!

Creating a report

Now we have a GA Data Source all configured and ready to go, we can start visualising the data. We left off looking at the Data Source schema. From here press "Create Report" - this opens up a new tab in the Report part of Data Studio and you'll be asked if you want to connect this Data Source to the report. Click "Add To Report".

Creating Google Analytics report

You are now presented with a blank canvas on which to start painting your GA picture. Let's say you want to draw out a report that looks like the Audience Overview report in GA.

  1. Add a time series chart and choose your metrics (plot more than one series, if you like)
  2. Add a geo map to show where all your users are located
  3. Add something that you are personally interested in – something that you can't add into the Audience Overview report yourself
    • Do you like to get a quick glance at total screen views? Then add a table with screen as your dimension and screen views as your metric
    • Is it important that you see how many unique events are firing for a particular event? Add a scorecard that looks at unique events for that one event category

As you can see below, we've created something quite bespoke without spending much time doing so - this example took less than a couple of minutes to build. Moreover, it looks smart, it's aesthetically pleasing, and it's personalised to visualise the data that's most important to you.

Data Studio example

Seasoned users of GA are probably thinking "so far Data Studio looks good, but how is this different from the dashboard section of the GA interface?". Indeed, dashboarding in GA is a great feature and one that is highly adopted (see some examples). It allows you to highlight key figures very quickly, and the ability to share these with all the users within the GA account is also brilliant... create a dashboard that your CEO will be interested in and share it with them - they won't need to spend time digging around in the interface and can grasp the key points quickly.

Data Studio certainly offers similar features, but it extends the capabilities of GA's dashboard feature. I'm not sure which of my points below is more important, but I'll start with one that I, personally, can spend a lot of time on...

Styling your report

Using the control panel on the right hand side, you can choose to customise and style your reports pretty extensively. You can modify the theme of the page - change the background colour/add banners/choose font - and generally overhaul the blank canvas to your brand's style guidelines. But you can then get a lot more specific and granular about the style details... select each individual chart in turn and modify the colour of each series, change the colour of the text or move the chart legend around. There are plenty of options to choose from when styling your report, I'll just highlight a few of my favourite things to do when customising:

  1. Add a "featured" box (with curved edges)
  2. Add a logo or photo
  3. Simplify the look of time series charts by removing grid lines (and even axes)

Data reporting style

Adding multiple GA accounts (or other Data Sources) to a single report

We're now going to look at something which will hopefully prove extremely useful to a lot of data analysts. You'll need to create a new Data Source, so go ahead and follow steps 1-7 from earlier. This time I've connected to my iOS GA view (we connected to an Android GA view last time). By creating this second Data Source and attaching it to the same report you've already created, you will be able to visualise data from two different Data Sources in one place. Finally(!) you can compare data across GA accounts in a really simple and easy way. Gone are the days of having two different browser windows open with one account in each. You can now view a completely separate GA account in one, centralised location.

For this, I think we actually want to see a like-for-like comparison of the data across the two accounts so we're going to resize our current components and move them over to the left hand side of the page. Once settled there, we can then copy and paste them and move the duplicates to the right hand side of the page. Now we can go about changing the underlying Data Source of each chart. By selecting each chart in turn, we can use the properties panel to edit the Data Source being used. If we switch from the Android view to the iOS view, we can immediately see the chart adjust to using this new dataset. Isn't it great that you can actually copy/paste charts and just change the Data Source?!

Multiple Data Sources

Once we've changed each of the Data Sources, we can then very simply stand back and look at the differences between the two apps. Is one performing better than the other? Why is that? Is there an event that's not firing properly? Is there a particular screen where users are dropping off? Comparing these views side by side couldn't be easier in Data Studio, and why stop there? You're not just limited to comparing GA accounts – you can add in data straight from AdWords (using the AdWords connector) or add some data you've manually pulled together into a Google Sheet.

Sharing your reports

There is one last thing I want to mention before Daniel tells me this piece is too long for his website ;) It's an important one, so very much worth spending some time on.

As mentioned in Daniel's introductory post on Data Studio, sharing reports is really simple. Using the same functionality you see within Google Docs and Google Sheets, you just press the "Share" button to allow other users to either "View" or "Edit" a report. In addition to these options, there is another way to enable (or limit) users as to what they can see.

Owner's vs Viewer's credentials

In the process I've noted above, there was one option I missed when setting up the Android and iOS Data Sources. When you get to the schema stage, and before you jump into creating a report, you can select whether you give users "Owner's credentials" or "Viewer's credentials".

Data Studio Owner Viewer credentials

If we opt for the "Owner's credentials" option, this means that the report accesses and visualises the data based on the credentials that are possessed by the owner of the report. So, I have Admin access to the GA account where the Android data is being pulled from. Daniel, however, doesn't have access to this GA account at all. By setting the Data Source option to "Owner's credentials", when I give him View or Edit access to the final report I've created, he will be able to see the visualisations in Data Studio. He can filter and query the data to his heart's content, even though he doesn't have explicit access within the GA account itself.

However, if I set the Data Source to have "Viewer's credentials", the user of the end report would have to use their own credentials in order to see the data. In other words, in the situation where Daniel doesn't have access to the Android view, he won't be able to see the data in the report I've built. Some error messages will pop up and he won't be able to filter or query any of the data.

Using these options allows you to share or hide data from those you wish to have access. These selections are up to you to make, and you can vary them based on the access you wish users to have. Ultimately, this is all about the security of your data, and Data Studio gives you a number of options to ensure your data is only seen by those you choose.

Closing Thoughts

Thanks for getting this far! Hopefully you've learned something new and now feel empowered to go and start building out some lovely looking reports in Data Studio. My goal was to show you just how simple it is to connect to GA and from there, indicate how to really make the most of Data Studio's fantastic features.

Combining multiple Data Sources in one dashboard is something a lot of data analysts need and Data Studio makes this as simple (and quick) as it can be. Now add in the ability to customise and brand your reports and it won't be long before everyone is asking you to create a report in DS for them. Finally, Data Studio doesn't forget to ensure all your data is kept away safely, giving you multiple options to decide who views your data and what they can do with it.

Happy dashboarding!


Google Analytics 360 & DFP Audience Sharing

Google Analytics 360 & DFP Integration

A few months ago I wrote about two new Google Analytics 360 (GA360) integrations for ad supported websites: DoubleClick for Publishers (DFP) and DoubleClick Ad Exchange (AdX). As I said then, I believe they are major game changers, they provide a robust solution to measure and optimize ad supported websites. I still believe that, even more so!

In a nutshell, the integrations brought two great improvements at that point:

  1. Data accuracy and completeness: a user that left the website through a click on a DFP or AdX unit was, in the past, considered a simple abandonment, but with the integration those clicks are "seen" as ad clicks. This also allows a multitude of new analyses using metrics that couldn't be merged before.
  2. Reporting: having all the data in one centralized place can save a lot of time. The GA360 interface can be used to create custom reports, dashboards and emails.

But since my last article, a few important things changed in the product. Last week, the GA360 team released an outstanding case study (link to PDF) discussing how AccuWeather delivers enhanced value to advertisers with DoubleClick for Publishers and Google Analytics 360. Below is a descriptive scheme shared in the case study.

Google Analytics 360 and DFP integration

In this article I will discuss an important development in the DFP & Google Analytics 360 integration: the Audience Sharing feature (beta) that allows publishers to share Google Analytics 360 Audiences with DFP bringing a series of benefits.

Google Analytics 360 Audience Sharing BETA

Besides the reporting capabilities already discussed in my previous article, the DFP integration enables deeper optimization opportunities with the Audience Sharing feature (beta), a way for publishers to share audiences they created using Google Analytics 360 data directly into DFP. These Audiences can then be used to target users that performed a specific task, read a specific type of content, came from a specific campaign, or any other information available on Google Analytics. You can do that either by building a segment on Google Analytics and building an audience out of it or by directly creating an audience and sharing it with DFP.

Below I discuss two use cases for this feature: optimizing ad serving by not showing some ads to some users (decrease impression waste) and providing better targeting based on user behavior (optimizing targeting).

1. Decreasing impression waste

It is very common to use DFP to serve house ads, which are intended to promote an action inside your website (as opposed to promoting an advertiser); this could be, for example, a registration for a membership or a page where you are trying to sell something. For Online Behavior, I used house ads to promote my book, showing an ad unit below every post on the website.

However, if a user had already visited the book page and clicked on one of the links to purchase it, I was wasting those impressions, and it would be more profitable for me to show Backfilled AdX ads instead of the book promotion to that group of users. Easy peasy!

The first step was to create an audience on Google Analytics including all users that have completed a goal of clicking on one of the book links on that page. Note that in the first step in the screenshot below I chose to share this audience with my DFP account.

Google Analytics 360 Audience

Once I finished creating this audience, I went on to DFP and edited my Book campaign line item to include a targeting criteria as shown in the screenshot below: Audience Segment is not Book Viewers.

DFP Targeting

Voila! Users who clicked on the book links no longer saw the promotion; they saw backfilled AdX ads instead, and that helped raise my revenue :-)

2. Optimizing targeting

You might also go the other way around.

Suppose you have a paid subscription along with content you provide for free to your readers. And suppose that you are currently using Google Analytics to measure those subscription transactions (you might as well use a Goal). That means you could identify which of your users are starting but not completing your subscription process. You could then create an Audience of all those users and remarket to them using house ads on your own website, trying to re-engage them with the funnel in future sessions. This would follow the same process described above.

Or suppose you are selling inventory to advertisers that are interested in people who care about sports. One simple technique would be to show the ads only on sports pages. However, some of your sports-loving users might visit the website and only look at the news section in a given session; they are still interested in sports, just not browsing it at that moment. With Google Analytics Audiences, you can persist "interests" across sessions, meaning that a person interested in sports remains part of a sports audience even if they don't view a sports page in the current session. That increases the number of impressions you have available for sports fans.

Here is a similar example from the AccuWeather case study:

The integration between DFP and Analytics 360 is helping AccuWeather advertisers in other ways. For instance, one of its advertisers, a health related consumer product, wanted to survey users who had seen its ads on AccuWeather’s website. AccuWeather used Analytics 360 data to build a custom audience, blending those who had been exposed to that company’s ads on its website with location data to reach the right users.

AccuWeather shared this audience with its DFP account, which delivered the survey to that select audience. That’s how the advertiser learned that those who saw its ad on AccuWeather.com were actually 6.5 times more likely than the typical user to buy its product within the next 30 days. It’s not too surprising that this advertiser is making additional ad buys with AccuWeather this year.

Last, but not least, suppose you have sections of your website which are not very good at engaging your returning users, such as news articles. As in the previous paragraph, if you have different audiences for different interests, you could use DFP on the news articles to engage your users by showing them an interesting post based on their interests, this would keep them engaged with the website by providing them with targeted content.

If you are a Publisher, I am sure you are super excited about the opportunities this tight integration brings to your business. Happy analyzing / optimizing :-)


A Happy Marriage Between Google Analytics & iFrames

Google Analytics & iFrames

An iframe is a web page embedded in another page. It's a handy little hack that lets you include content from another page in your pages.

I bet you've visited many websites that use iframes and you probably didn't know it. You might even have iframes on your site without you being aware of what they are, how they work or how they might affect your Google Analytics data.

Before I go deeper into the GA issues an iframe could cause, it is important to make sure everyone understands what an iframe is and what it does. In this article I will explore some common use cases for iframes and discuss how they can be used well without wreaking havoc on your GA data.

iframes loading content from the same domain

Let's use an iframe to load Daniel's biography on this page:

We're loading content from online-behavior.com on this page which is also on online-behavior.com. This seems pretty trivial but actually, it's making Daniel's GA data do some unexpected things.

What happens to Online Behavior's GA data?

When Daniel checks his GA data he'll see two pageviews. Unfortunately he'll see very low values for the time on page and bounce rate metrics for this page...

As this page loads and sends a pageview hit to GA, the very next thing that happens is that the browser displays the iframe. The iframe loads Daniel's bio page, which in turn sends a pageview hit to GA too. As far as GA is concerned, this looks like you clicked on this article and then very quickly clicked through to Daniel's bio. GA doesn't know that the bio is loaded in an iframe. This means the time on page for this page is crazy low and it's also pretty impossible to bounce from this page as we're forcing two pageviews to fire in a very small space of time.

This is not great as the data isn't a true reflection of user behaviour. You may have spent a few minutes reading this far (thanks!) but GA doesn't show this - it shows a time on page of 1 second or less.
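To see why the metric collapses, remember that GA approximates time on page as the gap between consecutive hits in a session. Here's a minimal, illustrative sketch (hit format is hypothetical, just page path and timestamp):

```python
def time_on_page(hits):
    """Approximate GA's time-on-page metric: the gap (in seconds)
    between consecutive pageview hits in the same session."""
    return {page: next_ts - ts
            for (page, ts), (_, next_ts) in zip(hits, hits[1:])}

# The article page fires a pageview; the iframe's bio page fires
# another one almost immediately afterwards.
hits = [("/article", 0.0), ("/bio", 0.4)]
print(time_on_page(hits))  # {'/article': 0.4}
```

Because the iframe's pageview arrives a fraction of a second after the article's, the article gets credited with almost no time on page, no matter how long the reader actually stays.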

What's the fix?

Typical Analyst answer - it depends. In this instance, it depends what Daniel wants to have happen to his data. You need to decide what's right for your data. Let's explore some options.

You could take page tracking off the iframe. The metrics for this page will be restored but what about the tracking on Daniel's biography?

What if this wasn't a trivial example and the iframe contained a form? How's that going to be tracked?

You could filter the iframe pages into a view for conversion tracking but make sure you know whence the user came when attributing conversions to channels. The solution to preserve your data quickly gets very complex as is normal with an 'it depends' type answer.

The actual fix is to not do this in the first place. Your data is compromised as soon as you compromise the page implementation with an iframe.

If you have key content, like a form, that you want on the page, build it properly and track it properly. If it's key to your business, making the effort to build your pages right is a sound investment that will deliver a return.

iframes loading content from a Subdomain

Loading the content in the iframe from a subdomain is just as bad as loading the content from the same domain. It's a compromise. Don't do it.

iframes loading content from a 3rd party domain

You've probably seen this scenario a lot. Think about sites where you've seen links to news stories with "Recommended by Outbrain" or "Sponsored content by Taboola" on news pages. It's a common content syndication solution. Most sites showing adverts will display the ad in an iframe. The ad is chosen by another system and displayed from a 3rd party ad server.

These use cases are good examples of where iframe behaviour is desirable. Browser security settings (The Same Origin policy) prevent 3rd party iframe content from interacting with the outer page. This is a good thing for the safety of users.

Whilst this case is well understood and a correct use of the technology, this is actually where you need to be properly terrified. Okay, that's a little strong but you do need to scrutinise the iframe behaviour very carefully.

Content syndication

So you've displayed clickbait content syndication on your site. If you're syndicating 3rd party content to monetise your site, your data will not be affected. If you're using Taboola or Outbrain to syndicate your own content you need to check your data and GA configuration.

If you see an increase in sessions and a drop in the average session duration then you need to take a look at the referrers report. Do you see Outbrain or Taboola in there? Yes? Make this change to the Referral Exclusion List (found in GA property Admin under Tracking Info):

Referral Exclusion List

This prevents session breakage. All referrals in Universal Analytics create a new session. Clicks on content syndication links to your site should not create a new session. Adding these domains (and their subdomains) to the referral exclusion list stops new sessions from starting and preserves the original traffic source.

If you've seen referrals from these domains and you're syndicating your own content, through these networks, adding the domains to the referral exclusion list now won't fix your data straight away because of the default 6 month campaign timeout. Read more about referral exclusions at https://support.google.com/analytics/answer/2795830

You could patch your data using a modified channel grouping such as this:

Modified channel grouping

You won't see the Taboola or Outbrain referrals in your historic data but it won't fix the session breakage - it's a patch. The best bet is to get the content syndication domains in the referral exclusion list before using them on your site.

3rd Party Ad Serving

Okay, say your site is a publishing site rendering an ad. Maybe the ad click is heading off to another 3rd party. The advertising tech probably handles the ad serving and click measurement. No worries.

However, there are two scenarios worth considering.

1. The ad click takes users to your site

DoubleClick ads are rendered in iframes. If you're rendering "internal" banners to drive promotions on your site (sometimes called House Ads), the ad clicks will end up on your site.

Fine, you're using something smart like DoubleClick for Publishers (DFP) so your measurement (assuming you've setup the tool correctly) will be unaffected (read more about the GA360+DFP integration).

Check your data if you use alternative marketing software to render your house ads. Make sure your precious channel data is not polluted by referrals from the iframe 3rd party domain.

2. The iframe loads resources from your site

What if you're rendering rich media ads and/or there is some communication between advertiser and publisher content? You might use the DoubleClick Rich Media iframe approach.

If these instructions aren't followed and DARTiframe.html doesn't exist or can't be found then you'll end up with a third party requesting content from your domain on the user's behalf resulting in a 404 error and session data breakage.

You'll know you're in trouble when you see a step-change drop in your session duration and an increase in referrals from tpc.googlesyndication.com. You may also see inflated requests for DARTiframe.html (with tpc.googlesyndication.com as the referrer) if you're tracking 404 errors. There is a wealth of documentation out in the wild that might suggest your ad autotagging is incorrect; this will lead you down a false path. Look for DARTiframe.html requests to confirm that autotagging is not the issue.
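If you track 404 errors into GA, a quick pass over exported hit data can confirm the diagnosis. This is a sketch only; the export format (dicts with `page` and `referrer` keys) is hypothetical:

```python
def dartiframe_requests(error_hits):
    """Filter a list of 404-error hits down to the DoubleClick symptom:
    requests for DARTiframe.html referred by googlesyndication.com."""
    return [hit for hit in error_hits
            if "dartiframe.html" in hit["page"].lower()
            and "googlesyndication.com" in hit["referrer"]]

# Hypothetical export of 404 hits (page, referrer) from GA:
hits = [
    {"page": "/DARTiframe.html", "referrer": "https://tpc.googlesyndication.com/"},
    {"page": "/missing-page", "referrer": "https://www.google.com/"},
]
print(len(dartiframe_requests(hits)))  # 1
```

A non-empty result points at the missing DARTiframe.html file rather than at autotagging.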

Closing summary

Used properly, iframes are excellent tools to render 3rd party content safely. Used badly, the user experience suffers, your data becomes meaningless and your site becomes a rat's nest of hacks and make-do compromises.

Choose your markup carefully and be aware of what the possible effects are on your data.

In a fast moving, complex organisation you may not have 100% control over all pages and all campaign implementations. Maintain visibility of your data and prepare to act if you see symptoms of pollution, reduced data quality and unexpected step changes in key metrics.


Fixing Five Common Google Analytics Setup Mistakes

How to Fix Five Common Google Analytics Setup Mistakes

Google Analytics can help your business flourish, but only if you get the setup right. Who wants to make decisions based on incomplete or inaccurate data? The first, crucial step is to implement and configure Google Analytics in line with your business objectives.

In this post I will touch upon a few key mistakes that I found often when doing Google Analytics setup audits. And I will provide clear solutions for each of them. I know your business is different from every other business out there. But there are some general, solid guidelines that apply to every website and business that you should follow.

1. Tracking Code Incorrect or Incomplete

Tracking issues are at the top of my list! Your tracking has to be properly installed, otherwise you might miss out on capturing key data on your site. Incomplete and/or incorrect data can lead to bad marketing decisions that hurt your business. So if your data isn't right, you'd be better off not making data-driven decisions at all!

Independent of which web analytics tool you use, you should invest the time and knowledge to get your setup right. For demonstration purposes I focus on Google Analytics in this article.

Tools for Checking Your Implementation

There are different tools out there to debug your implementation and ensure everything is working properly. I will elaborate on two tools that I often use:

  • Tag Assistant: perfect fit for detailed debugging and problem solving on the "page" level.
  • Screaming Frog SEO Spider: perfect fit for debugging on the website level, across all pages.

Tag Assistant

Google Tag Assistant

Tag Assistant is a Chrome extension that can be used to create, validate, diagnose, and troubleshoot your Analytics data on each of your pages. Once you create a recording and detect a problem, you can check again to verify whether your tags are firing correctly following a fix. I usually debug specific issues with Tag Assistant, but don't use this plugin for a broad code scan on my website.

Learn more about it in the Google Analytics Help Center.

Screaming Frog SEO Spider

Screaming Frog SEO Spider is another great tool to find out whether your tracking code is correctly installed on all of your pages. There are two versions available: FREE, for websites with 500 URLs max, and Paid (£149) for more than 500 URLs.

This is how it works:

  1. Install Screaming Frog SEO Spider.
  2. Navigate to "Custom" section and select "Search".
  3. Create two filters, either based on your GTM snippet or UA code (hardcoded). You need to set up both a "contain" and "does not contain" filter to more easily interpret the results later.
  4. Start the crawl.
  5. Review the crawl results in the "custom" section.

In my crawl, there was one URL that didn't contain the GTM container snippet: online-metrics.com/cheat-sheet. This is because I implemented Google Analytics hardcoded on that page instead of using the GTM snippet.
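The "contains" / "does not contain" check that Screaming Frog performs can be sketched in a few lines of Python. This is an illustration, not the tool itself; the container ID and page list are made up:

```python
def check_gtm_coverage(pages, container_id="GTM-ABC123"):
    """Split crawled pages into the two Screaming Frog filters:
    pages that contain the GTM container snippet and pages that don't."""
    contains = [url for url, html in pages if container_id in html]
    missing = [url for url, html in pages if container_id not in html]
    return contains, missing

# Hypothetical crawl results: (url, page source) pairs.
pages = [
    ("https://example.com/", "<html><!-- GTM-ABC123 --></html>"),
    ("https://example.com/cheat-sheet", "<html><!-- UA-12345-1 hardcoded --></html>"),
]
contains, missing = check_gtm_coverage(pages)
print(missing)  # ['https://example.com/cheat-sheet']
```

Any URL in the "missing" bucket either lacks tracking entirely or, as in my case, uses a hardcoded snippet instead of GTM.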

There are other solutions for debugging your setup as well.

Tracking code: Areas to Explore

At the very minimum you should check three things with regards to the tracking code installed.

Tracking code version

If you haven't done so already, make sure to upgrade to Universal Analytics. Tag Assistant will show which version you're using and whether you need to migrate.

Code placement

Where to place your tracking code depends on whether you use Google Tag Manager. If you use GTM, implement the container snippet just after the opening body tag. If you use the hardcoded GA script, implement it right before the closing head tag.

Cross-Domain Tracking

You need to implement and configure cross-domain tracking if the user journey for your brand spans across multiple domains.

You will lose the original referral information of your visitor if you don't get this right. A new session is started when the visitor moves from one domain to the other.

I highly recommend to use Google Tag Manager when setting up cross-domain tracking, as explained in this article.

Note: for all new implementations I recommend using Google Tag Manager. It's more flexible for configuring Google Analytics to your needs. Furthermore, it helps with other tracking needs and lessens your dependence on IT resources.

Getting your code implementation right (based on your measurement plan) is a crucial step in deriving actionable insights from reliable Google Analytics data. Get this wrong and all your next steps won't make sense.

2. No Goals Setup or Wrongly Implemented

Google Analytics goals are the backbone of your analysis and optimization efforts. Without them you can't tell whether your website is performing well or not and where you need to improve.

Very often I come across accounts where the goal setup is really a mess. In the example below there are six goals that are incorrectly set up. Only two goals are collecting data.

Wrong goals

Solution

  • Start out with a thorough measurement plan.
  • Make sure that your Google Analytics goals are aligned with your KPIs and business objectives.
  • Create a logical structure for your goal setup.
  • Make sure to tie goal values to your goals. The exception is when you run an ecommerce site with (enhanced) ecommerce implemented; in that case, ecommerce revenue is your go-to value metric!

Well-defined goals allow you to correlate all data in Google Analytics with your most important visitor characteristics (dimensions). For example, you can review and optimize the value of each of your (landing) pages, channels and devices.

3. No Backup View

By default, Google Analytics allows you to set up 100 accounts, 50 properties (per account) and 25 views (per property). Most often I encounter either one of these two setups, which unfortunately are both wrong.

  • Clean, untouched Google Analytics account with just one view (All Web Site Data): lack of view(s) with correctly implemented goals and right set of filters for accurate measurements.
  • Google Analytics account with many incorrectly defined properties and views: lack of a raw (untouched) data view.

Solution

Always set up a raw data view. It's a "rescue" view in case things go wrong. This in addition to a testing view (only includes your traffic) and a master view where the appropriate filters are applied.

Read this article if you want to learn more about setting up different views in Google Analytics.

No matter how experienced you are, you need to have a backup view in place. Quite often, there are many people working on a Google Analytics account with "edit" access to certain views. Ensure that the raw data view is left untouched by EVERYBODY.

4. Not Integrating Google Analytics with Other Products

Google Analytics provides great integrations with a bunch of other tools. And you should use these integrations to your advantage! There are two basic integrations that are a must for every website owner.

AdWords linking

Everybody should set up an AdWords account. You might not spend thousands of dollars on advertising from day one, but you'll still want to use the free keyword research tool.

Linking Google Analytics and Google AdWords is really easy today. By doing so, you'll see a ton of useful AdWords data in Google Analytics. In addition, you can import Google Analytics goals into AdWords and more effectively work with remarketing lists.

Search Console linking

A couple of months ago Google announced a deeper integration between Search Console and Google Analytics. In short, by integrating Search Console and Google Analytics you'll get useful Google organic search data directly available in Google Analytics.

There are a ton of strategies you can apply to derive actionable insights from your search console data.

This is just the start. Do a search on Google and you will find a lot of other integrations you can set up. Integrate other tools that you use with Google Analytics in order to gain incredible insights which you can use to skyrocket your business.

5. Working with Unclean Data

Your analysis can only be as good as the data that feeds it. It doesn't matter whether you run a small lead generation website or a large e-commerce business. You should always clean up your data!

Here are four tips to get more reliable, clean data.

Tip 1: Use Filters

Earlier in this article I wrote about setting up at least three different views in each of your Google Analytics properties:

  1. Raw data (rescue) view.
  2. Testing view with only your traffic included.
  3. Master view with relevant filters applied.

At a minimum I recommend setting up the following filters in your master view:

  • Exclude filter on your own and other internal IP addresses.
  • Lowercase filters on campaign parameters, hostname, request URI and search term.
  • Hostname filter on your domain(s) that you want to gather data from.

Read this extensive filters guide if you want to learn more about Google Analytics filters and how to set them up.
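To see what the lowercase filter buys you, here's a tiny illustrative sketch of its effect on the Request URI dimension (example paths are made up):

```python
def apply_lowercase_filter(request_uris):
    """Mimic GA's lowercase filter on Request URI: without it,
    '/Pricing' and '/pricing' show up as two separate pages."""
    return sorted({uri.lower() for uri in request_uris})

print(apply_lowercase_filter(["/Pricing", "/pricing", "/PRICING"]))
# ['/pricing']
```

Three casing variants collapse into one row, so your content reports no longer split the same page's metrics across duplicates. The same logic applies to campaign parameters, hostname and search term.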

Tip 2: Use Campaign Tagging

By default Google Analytics correctly measures four different traffic types:

  • Direct traffic.
  • Organic traffic.
  • Referrals.
  • CPC (AdWords) traffic - only if you have correctly integrated AdWords with Analytics.

But what if you run affiliate or email campaigns? In this case you need to leverage the campaign tracking feature of Google Analytics.

The URL Builder will help you plan your campaign URLs. In addition, Annie has done a terrific job putting this campaign tracking guide together. I highly recommend checking it out!
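Under the hood, campaign tagging just appends utm_* parameters to your landing page URL. Here's a minimal sketch of what the URL Builder produces (the base URL and values are example data):

```python
from urllib.parse import urlencode

def campaign_url(base, source, medium, campaign):
    """Build a tagged campaign URL with the three required utm_* parameters."""
    params = urlencode({
        "utm_source": source,
        "utm_medium": medium,
        "utm_campaign": campaign,
    })
    return f"{base}?{params}"

print(campaign_url("https://www.example.com/offer",
                   "newsletter", "email", "spring-sale"))
# https://www.example.com/offer?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale
```

GA reads those parameters on the landing page and attributes the session to the tagged source, medium and campaign instead of lumping it into direct or referral traffic.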

Note: your direct traffic and referral numbers are usually inflated if you don't get campaign tracking right. This will drive bad business decisions so make sure to get this part right!

Tip 3: Filter Out Bot / Spam Traffic

You will already rule out a lot of "bot" and "spam" traffic by setting up an "include hostname" filter, which will prevent (not set) hostname traffic from showing up in your account.

In addition you should tick the following box in your Google Analytics view settings:

Google Analytics bots and spiders

Good news is that Google is doing a lot in the background as well to prevent "spam" traffic from appearing in your account.

Tip 4: Exclude Technical Query Parameters

With regards to Google Analytics, you can distinguish the query parameters used on your site between technical and analysis/marketing query parameters.

The first group consists of parameters that don't contain any value in your analysis. For example: www.buildagreatwebsite.com/?sessionid=123456789. Not removing the sessionid parameter will lead to duplicate versions of the same page in Google Analytics. If not handled in a proper way, your content reports might contain dozens of URL versions that should be grouped under the same URL.

The second group consists of analysis query parameters. These parameters should not be filtered out of your data. For example: www.buildagreatwebsite.com/form/?submit=ok. In this case you remove valuable data if you filter out the submit query parameter.

In short, make sure to add the technical query parameters to your Google Analytics view settings:

Query parameter filtering

By excluding query parameters such as session ids or other technical parameters, you will de-duplicate your content reports and make them much more useful and easy to analyze.
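To illustrate the de-duplication, here's a small sketch that strips a hypothetical list of technical parameters while keeping analysis parameters like `submit` (GA does this for you once the parameter names are listed in the view settings):

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

TECHNICAL_PARAMS = {"sessionid"}  # hypothetical exclusion list

def strip_technical_params(url):
    """Drop technical query parameters; keep analysis parameters."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k.lower() not in TECHNICAL_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

print(strip_technical_params("http://www.buildagreatwebsite.com/?sessionid=123456789"))
# http://www.buildagreatwebsite.com/
print(strip_technical_params("http://www.buildagreatwebsite.com/form/?submit=ok"))
# http://www.buildagreatwebsite.com/form/?submit=ok
```

Every session-id variant of the homepage collapses into a single row, while the form page keeps its meaningful `submit=ok` parameter.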

This is it from my side. I hope you have picked up a few new ideas here. We very much appreciate a share if you like the article!


Interactive Dashboards with Data Studio & BigQuery

Interactive Dashboards with Data Studio & BigQuery

The US Federal Elections Commission (FEC) has been publishing Political Campaign Finance data for years, but they haven't made it easy to explore their data. Luckily, Google Data Studio and BigQuery are making it easier than ever to explore large datasets like the ones published by the FEC.

In this article, you will learn how to use Google's latest tools to create your own dashboards, using an example I created to visualize 2016 Election Cycle Donations, exploring public FEC data curated by OpenSecrets.org.

Since there are already a bunch of tutorials for creating charts and filters in Data Studio, we are going to take a small shortcut by making a copy of the original 2016 Election Cycle Donations dashboard, but we are not totally cheating because we have a fair amount of other stuff to do. Here's a quick overview of the game plan:

  1. Create a BigQuery account (3 minutes)
  2. Add the public FEC dataset to your BigQuery account (30 seconds)
  3. Make a copy of the 2016 Election Cycle Donations Dashboard (30 seconds)
  4. Create Custom Dimensions and Metrics (5 minutes)
  5. Fix Broken Dashboard Widgets (3 minutes)
  6. Explore the data!

1. Create a BigQuery account

Skip to step 2 if you already have a BigQuery account.

You can create a BigQuery account by clicking here. This special link will guide you through the process of creating a BigQuery account, and it will automatically add the FEC data you will need in the next step. Once you create your new BigQuery account, you can skip to step 3.

2. Add the public FEC dataset to your BigQuery account

Skip to step 3 if you just created a BigQuery account using the link in step 1.

If you already have a BigQuery account, you can click this link to add the public FEC dataset to your account. Once the fh-bigquery dataset has been added successfully, you will see it in the sidebar navigation menu.

BigQuery sidebar

3. Make a copy of the 2016 Election Cycle Donations Dashboard

Return to the 2016 Election Cycle Donations dashboard and click File > Make a Copy > Create New Data Source

Data Studio new data source

Then click Create a New Data Source and select BigQuery > Shared Projects > enter fh-bigquery under Shared project name > opensecrets > indivis16v2 > your BigQuery project of choice > Connect.

BigQuery data source

Click the Add to report button on the next screen (we will update columns later)

Connect data source

The indivis16v2 BigQuery dataset is now connected, so you can click Create Report on the next screen.

Data source connection

4. Create Custom Dimensions and Metrics

The copied version of the 2016 Election Cycle Donations dashboard will show many broken widgets at first. This is because the original version was using calculated fields that Data Studio wasn't able to copy. We're going to re-create the fields used in the original Dashboard so we can fix the broken widgets. This part looks like a lot of steps, but it is much quicker and easier than it looks. All we are really doing here is adding new columns and then pasting each name and/or formula we need to define those columns. In some cases, we will just rename the columns instead of adding new ones.

Calculated fields

Create new Custom Dimension for Political Affiliation

  • Name: Political Affiliation
  • Formula: CASE WHEN party = 'D' THEN 'Democrat' WHEN party = 'R' THEN 'Republican' WHEN party = 'L' THEN 'Labor Union' WHEN party = 'I' THEN 'Individual' WHEN party = 'C' THEN 'Corporation' WHEN party = '3' THEN 'Other' ELSE 'Not Provided' END

Create new Custom Dimension for Donor Gender

  • Name: Donor Gender
  • Formula: CASE WHEN REGEXP_MATCH(gender, "M|m") THEN "Men" WHEN REGEXP_MATCH(gender, "f|F") THEN "Women" ELSE "Not defined" END
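If it helps to see what these two CASE formulas are actually doing, here is a minimal Python sketch of the same label mappings. The party codes, labels, and regex patterns are taken directly from the formulas above; everything else (function names, the dict) is just illustration.

```python
import re

# Mirror of the "Political Affiliation" CASE formula above.
PARTY_LABELS = {
    "D": "Democrat",
    "R": "Republican",
    "L": "Labor Union",
    "I": "Individual",
    "C": "Corporation",
    "3": "Other",
}

def political_affiliation(party):
    # ELSE branch of the CASE: anything unrecognized is "Not Provided".
    return PARTY_LABELS.get(party, "Not Provided")

# Mirror of the "Donor Gender" REGEXP_MATCH formula above:
# check for M/m first, then f/F, else "Not defined".
def donor_gender(gender):
    if re.search(r"M|m", gender or ""):
        return "Men"
    if re.search(r"f|F", gender or ""):
        return "Women"
    return "Not defined"
```

Like the CASE formula, the gender check is order-dependent: the M/m branch wins if both patterns would match.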

Create new Custom Dimension for Donor Occupation

  • Name: Donor Occupation
  • Formula: UPPER(occupation)

Create new Custom Metric for Total Donations

  • Name: Total Donations
  • Formula: COUNT(fectransid)

Create new Custom Metric for Total Donors

  • Name: Total Donors
  • Formula: COUNT_DISTINCT(contribid)

Create new Custom Metric for Average Donations

This metric uses the custom metrics Total Donations and Total Donors, so those must be created first.

  • Name: Avg. Donations
  • Formula: Total Donations / Total Donors

Create new Custom Metric for Avg. Amount

  • Name: Avg. Amount
  • Formula: amount / Total Donations
  • Create the column and change its format to currency

Create new Custom Metric for Total Donation Recipients

  • Name: Total Donation Recipients
  • Formula: COUNT_DISTINCT(recipid)
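The aggregate metrics above are plain COUNT / COUNT_DISTINCT / SUM ratios over the donation rows. To make the arithmetic concrete, here is a toy Python sketch using the column names from the formulas (fectransid, contribid, recipid, amount); the row values themselves are made up.

```python
# Hypothetical donation rows; field names match the BigQuery columns above.
rows = [
    {"fectransid": "t1", "contribid": "c1", "recipid": "r1", "amount": 50.0},
    {"fectransid": "t2", "contribid": "c1", "recipid": "r2", "amount": 150.0},
    {"fectransid": "t3", "contribid": "c2", "recipid": "r1", "amount": 100.0},
]

total_donations = len(rows)                            # COUNT(fectransid)
total_donors = len({r["contribid"] for r in rows})     # COUNT_DISTINCT(contribid)
total_recipients = len({r["recipid"] for r in rows})   # COUNT_DISTINCT(recipid)
total_amount = sum(r["amount"] for r in rows)          # the "Total Amount" column
avg_donations = total_donations / total_donors         # Total Donations / Total Donors
avg_amount = total_amount / total_donations            # amount / Total Donations
```

With these three sample rows: 3 donations from 2 distinct donors to 2 distinct recipients, so Avg. Donations is 1.5 and Avg. Amount is 100.0.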

Renaming default columns for more clarity

The following steps do not require custom formulas; we are just going to update the names of a few columns.

  • Rename "contrib" to "Donor"
  • Rename "recipients" to "Donation Recipients"
  • Rename "orgname" to "Donor Employer"
  • Rename "state" to "Region" and change format to Geo > Region
  • Rename "city" to "City" and change format to Geo > City
  • Rename "amount" to "Total Amount" and change to currency format

5. Fix Broken Dashboard Widgets

Now that we've re-created the custom columns that were used in the original dashboard, we just need to add them to the broken widgets and we are done. Luckily, each Report Level widget needs only a single update to work across every page, but there are a few Page Level widgets that need to be updated on each page.

Fix Filters (Report Level widgets)

Let's assign the following dimensions to each filter so the filters' names match the layout of the original Dashboard. Use "Total Amount" as the metric for each filter.

Data Studio filters

  • Filter 1: Donation Recipients
  • Filter 2: Political Affiliation
  • Filter 3: Donor Gender
  • Filter 4: Donor Occupation
  • Filter 5: Donor Employer
  • Filter 6: Region

Fix Charts (Report Level widgets)

Update the 3 charts listed below by assigning the following dimensions and metrics to each corresponding widget.

Data Studio charts

Stacked bar chart

  • Add Political Affiliation as first dimension
  • Add Donor Gender as secondary dimension
  • Use "Total Amount" for metric

Time Series chart

  • Add "Total Donations" as first metric
  • Add "Total Donors" as second metric
  • Add "Total Amount" as third metric

Region Map Chart

  • Add "Region" Dimension

Fix Scorecards (Report Level widgets)

Replace the invalid metric for each scorecard with the following metrics.

Data Studio scorecards

  • Total Donation Recipients
  • Total Donors
  • Total Donations
  • Avg. Donations
  • Avg. Amount
  • Total Amount

Fix Table Widgets

Unlike the report level widgets we just fixed, the Table widgets on each page contain a unique set of columns. The dimensions and metrics for each Table are listed below.

Data Studio tables

Table on Page 1: Recipients

  • Dimensions: Donation Recipients, Political Affiliation
  • Metrics: Total Donors, Total Donations, Avg. Donations, Avg. Amount, Total Amount

Table on Page 2: Donors

  • Dimensions: Donor, Donor Occupation, Donor Employer, Political Affiliation, Donation Recipients
  • Metrics: Total Donations, Avg. Amount, Total Amount

Table on Page 3: Political Affiliation

  • Dimensions: Political Affiliation
  • Metrics: Total Donors, Total Donations, Avg. Donations, Avg. Amount, Total Amount

Table on Page 4: Gender

  • Dimensions: Donor Gender
  • Metrics: Total Donors, Total Donations, Avg. Donations, Avg. Amount, Total Amount

Table on Page 5: Occupation

  • Dimensions: Donor Occupation
  • Metrics: Total Donors, Total Donations, Avg. Donations, Avg. Amount, Total Amount

Table on Page 6: Employer

  • Dimensions: Donor Employer, Political Affiliation
  • Metrics: Total Donors, Total Donations, Avg. Donations, Avg. Amount, Total Amount

Table on Page 7: Region

  • Dimensions: Region
  • Metrics: Total Donors, Total Donations, Avg. Donations, Avg. Amount, Total Amount

Table on Page 8: City

  • Dimensions: City
  • Metrics: Total Donors, Total Donations, Avg. Donations, Avg. Amount, Total Amount

6. Explore the data!

That's it! All of the widgets across all of the pages should work now, so you are free to explore or to continue creating new pages and finding new insights.

Data Exploration

When you think about it, we did not have to write a single line of code, we did not have to learn SQL, and we did not have to index a database or allocate computational resources... none of the typical scary stuff that usually comes with large-scale data analysis! Thanks to Data Studio and BigQuery, almost anyone can now create fully interactive dashboards to explore datasets of virtually any size, and it can be done in just minutes.

By the way, you may have noticed a bunch of other public datasets in BigQuery while we were adding the fh-bigquery dataset. Well guess what, you can actually create Data Studio dashboards with any of those datasets too. You can even combine multiple datasets by creating a Data Source from a query that joins multiple tables together.
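The "combine multiple datasets" idea boils down to joining tables on a shared key and then computing metrics that neither table holds on its own. Here is a tiny pure-Python illustration of that join pattern; the datasets, field names, and numbers are all invented for the sketch, not real FEC or census data.

```python
# Two hypothetical datasets that share a "state" key.
donations = [
    {"state": "CA", "total_amount": 500.0},
    {"state": "NY", "total_amount": 300.0},
]
population = [
    {"state": "CA", "pop": 39_000_000},
    {"state": "NY", "pop": 19_000_000},
]

# Inner join on "state", like SQL's: SELECT ... FROM a JOIN b USING (state)
pop_by_state = {p["state"]: p["pop"] for p in population}
joined = [
    {**d, "pop": pop_by_state[d["state"]]}
    for d in donations
    if d["state"] in pop_by_state
]

# A derived metric that needs both tables: donations per capita.
for row in joined:
    row["amount_per_capita"] = row["total_amount"] / row["pop"]
```

In BigQuery you would express the same join in the SQL of your custom query, then point a Data Studio Data Source at that query.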

As some parting inspiration, I recommend checking out this post by Felipe Hoffa, a Google Developer Advocate. His video about visualizing big money in politics with Big Data has some particularly great nuggets you can use to take your dashboards to the next level!
