
A Guide To Running Successful A/B Tests


"I didn't fail the test, I just found 100 ways to do it wrong." Benjamin Franklin

In marketing, A/B testing is a technique for measuring the effect of web page changes on a performance metric, such as click-through rate, sign-ups or purchases, across all of your visitors or a specific segment. Marketers have been using A/B testing for a long time to improve revenue and conversion rates. Nevertheless, the reality is that badly performed tests may produce invalid results, which can lead to false conclusions or to unnecessary website implementations and changes. To avoid false results, we should arm ourselves with a set of ideas and approaches that protect the validity of the process.

When running an A/B test, a valid methodology is crucial if we are to rely on the test results to produce better performance long after the test is over. In other words, we are trying to understand whether the tested changes directly affect visitors' behavior or whether the observed differences occur due to random chance. A/B testing provides a framework that allows us to measure the difference in visitor response between variations and, if a difference is detected, to establish statistical significance and, to some extent, causation.

The basic methodology, called hypothesis testing, is the same one used to prove whether medical treatments actually work or whether hot weather increases ice cream sales. But what makes a successful A/B test? How can we trust the results? The difference between a successful and an unsuccessful A/B test lies in the methodology and the validity of the data.

Before we dive into more advanced techniques, let's get familiar with the basic terms.

A/B Testing Glossary / Statistical References

Statistical Truth

1. Claim

Usually referred to as a hypothesis, a claim is a change-and-effect statement that may or may not be true, (hopefully) based on initial or limited evidence. In A/B testing, the claim is always made about the visitor and his or her reaction to changes. For example: changing the text of the call-to-action button from "Submit" to "Sign up for a FREE trial" will increase sign-up conversion rates.

2. Correlation

A correlation is an association between the impression of a web page variation and the visitor's reaction to it. It does not mean there is a causal connection between the two.

3. Causation

In statistics, the term "causation" stands for a causal relationship between two random variables: one changes because of the other, possibly under different causation models (or explanations). Unfortunately, although we are able to measure correlation in web page A/B testing, we can't actually deliver definitive proof of causation between a variation change and a change in user response. There are several possible reasons for this, for example spurious relationships, where an underlying factor is involved that wasn't measured. Another possible explanation is that there's always room for random chance.

4. Statistical significance (or confidence level)

In A/B tests, statistical confidence, sometimes referred to as "chance to beat original," measures the probability that the difference in measured performance between the variations is real and not due to chance alone. A 95-percent confidence level means that there is only a five percent chance that the numbers are off. But even a 99-percent confidence level doesn't necessarily mean that the results are absolutely reliable. It only means that the error rate is much smaller (one percent in this case), provided that all of the model's assumptions are indeed valid. We should also keep in mind that statistical significance is a function of sample size.
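
As an illustration, here is a minimal sketch in R of where such a confidence figure comes from, using a two-proportion test (the visitor and conversion counts below are hypothetical):

# Hypothetical counts: 5,000 visitors per variation,
# 400 conversions for A (control) and 460 for B (variation)
conversions <- c(400, 460)
visitors <- c(5000, 5000)

# Two-proportion test: the p-value is the probability of observing
# a difference at least this large if both variations performed identically
result <- prop.test(conversions, visitors)
result$p.value  # roughly 0.03: significant at the 95% level, but not at 99%

Loosely speaking, one minus this p-value corresponds to the "confidence" or "chance to beat original" figure that many testing tools report.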

5. Sample size

Sample size is the number of visitors who have been part of your test. Generally speaking, the larger the sample size, the more reliable the results will be (the more statistical power your test has). That being said, choosing the right testing method in terms of number of variations (an A/B test versus a multivariate test with multiple variations) is crucial for obtaining results fast.
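
To get a feel for the numbers involved, here is a small R sketch (the baseline rate and uplift are hypothetical) that estimates the required sample size per variation with base R's power calculation:

# Hypothetical scenario: 5% baseline conversion rate, and we want to
# reliably detect a relative uplift of 20% (i.e. 5% -> 6%)
power.prop.test(p1 = 0.05, p2 = 0.06,
                sig.level = 0.05,  # 95% confidence level
                power = 0.8)       # 80% chance of detecting a real effect
# The reported n is per variation: roughly 8,000 visitors in each group

Note that halving the detectable uplift roughly quadruples the required sample size, which is one more reason to test noticeable changes, as recommended later in this article.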

Hold Your Horses! Make Sure Your Tests Are Valid

One of the fundamental goals of statistical inference is to be able to make general conclusions based on limited data. When performing web page A/B tests, the scientific phrase "statistically significant" sounds so definitive that many marketers and users of A/B testing solutions rely on it to draw conclusions from the observed results of their tests. Sometimes even a tiny effect can make a huge difference, eventually altering the "significance" of the conclusions. Sticking to a strict set of guidelines will deliver more reliable results, and thus more solid conclusions.

10 Golden Tips For Running Successful A/B Tests

1. High confidence level - Try to get as close to a 99% confidence level as possible in order to minimize the probability of drawing the wrong conclusions.

2. Be patient - Don't jump to conclusions too soon, or you'll end up with premature results that can backfire. You know what? Stop peeking at the data as well! Wait until the predefined sample size is reached. Never rush your conversion rate optimization, even if your boss pushes you to get results fast. If you can't wait and need potentially faster results, I advise you to choose tools that can achieve reliable results faster thanks to a mathematical prediction engine or a multi-armed bandit approach. That being said, there's no real magic. Be patient.
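
To see why peeking backfires, consider the following hypothetical R simulation: both variations are identical, yet checking the p-value after every batch of visitors and stopping at the first "significant" reading produces far more than the nominal 5% of false positives:

set.seed(42)
peek_until_significant <- function(n_peeks = 20, batch = 500, p = 0.05) {
  a <- 0; b <- 0; n <- 0
  for (i in 1:n_peeks) {
    a <- a + rbinom(1, batch, p)  # conversions in A, same true rate...
    b <- b + rbinom(1, batch, p)  # ...as in B
    n <- n + batch
    if (prop.test(c(a, b), c(n, n))$p.value < 0.05) return(TRUE)
  }
  FALSE
}
mean(replicate(2000, peek_until_significant()))
# Well above 0.05: repeated peeking inflates the false positive rate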

3. Run continuous or prolonged tests for additional validations - If you don't trust the results and want to rule out any potential errors to the test validity, try running the experiment for a longer period of time. You'll get a larger sample size, which will boost your statistical power.

4. Run an A/A test - Run a test with two identically segmented groups exposed to the same variation. In almost all cases, if one of the variations wins with high statistical confidence, it hints that something may be technically wrong with the test. Most A/B testing platforms use a standard p-value to report statistical confidence, with a threshold of 0.05. This threshold is problematic because, when not enough data has been collected, these tools may reach statistical significance purely by chance - and too soon (often because not all of the model's assumptions are valid).
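
Here is a hypothetical R sketch of that behavior: even when both groups are identical and the full predefined sample is collected before looking, roughly one in twenty A/A tests will still cross the 0.05 threshold by chance alone:

set.seed(7)
aa_test_significant <- function(n = 10000, p = 0.05) {
  a <- rbinom(1, n, p)  # conversions in group A
  b <- rbinom(1, n, p)  # conversions in group B (same true rate)
  prop.test(c(a, b), c(n, n))$p.value < 0.05
}
mean(replicate(5000, aa_test_significant()))
# Approximately 0.05: one in twenty A/A tests "wins" by chance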

5. Get a larger sample size or fewer variations - If you're able to run the test on a larger sample size, you will get higher statistical power, which leads to more accurate and more reliable results. On the other hand, if you're using more than two variations and don't have enough traffic volume for proper, valid results, try reducing the number of variations.

6. Test noticeable changes - Testing minor changes to elements on your site will keep you farther away from any statistically significant conclusion. Even if you're running a high-traffic site, test prominent changes.

7. Don't jump into behavioral causation conclusions - As marketers, we often base decisions on our intuition regarding the psychology of the visitor. We believe we know the reason for the visitor's positive/negative reaction to variations. A/B testing comes in to help us rely a bit less on our instincts and a bit more on concrete evidence. Good marketing instincts are useful for creating testing ideas and content variations. The A/B test will take us the extra mile by enabling us to base our instincts on data.

8. Don't believe everything you read - Although reading case studies and peer testing recommendations is great fun, find out what really works for you. Test for yourself. Remember that published statistics sometimes tend to be over-optimistic and not representative.

9. Keep your expectations real - More often than not, following the end of a successful A/B test, there's an observed reduction in the performance of the winning variation. This phenomenon is called regression toward the mean, and it is not something that can be quantified and corrected in advance. So, to avoid drawing the wrong conclusions, lower your expectations once a test is over.
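
The phenomenon is easy to reproduce. In the hypothetical R sketch below, four variations share the same true conversion rate; the one that happens to "win" the test window will, on average, look worse when re-measured afterwards:

set.seed(1)
true_rate <- 0.05
n <- 5000
drop_after_test <- replicate(2000, {
  test_rates <- rbinom(4, n, true_rate) / n  # 4 variations, test period
  winner <- which.max(test_rates)            # pick the apparent winner
  after <- rbinom(1, n, true_rate) / n       # winner re-measured later
  test_rates[winner] - after                 # observed performance drop
})
mean(drop_after_test)  # positive: the winner regresses toward the mean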

10. Test continuously and never stop thinking and learning - The environment is dynamic, and so should your ideas and thoughts be. Evolve and think forward. Remember that the downside of all traditional A/B testing tools is that, eventually, they direct you to make static changes to your site, which may or may not fit all of your potential users in the long run. In other words, you're not necessarily maximizing your conversion rates by serving only one winning variation to your entire visitor population or by conducting short-term tests. Some tools (like the one we offer at Dynamic Yield) allow you to personalize the delivery of variations based on machine learning predictions (which require a relatively large amount of data) instead of waiting for one variation to win globally. With these tools, instead of catering your website to the lowest common denominator, you can deliver a better user experience by dynamically choosing to display the right variation to the right users.

Closing Thoughts

Let me finish by saying that although I provided some necessary statistical references in this article, I'm not a statistician; I am a marketer and a conversion rate optimizer. I hope the ideas reflected in this article help you improve your A/B testing techniques for better decision making and conversion rates.


Google Analytics Segments Best Practices


Are you struggling with Google Analytics segments every now and then? Would you like to learn how to use and build them instead of just being overwhelmed by "best segments" posts?

If your answer to these questions is yes, I recommend you read this article very carefully! I have encountered many people who don't know what's really behind these segments, which is why I created this list of 8 important things you should know about segments in Google Analytics.

Grab your cup of coffee or tea and carefully read through the following list.

1. There are built-in and custom segments

At the time of writing there are 22 built-in segments in Google Analytics. They help you get started. Here is a quick overview of them:

Google Analytics Segments

This is definitely a great start to put your data in relevant context. However, you can complement your insights by working with custom segments.

Custom Segments

You have the option to segment on:

  • Demographics: age, gender and other demographic factors
  • Technology: browser, screen resolution, mobile device and other technological factors
  • Behavior: # of sessions, transactions per user/visit/hit, all related to particular behavior
  • Date of First Session: cohort analysis
  • Traffic Sources: segment sessions or users on specific campaign parameters
  • Conditions: segment sessions or users on one or more characteristics
  • Sequences: segment sessions or users on sequential conditions

This full range of possibilities empowers you with great flexibility when setting up segments.

2. Be careful when saving segments

It's easy to hit the blue "Save" button after you are happy with your brand new segment. However, you should be careful if you have a lot of accounts connected to your email address.

As a consultant, I like to build client-specific segments (the same applies if your company has multiple websites). If you don't want to mess up your segment structure, you need to head over to "visibility" (on the right side) and select the view(s) in which you'd like the segment to appear.

Saving custom segments

By default, your new segment is saved in all views connected to your email address.

3. Compare up to four different segments at a time

Google Analytics allows you to select up to four segments at a time in the reporting interface. For example, you might want to compare organic traffic to paid traffic, affiliates and email marketing: in one overview you can see how each channel performs compared to the others.

Please note that this feature does not work on your funnel reporting, but there are ways around it... read this post if you would like to learn more about how to segment Google Analytics funnel data.

4. Make sure you share them with your team

A great way to empower everybody involved with a specific online business / website is by sharing your greatest segments. Luckily, it's a breeze to make this happen.

  • Navigate to Admin
  • Under the menu Personal Tools & Assets click on Segments
  • You can copy, share or delete any segment
  • When you share a segment with your team they can configure the shared segment directly by following the link you share with them

As easy as that!

5. Visit the solutions gallery for inspiration

Google Analytics has put together a great gallery where you can access and download shared dashboards, custom reports and segments. If you have never been there, head over to the solutions gallery now; you will find great ideas for segments that fit your business perfectly.

6. Data sampling

Keep in mind that using segments might lead to data sampling in Google Analytics. It's important to verify that the data presented is reliable enough to really act upon.

This post about sampling clearly describes how it works and what you need to know about it. As a general rule, low-traffic websites are not affected by data sampling when segments are applied.

7. Segments are not equal to filters

It is important to know that Google Analytics offers different segmentation options:

  1. Segments
  2. Filters
  3. Custom dimensions and metrics (Universal Analytics)
  4. Custom variables (Google Analytics)

Segments and filters work differently. Filters are applied at the view level. For example, if you add a filter to include only CPC traffic, the view in question will:

  • Limit the data collection to CPC traffic only
  • And this only applies to data collected after you have applied the filter

This is different from segments, which offer a great way of doing ad hoc segmentation: you can apply them to all sessions collected so far.

I recommend using filters for data views/segmentations that you use very often. Segments in Google Analytics are great for all kinds of ad hoc analysis.

Here is a video that summarizes different types of segmentations on Google Analytics:

8. Segments currently don't work together with custom alerts

Please note that segments and custom alerts currently don't work together: newly built segments are not visible in the custom alert setup overview.

Custom Alerts

As a frequent user of custom alerts and segments, I would love to see this one fixed! :-)

What's your experience with segments in Google Analytics? Any great tips to share? If you like the article, we very much appreciate a comment or share!

Bonus: Avinash Kaushik, Digital Marketing Evangelist at Google, wrote a great post about the subject: "Web Analytics Segmentation: Do or Die, There Is No Try!"


A/B Testing For Mobile Apps


This April, Google announced Content Experiments for mobile apps, managed through the Google Tag Manager (GTM) platform. This is great news for any app developer as it allows for faster and more reliable iteration and optimization of app usage in ways that have not been possible in the past.

I previously wrote an introductory post about Google Tag Manager for mobile apps, do check it out if you are looking for an introduction to the product and how it works.

Apps usually have several configuration values hard-coded: they determine how the app behaves, what content is displayed to users and how, various settings, and so on. Many challenges app developers face arise when these configurations need to be changed or adapted in some way. Unlike websites, where we can iterate and change such values relatively quickly, mobile apps are by nature frozen once published. If we want to change anything after that point, it inevitably involves shipping a new binary to the app marketplace and then hoping for the best when it comes to user adoption in the shape of app updates.

This is one of the challenges that Google Tag Manager seeks to address, with the end goal of moving away from a world of constants to dynamism, from static to highly configurable apps.

While GTM has been on the market for a while now, the ability to perform content experiments for mobile apps is a new and revolutionary feature of the product. In line with the agenda for dynamism, this enables app developers to change configuration values at runtime and, for example, try different variations of configurations through A/B and multivariate tests. The appeal here should be obvious: we are no longer limited by traditional development cycles, which can be both lengthy and costly, as soon as we want to try something new.

This opens the door to testing and analysis relating to those important business questions we likely ask ourselves about our apps on a daily basis:

  • How often should x, y, or z be promoted to the user base?
  • What messages are most successful in incentivizing desirable behavior within our app? What is the best wording?
  • Is content variation a or b most effective in driving in-app revenue?
  • What are the best settings to use for a given user segment?

The short answer is that we do not know until we test each one of them and perform careful analysis of the outcome. And once we are able to answer these questions with statistical significance, we can easily tie it all back to the other key metrics we are interested in: engagement, monetization, interactions, etc.

As always with Google Analytics, we have at our disposal the full power of segmentation, based on data captured both out-of-the-box (demographics, technology, behavioral patterns) and through our own custom implementations.

Experimenting With Selling Points

In this post, we will be implementing Google Tag Manager content experiments in a mobile app which has a number of USPs (Unique Selling Points) displayed in an effort to drive monetization, as shown below.

Mobile Experiments

These USPs are actually pulled directly from a value collection macro in Google Tag Manager, where we can easily update them if needed, directly from the web interface (fantastic!):

Value Collection Macro

Assume we wanted to change these USPs for whatever reason. Maybe the marketing department has concluded that shorter USPs are the way to go nowadays, and user surveys are in line with that assumption. Despite such indicators, this is clearly not a decision that should be made lightly and without data to back it up.

USPs are an essential part of the value proposition we present to get prospects to buy our product instead of someone else's; they are part of our unique differentiation. In fact, the smallest change to the phrasing of our USPs can have a critical impact on our conversion rate and hence our entire business. The copy needs to be just right in terms of highlighting benefits, grabbing a prospect's attention, and so on. Enter the data scientist. Instead of simply changing our USPs to a shorter, snappier version, we should perform an experiment to see if the marketing department is on the right track before we make a permanent change. In this case, we will test our original copy against the shorter version below.

  • Fast shipping
  • Free setup
  • No long-term contract

As you can tell, the difference here is in the amount of detail we include. Besides adhering to marketing's wishes, testing something like this will take us further in understanding what messages our users respond best to, and can provide valuable insights for many business analyses to come.

In this example, we assume that we have already implemented GTM in our app following these instructions, and that the values of current USPs are being fetched from our container (at the moment, there is only one version of these USPs, the longer one).

Let's get started!

Step 1: Link the Container to Google Analytics

The first thing we need to do is to link our Google Tag Manager container to a Google Analytics property. Here, we also select in which Views the experiment data should be surfaced. Note that we need to have edit access to both the container and the property to do so.

Link Tag Manager to Analytics

Once we have linked a property to our container, it will be visible under External Account Links in the Google Tag Manager interface.

Tag Manager External Account

Step 2: Create Experiment Macro

Perhaps somewhat surprisingly, the way we proceed in configuring our experiment is to create a new macro in Google Tag Manager. This makes sense, however, when we consider that what we want to test is ultimately different configuration values, and in Google Tag Manager values are always stored in macros.

Tag Manager Macro

Step 3: Configure Values

The next step is to define our USPs in this new macro. It takes JSON-formatted name-value pairs, as demonstrated below. In the original we have our long USPs; in Variation 1, the shorter ones. You can create up to 10 variations here.

Value Collection Macros

Value Collection Macros

Note that before we introduced this new content experiment macro in our container, we simply stored our name-value pairs for the USPs in a standard value collection macro. This macro type only accepts one variation set, and should now be deleted as we instead have a content experiment macro defining the USPs from now on.

Step 4: Advanced Options

For the purpose of this demonstration, we will keep the defaults, which are displayed below. We will expose 100% of our users to the experiment, set the confidence level to 95%, run the experiment for two weeks, and once we have a winning variation we will serve it to all our users going forward.

Advanced Experiment options

Step 5: Select an Objective

We need to pick an objective with our experiment so that the system knows what to test against. This can be connected to our in-app monetization efforts, user behavior, or system behavior such as exceptions (obviously, when selecting exceptions as the objective, the system will classify a lower rate as better).

Experiment objectives

Step 6: Enabling Rules

Similar to how tags have rules, for content experiment macros we need to decide under which conditions it will run (besides how many users will be exposed). We could, for example, decide to only run our experiment on app users of a particular language. In this case, we will simply set the rule to “Always”, including every user in our experiment.

Testing rules

Step 7: Download And Add Default Container

If this is the first time we ship our app with Google Tag Manager in it, we should add a default container to our project under /assets/tagmanager (again, for an introduction of how to implement GTM in apps, please have a look at this article). For now, we assume we were already calling these USP-values from Google Tag Manager, and simply need to publish a new container with our new content experiment macro in it.

Step 8: Publish

Finally, we need to publish this latest version of our container. This will start our USP-experiment and create a new report for it in the Google Analytics interface as shown below.

Publishing a Google Analytics experiment

Start Analysing

And we're up and running! We will be able to monitor the results of our experiment directly in the Google Analytics interface as it progresses.

Analyzing Experiments

In the end, it turned out our shorter versions of USPs outperformed the longer ones, indicating that marketing was in fact correct in its assumption and that we should move over to this copy permanently.

This, of course, leads to a plethora of additional questions that we should now ask ourselves about how we work with persuasion within our apps, and it brings us one step closer to being able to understand and model user behavior. One thing is certain: had we simply assumed that "less is more" and implemented the new copy straight away, we might never have been able to back up the assumption with data. We would never have been able to tell whether the shorter version performed better because of the copy itself or because of externalities we were not aware of, since we would not have tested it against the longer version simultaneously. By performing the experiment, we empowered ourselves to derive the final decision from data and statistics.

Final Thoughts

Content experiments for mobile apps can be extremely valuable to a business. We are no longer dependent on long development cycles as soon as we want to test something new. And when we do want to test, we can do so faster, more accurately, and with more certainty to avoid costly mistakes based on hunches.

What are your thoughts about content experiments for mobile apps? We would love to hear your inputs!


Using Google Tag Manager To Fight Gibberish URLs on Google Analytics

In this article I will show how to easily rewrite multiple URI addresses in Google Analytics with the help of Google Tag Manager (GTM) macros. The idea for this solution came when I was asked by an e-commerce client to analyze their website, an online electronics store with over a thousand products in hundreds of categories.

The challenge with this client's analytics account was that 90% of the category and item names were gibberish, so it was almost impossible to analyze website behavior and content consumption in Google Analytics.

For example, the URL for item A was:
http://www.thegreatshop.com/default.asp?catid=2DE2700B-A29A-48CA-D17-9F58CF9E76FD&itemid=B5F2AA60-7A13-451A-B532-E6589F178A91

Item B didn't look any better:
http://www.thegreatshop.com/default.asp?catid=98878163-8D17-438C-B75A-692B327A28E0&itemid=37B6B7F6-C408-4331-8034-5A48F8EB493B

A real nuisance.

Before I present my solution, I would like to discuss the two alternative options I considered:

  1. Change the URL to a clearer one and apply a 301 redirect from the old URL. A very risky solution: as this was a new client, they were not ready for drastic changes to the site right off the bat, so I eliminated this option.
  2. Add 'search & replace' filters directly into GA. I eliminated this option because adding them manually to thousands of products and hundreds of categories wasn't realistic.

Google Tag Manager Macro Definition & Lookup Tables

Just when I started to regret taking on this job, an idea popped into my head: I would play with the URI in the Analytics reports using the 'Lookup Table' macro in Google Tag Manager! If you are not acquainted with macros, here is an excellent definition by Simo Ahava:

So a macro is, in essence, a placeholder for some value that you might need in your tags. Most common uses for macros would be to:

  • Retrieve a constant string (e.g. your tracking code)
  • Retrieve a variable string (e.g. the href attribute of a clicked element)
  • Retrieve the result of some simple function (e.g. the date and time of the current visit)

Macros are thus used to call for and to retrieve certain values. You can surely see how they facilitate your day-to-day GTM use, can’t you? It’s much easier to just refer to a macro name rather than having to write the value-retrieving function over and over again.

For example, the macro {{URL}} that comes by default with a GTM installation returns the URL of the current site as it appears in the browser. The macro {{element id}} returns the ID of the element that created an Auto Event and so on...

In our case, the 'Lookup Table' macro is actually a function that looks at the value of the macro we feed it and decides what value to return when it is called.

Tag Manager lookup table macro

So here is what we're going to do:

  1. Create a URL macro that will receive the ugly value of the 'catid' parameter
  2. Create a URL macro that will receive the ugly value of 'itemid' parameter
  3. Create a Lookup Table macro that will change all the ugly 'catid' parameters into more meaningful ones
  4. Create a Lookup Table macro that will change all the ugly 'itemid' parameters into more meaningful ones
  5. Create a Universal Analytics tag that will create a nicer virtual pageview instead of the ugly path that we get as a default
  6. Go to the macro we created in step 3 and insert all the URLs from the old and new categories with a very cool trick (more details below)
  7. Go to the macro we created in step 4 and insert all the URLs from the old and new item ids using the same trick
  8. Sit back and watch the data :)

Ready? Let's go!

First, let's create the macros that receive the ugly values of the itemid and catid. For that I created two macros as in the example below:

Creating macros

Now we have to tell GTM that when the macro {{oldCatId}} equals some gibberish value, {{newCatId}} should return a much nicer category name. To do that I've created a new Lookup Table macro and called it newCatId. Now, in theory, we would have to put in a long list of ugly category values and attach to each one a nice URL that we want to show as the Request URI in Google Analytics.

Does this look complicated? Of course it does! So here comes Jon Meck's trick!

Download this table (save a copy to your Google Drive because it's read-only) and put the old category values in the cells on the left (notice you write only the name of the category, without the '&catid=' string). After that, fill in the right column with the name of the category as you would like it to appear in your Google Analytics reports. Then copy the text that has been created in the green cells and go to the screen where you create the macro in Google Tag Manager.

Now open the developer tools in your browser, go to the console and paste the text that you copied from the table right into it. Hit Enter and... BOOM! Your table has been filled in automatically :) Do the same for the itemid (I called this macro "newItemId") and thank Jon for his cool solution :)

Now the final step: sending the data to Google Analytics. Pay attention to the details so as not to disrupt your data! To send the data we must create a Universal Analytics tag in GTM (or a classic one if you haven't upgraded yet) and give it the following firing rule:

Google Tag Manager firing rule

The reason we don't want the tag to fire on all pages is that it sends a virtual pageview to Google Analytics with our new catid and itemid. Not all pages have these parameters, so if we fired this tag on all pages our homepage would appear as "/?default.asp&catid=&itemid=", and that's definitely not what we want. Don't forget to add an additional Analytics tag that runs on all pages except those containing 'catid' and 'itemid'.

P.S. According to Simo Ahava's comment below, it is possible to use only the regular pageview tag (the one that fires on all pages) in this solution. All you need is an intermediate Custom JavaScript macro, which you then refer to in the Document Path field. The macro looks like this:

function() {
  // Fire the rewrite only when the URL actually contains
  // the catid and itemid parameters
  if ({{oldCatId}} && {{oldItemId}}) {
    // Build the rewritten path from the Lookup Table macros
    return "/?default.asp&catid={{newCatId}}&itemid={{newItemId}}";
  }
  // Otherwise return undefined so no Document Path is sent
  return;
}

So this first checks if catId and itemId are present in the URL. If they are, it returns the string you want in the Document Path field, together with the macros from the Lookup Table. If these parameters are NOT in the URL, it returns undefined, and Document Path doesn't get sent!

Now let's put the virtual pageview in the Document Path field (this is where the magic happens!) that looks something like this: /?default.asp&catid={{newCatId}}&itemid={{newItemId}}


Now this address:
/default.asp?catid=2DE2700B-A29A-48CA-D17-9F58CF9E76FD&itemid=B5F2AA60-7A13-451A-B532-E6589F178A91

will look like this:
/?default.asp&catid=NicerCategoryName&itemid=NicerItemName

When the tag fires, it calls the newItemId and newCatId macros (Lookup Table functions, remember?), which return the matching values according to what they receive from the oldItemId and oldCatId macros.

Summary

In this post I attempted to explain how to rewrite a large number of URI addresses in Google Analytics in the easiest possible way. The hardest part of this method is entering all the gibberish URLs of the categories/items into the spreadsheet and writing a nicer name for each one. There is no solution for that other than hiring cheap labor :-)

I hope this helped you and... happy analyzing!

Related Content

  1. Google Tag Manager (GTM): What Do I Need To Know?
  2. Google Tag Manager: Coding & Naming Conventions
  3. A Guide to Google Tag Manager for Mobile Apps

In-Page Analytics Chrome Extension (by Google)


How often do you browse your website and think to yourself: "I wonder how often people visit this page and how successful it is?" If you care about your website content, you probably ask yourself these kinds of questions very often...

That's why the Google Analytics team recently released a Chrome Extension that allows you to get detailed information about each page of your website while you browse it. Below I will go through some of the extension's features and how to use it to get a better idea of what is going on in your website.

In order to use the extension you will need some kind of Google Analytics permission for the website you are analyzing, the Chrome browser and the extension itself (download here). Once you have those three, you can click on the Google Analytics icon in your browser while browsing your website (the icon is usually found in the top right corner of the page). BAM!

Below is the extension's interface map with all its functionalities followed by an explanation of each.

Google Analytics In-Page Analytics

1. Choose Your Google Analytics Account

First things first... choose the right property/view. In this drop-down you will have a list of the properties you have access to where the default URL matches the URL you are currently browsing.

2. Analyze Segments

As Avinash Kaushik once said: "Segment or die" - and this was no exaggeration! Businesses that want to survive online must segment like crazy in order to understand their customers. In this drop-down (#2 on the screenshot above) you will find all the segments you have created in Google Analytics. Here, you can look at how each page is performing for different segments.

3. Compare Time Ranges

When analyzing data, looking at trends is an important technique; you should never analyze a single data point. And while watching a trend over time is usually a great thing to do, sometimes you want to compare summer to summer (for example). Using this feature you can do just that.

4. "View in Google Analytics"

There is only so much you can do with an extension a few pixels high. Sometimes deeper analyses require more advanced tools available only in the full application. This link is extremely helpful in that it takes you directly to the data you are looking at, making the transition between the extension and the full Google Analytics interface very smooth.

5. Minimize The Extension

As you will notice, the extension takes up quite a lot of real estate, so if you are not in the mood for analysis you might choose to minimize it. You will still have a small box telling you how many visitors are on the website right now. I never get tired of looking at it!

6. Visualization Types

This option lets you choose whether you want to see bubbles, colors or both on the page you are looking at. This will help you visualize how often and where your customers click on specific page links.

7. Send Feedback

Self explanatory :-)

Closing Thoughts

Hopefully this article helped you understand the new Google Analytics Chrome extension, and hopefully it will help you understand your customers in a better and faster way.


Visualizing Google Analytics Data With R [Tutorial]


In the last few weeks I have been quite immersed in data visualization, trying to understand how it can be used to turn data into insights. As part of my immersion, I have played with Fusion Tables and Google Analytics, along with other ideas that will come to light in the future... As I wrote in the Fusion Tables article, I think everyone secretly wishes to do crazy visualizations with Google Analytics data sometimes, both because it can be very insightful and because it is just incredibly fun :-)

And here I am again, with another custom visualization! But this time I decided to use the R programming language, which is considered to be one of the best options when it comes to statistical data visualization.

As I looked deeper into R, I tried to understand what kind of visualizations would complement Google Analytics (GA), i.e. what we can get out of R that we can't currently get out of GA. My first idea was to create a visualization that would allow me to look at my top 5 US states by number of visits (or countries if you wish) and see how they are performing side by side. In addition, I wanted to see how Christmas and a TV campaign affected behavior across US states. While this is possible to understand using Google Analytics, I believe it would not be possible to visualize it in such a way.

Once I found this interesting use case, I decided to take my artistic capabilities out of the rusty box and sketch the output I was looking for... and here is what I got.

Data Visualization sketch

With this objective in mind, I rolled up my sleeves and started working... Below is a step-by-step guide on how to build a very similar visualization using your own Google Analytics data. If you know your way around R, you can simply download this commented txt file.

Important: please note that while I try to describe the process in as much detail as possible, an introduction to R is highly recommended. If you have some time to invest, try the Computing for Data Analysis Coursera course, or just watch the YouTube playlist Intro to R. I am also providing a list of helpful books at the end of the article.

Installing R, the Google Analytics package and others

If you are completely new to R, you will first need to download R and follow the instructions to install it. After that, I recommend you also install RStudio, a great tool for writing and running R code.

Now download the Google Analytics package into your R workspace (below I am using version 1.4). If you don't know where your workspace is, just type the line below into your console.

getwd()

Enter the following lines into R to install and load the respective packages; they are necessary for this visualization.

install.packages(c("RCurl", "rjson", "RGoogleAnalytics", "ggplot2", "plyr", "gridExtra", "reshape"))
require("RCurl")
require("rjson")
require("ggplot2")
require("plyr")
require("gridExtra")
require("reshape")
require("RGoogleAnalytics")
require("grid")  # ships with R; provides unit() and textGrob(), used below

Getting the data and preparing it for visualization

Step 1. Authorize your account and paste the access token - you will be asked to paste it into the console after you run the second line below.

query <- QueryBuilder()
access_token <- query$authorize()

Step 2. Initialize the configuration object - execute one line at a time.

conf <- Configuration()

ga.account <- conf$GetAccounts()
ga.account

# If you have many accounts, you might want to add ga.account$id[index] (replacing index with the account's index) inside the ( ) below to list only the web properties inside a specific account.

ga.webProperty <- conf$GetWebProperty(ga.account$id[9])
ga.webProperty

Step 3. Check the ga.account and ga.webProperty lists above and populate the numbers inside [ ] below (i.e., substitute 9 and 287) with the account and web property index you want (the index is the first number in each line of the R console). Then, get the webProfile index from the list below and use it to populate the first line of step 5.

ga.webProfile <- conf$GetWebProfile(ga.account$id[9],ga.webProperty$id[287])
ga.webProfile

Step 4. Create a new Google Analytics API object.

ga <- RGoogleAnalytics()

Step 5. Setting up the input parameters - here you should think carefully about your analysis time range, the dimensions (note that in order to plot a time series as a line chart you must add the "ga:date" dimension), metrics, filters, segments, how the data is sorted and the number of results.

profile <- ga.webProfile$id[1]
startdate <- "2013-12-08"
enddate <- "2014-02-15"
dimension <- "ga:date,ga:region"
metric <- "ga:visits, ga:avgTimeOnSite, ga:transactions"
filter <- "ga:country==United States"
sort <- "ga:date"
maxresults <- 10000

Step 6. Build the query string and set the profile by its index value. (The parameter values are written out literally here; you could equally pass the variables defined in step 5.)

query$Init(start.date = "2013-12-08",
           end.date = "2014-02-15",
           dimensions = "ga:date, ga:region",
           metrics = "ga:visits, ga:avgTimeOnSite, ga:transactions",
           sort = "ga:date, -ga:visits",
           filters="ga:country==United States",
           max.results = 10000,
           table.id = paste("ga:",ga.webProfile$id[1],sep="",collapse=","),
           access_token=access_token)

Step 7. Make a request to get the data from the API.

ga.data <- ga$GetReportData(query)

Step 8. Check your data - head() will return the first few lines of the table.

head(ga.data)

Step 9. Clean the data - removing all (not set) rows.

ga.clean <- ga.data[!ga.data$region == "(not set)", ]

Step 10. Choose your data - get the data for the specific states (or countries) that you want to analyze. Notice that I am using only the top 5 states, as I think more than that would be a bit too much to visualize, but it is up to you.

sum <- ddply(ga.clean,.(region),summarize,sum=sum(visits))
top5 <- sum[order(sum$sum,decreasing=TRUE),][1:5,]
top5

Step 11. Build the final table containing only the states you want.

d <- ga.clean[ga.clean$region %in% c("California", "Texas", "New York", "Florida", "Illinois"),]

Building the visualization: legends and line charts

Step 12. Build the special campaign bars and legend (in this case Christmas and Campaign)

g_legend<-function(a.gplot){
  tmp <- ggplot_gtable(ggplot_build(a.gplot))
  leg <- which(sapply(tmp$grobs, function(x) x$name) == "guide-box")
  legend <- tmp$grobs[[leg]]
  return(legend)}

rect_campaign <- data.frame (
  xmin=strptime('2014-01-25',"%Y-%m-%d"),
  xmax=strptime('2014-01-30', "%Y-%m-%d"),
  ymin=-Inf, ymax=Inf)

rect_xmas <- data.frame (
  xmin=strptime('2013-12-25',"%Y-%m-%d"),
  xmax=strptime('2013-12-26', "%Y-%m-%d"),
  ymin=-Inf, ymax=Inf)

fill_cols <- c("Christmas"="red",
               "Campaign"="gray20")

line_cols <- c("avgTimeOnSite" = "#781002",
               "visits" = "#023378",
               "transactions" = "#02780A")

Step 13. Build the chart legend and axis.

get_legend <- function(data) {
  d_m <- melt(data,id=c("region", "date_f"))
  p <- ggplot() +
    geom_smooth(data = d_m, aes(x=date_f, y=value,group=variable,color=variable),se=F) +
    geom_rect(data = rect_campaign,
              aes(xmin=xmin,
                  xmax=xmax,
                  ymin=ymin,
                  ymax=ymax,
                  fill="Campaign"), alpha=0.5) +
    geom_rect(data = rect_xmas,
              aes(xmin=xmin,
                  xmax=xmax,
                  ymin=ymin,
                  ymax=ymax,
                  fill="Christmas"), alpha=0.5) +
    theme_bw() +
    theme(axis.title.y = element_blank(),
          axis.title.x = element_blank(),
          legend.key = element_blank(),
          legend.key.height = unit(1, "lines"),
          legend.key.width = unit(2, "lines"),
          panel.margin = unit(0.5, "lines")) +
    scale_fill_manual(name = "", values=fill_cols)  +
    scale_color_manual(name = "",
                       values=line_cols,
                       labels=c("Number of visits", "Average time on site","Transactions"))
  legend <- g_legend(p)
  return(legend)
}

Step 14. Build the charts!

years <- substr(d$date, 1, 4)
months <- substr(d$date, 5, 6)
days <- substr(d$date, 7, 8)
d$date_f <- strptime(paste(years, months, days, sep="-"), "%Y-%m-%d")
d$date <- NULL
d$X <- NULL

l <- get_legend(d)

p1 <- ggplot(d, aes(x=date_f, y=visits)) +
  geom_line(colour="#023378") +
  ggtitle("Number of visits") +
  geom_rect(data = rect_campaign,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="grey20",
            alpha=0.5,
            inherit.aes = FALSE) +
  geom_rect(data = rect_xmas,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="red",
            alpha=0.5,
            inherit.aes = FALSE) +
  facet_grid (region ~ .) +
  theme_bw() +
  theme(axis.title.y = element_blank(),
        axis.title.x = element_blank(),
        panel.margin = unit(0.5, "lines"))


p2 <- ggplot(d, aes(x=date_f, y=avgTimeOnSite)) +
  geom_line(colour="#781002") +
  geom_rect(data = rect_campaign,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="grey20",
            alpha=0.5,
            inherit.aes = FALSE) +
  geom_rect(data = rect_xmas,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="red",
            alpha=0.5,
            inherit.aes = FALSE) +
  facet_grid (region ~ .) +
  ggtitle("Average time on site") +
  coord_cartesian(ylim = c(0, 250)) +
  theme_bw() +
  theme(axis.title.y = element_blank(),
        axis.title.x = element_blank(),
        panel.margin = unit(0.5, "lines"))

p3 <- ggplot(d, aes(x=date_f, y=transactions)) +
  geom_line(colour="#02780A") +
  facet_grid (region ~ .) +
  geom_rect(data = rect_campaign,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="grey20",
            alpha=0.5,
            inherit.aes = FALSE) +
  geom_rect(data = rect_xmas,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="red",
            alpha=0.5,
            inherit.aes = FALSE) +
  ggtitle("Number of transactions") +
  theme_bw() +
  theme(axis.title.y = element_blank(),
        axis.title.x = element_blank(),
        panel.margin = unit(0.5, "lines"))

grid.arrange(arrangeGrob(p1,p2,p3,ncol=3, main=textGrob("US States: Website Interaction & Commerce", vjust=1.5)),l,
             ncol=2,
             widths=c(9, 2))

Phew! Here is the chart you should get!

The chart is not exactly what I initially sketched, but having all metrics in one chart was a bit problematic, as the scales were too different and we would barely see the transactions chart. But I like it this way :-)

Google Analytics R Visualization

If you already use R to analyze and visualize Google Analytics data, send us an email, we would love to publish other examples.

Books to learn R

  1. Learning R
  2. R Graphics Cookbook
  3. ggplot2: Elegant Graphics for Data Analysis
  4. Discovering Statistics Using R

Google Analytics Demographic Segmentation Techniques


We all know that Google Analytics is a powerful tool for diagnosing and understanding what happened on our website and by how much; that's why websites like Online Behavior exist. What may be less clear is how to use the data we collect to find more customers.

For example, our reports in Google Analytics may tell us that Facebook is driving tons of conversions and revenue, but it is not easy to attribute that revenue back to an influential poster within their walled garden. Aside from reflecting on our own Facebook advertising efforts tagged with UTM variables, it's difficult to use past traffic performance to predict future success.

The same goes for (not provided) organic keywords, many forms of social media and even remarketing. We can take credit for good things when they happen, but it's not always apparent how we can find more of these good things in the future.

We could try to boil the ocean and advertise to all users of a social network in an attempt to replicate our success, but that would be the equivalent of flushing our budgets down the toilet. Since we aren't working in the traditional media industry, I don't think we could get away with that level of inefficiency.

How do we go about finding more customers if we can't recreate the blueprint for our past success?

Segmentation

By my unofficial count, there are approximately a million ways to segment your users within Google Analytics. One of my favorite approaches for predicting future success is the demographics reports, because they allow us to look at the age range and gender of our customers with a great level of accuracy.

We can use this information to develop user personas and backstories, and to target by age and gender on advertising platforms. We can use it to come closer to our goal of achieving a 1:1 audience with potential customers. Stacking demographics on top of our existing customer data allows us to turn the needle in a haystack into a haystack full of needles.

Where to get started? Here are some practical examples of how we can use the demographics reports in GA to find more of our best customers.

Demographics to improve organic search conversions

Let's start with a report showing conversion revenue and conversion rates for the top 5 traffic sources of an e-commerce store. As you can see, the average e-commerce conversion rate of 1.61% is misleading, because no traffic source is even close to that conversion rate. Everything is either much higher or much lower because, as I like to say, averages are lies.

Conversion Rate by Channel

By looking at conversion rates by channel grouping instead of the overall conversion rate for the site, we can draw many conclusions from this data. For example, we can conclude that Organic Search is under-performing compared to other traffic sources and that email marketing is a money-making machine. But how do we improve something as vast as the organic search conversion rate?
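
Before digging deeper, here is a quick R sketch of why per-channel rates beat the site-wide average (the numbers are made up for illustration, not taken from the report above):

# Hypothetical sessions and transactions per channel
channels <- data.frame(
  channel = c("Organic", "Paid", "Email", "Referral", "Direct"),
  sessions = c(50000, 20000, 8000, 6000, 16000),
  transactions = c(300, 500, 560, 50, 200)
)
channels$conv_rate <- channels$transactions / channels$sessions

sum(channels$transactions) / sum(channels$sessions)  # 1.61% overall...
channels[order(-channels$conv_rate), ]  # ...yet no channel sits near it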

Start by digging deeper to see if a more clear answer emerges. Let's look at the revenue spread by age. Are there certain age groups where organic search is performing better?

Organic Conversion Rate by Age

Note: since demographics data is not available for all users, the numbers below do not perfectly match those above. In this case, partial data does not change my approach to analysis. Also, I am using a custom report in order to get everything into one screenshot.

When looking at the age of our organic searchers, are there segments of the population who jump out as being above average converters? Absolutely.

In this case it is quite clear that visitors to the site who are 45+ years old tend to purchase more often than their younger counterparts. Keep this in mind as you begin the optimization process.

What about gender? Do females and males see a difference in conversion rate and revenue?

Organic Conversion Rate by Gender

Once again, the answer is yes. While organic visitors are 58% male, 72% of transactions and 67% of organic search revenue come from this group. This segment is clearly doing much better than average and likely deserves a greater share of attention.

Putting this into action

Now that we know our best organic search customer is a 45+ male, how do we act on this information? I would start by asking the following questions:

  • How closely do our demographic conversion numbers match our ideal customer profile?
  • Are our pages already appealing to the 45+ year old male?
    • If yes, does that explain the high conversion rate?
    • If no, should we adjust content to appeal even more?
  • What type of imagery are we using on our pages to make users feel welcome?
    • Would a 25-34 year old female find these images appealing?
  • Is there an opportunity to appeal to our poorly performing segments on our top performing organic landing pages?
  • Can we see enough revenue opportunity in Organic Search to make any necessary changes to our website?

By answering these questions honestly, we are able to determine where to make improvements to our programs. One improvement opportunity is appealing even more to the users that bring in the most revenue. Another opportunity is to try and bring up conversion rates for those who have not performed well traditionally. The best plan of action is likely to do a little of both.

Google Analytics is not telling us the answer to our problems outright, but mixing in demographics data is giving us the information we need in order to create a logical backstory for our visitors. From there we must take action to see improvement.

Demographics for email engagement

From our worst-performing traffic channel above to our best: how can we improve upon an already successful email program? Let's segment by demographics to see if there are any winners here, starting with age.

Email by Age Range

Look familiar? It should - the same customer age ranges that performed well in organic search are performing well in email. But what about gender?

Email Performance by Gender

Once again, it is clear that the top performing customer segment is a 45+ male.

How do we take action?

I thought you would never ask! While often mystifying to search marketers, email marketing is the ultimate delivery vehicle for segmented messages. Assuming that you have collected the birthday and gender of your customers, you should know exactly who is in your email database.

Using the demographic performance data above, it appears that the content within the emails being sent out appeals primarily to 45+ males. This is the largest segment of the email population, representing around 30-40% of the email database. While the emails may be working for this population segment, it means that over half of the database is receiving the wrong message!

Send them a different email

Sending a slightly different email to half of your database should not be too difficult. You can take your standard email and change the headline and imagery to appeal to a different segment. Before long you can start to measure the impact this has on conversion rates and revenue for your previously neglected customers.

By splitting your email list into two or more customer groups, you can see huge increases in your overall email marketing revenue.

Demographics for targeted social media ad buys

Back to the topic of traffic and revenue coming to your website through social media. I opened this article by stating that you can use Google Analytics to understand the traffic driven to your site through Facebook, but supplemental information about these users is not always easy to come by. This is because Facebook referrals don't provide much useful information to our web analytics tools. They simply let us know that a visit happened.

Facebook Traffic

But what about when we segment by age and gender? We start to see specialized areas for targeting users by age, gender and even interest categories.

What can we do with this information? Advertise to similar users! Social ad networks allow for a full range of targeting by age, gender and interest categories. We can use our past successes to buy ads on any number of social networks that may be of interest to our users. Instead of boiling the ocean to find more customers, we are making an educated guess as to who our next wave of customers will be.

Where else can we use demographics?

Everywhere. Demographics can come in handy in just about any form of marketing you can imagine. Here are three additional ideas.

1. Demographics for internal promotions

For companies engaged in email marketing and CRM, you can use the demographics of your logged-in users to show them targeted internal promotions. If there are certain products or landing pages that appeal better to their population segment – use dynamic internal promotions to get them there faster!

2. Demographics for paid search improvements

Understanding the demographics of your paid search visitors can help you make landing page improvements, adjust your keyword list and reduce spend to inefficient ad groups and keywords.

3. Demographics for remarketing

The categories you see in the GA demographics reports align perfectly with those in Google Remarketing tools because... surprise! They are both Google products. Use remarketing on only those visitors who have the most conversion potential.

Closing Thoughts

The possibilities are limitless when it comes to demographics, because age and gender can be used to personify our ideal customers. Once we start to imagine our potential customers as real people instead of numbers, we can provide them with a better buying experience.

Now it's your turn. Who are you going to target next based on demographics? I would love to hear your ideas in the comments section.


Dashboard for AdSense Performance Analysis

AdSense Performance Dashboard

Not long ago I published a post on the Google Analytics blog about the power of using AdSense and Google Analytics together. In that post I went over the integration between both tools, described the reports available after linking your accounts, and also proposed techniques to optimize AdSense revenue using Google Analytics.

In this post I offer a dashboard that can be used to measure your most profitable channels, pages and demographics when it comes to AdSense revenue. You can add the dashboard to your Google Analytics account by following this link (make sure to be signed in to your account).

Below is a screenshot of the dashboard. As you can see by the different colors, each column has a theme; the first column shows overall performance metrics over time (widgets 1-4), the second focuses on demographics (widgets 5-7), and the third shows important information on behavior and acquisition (widgets 8-10). I discuss the widgets in each of the three themes.

AdSense performance dashboard

Overall Performance Trends

1. AdSense Revenue vs. Total Sessions - shows the overall performance of the website. If you see diverging trends on the lines, something worth checking is happening - drill down into it.

2. Ads Clicked vs. Ads Viewed - shows a trend of the absolute number of ads people are viewing on the website and how many of them are being clicked.

3. AdSense CTR - this widget summarizes the one above: the Click-Through Rate (CTR) is the percentage of page impressions that resulted in a click on an ad. You definitely want to see an upward trend.

4. AdSense eCPM - the AdSense eCPM is your estimated earnings per thousand page impressions. In other words, it is your AdSense revenue per 1000 page impressions, a great performance metric.
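
To make these two rate metrics concrete, here is a minimal sketch of how they are derived; the sample numbers are entirely hypothetical:

// Hypothetical sample numbers for one day of AdSense data
var adsClicked = 180;         // clicks on ads
var revenue = 95.40;          // AdSense revenue
var pageImpressions = 30000;  // page impressions

// CTR: percentage of page impressions that resulted in an ad click
var ctr = (adsClicked / pageImpressions) * 100;   // 0.6 (%)

// eCPM: estimated earnings per 1000 page impressions
var ecpm = (revenue / pageImpressions) * 1000;    // 3.18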

Demographic Segments

Please note that the widgets below depend on having Demographics enabled for your account, learn more about it on this Help Center article.

5. AdSense Revenue by Country [Male vs. Female] - this stacked bar chart shows the AdSense revenue per country and each bar is divided between Males (blue) and Females (green). As we can see in the example above, Australia and Germany are heavily biased towards men, so a good tactic might be to find content that is particularly appealing to women and promote it on the homepage of those countries.

6. AdSense Revenue by Age [Male vs. Female] - this stacked bar chart shows AdSense revenue per age group and each bar is divided between Males (blue) and Females (green). In the example above we can see that very old and very young visitors are heavily biased towards men, but all other age groups are biased towards females, especially 55-64. In the same spirit as above, it might be interesting to run a content analysis and adjust content strategy based on that.

7. AdSense Revenue by Affinity Category - this bar chart shows the AdSense revenue per affinity category. This information might be interesting to understand which groups are the most valuable in terms of revenue and might help drive the content strategy for the website.

Behavior & Acquisition

8. Revenue and Ads Clicked by Device - as we all know, we live on a mobile planet, so it is important to check whether your ads are being clicked and generating revenue across all devices at a similar rate.

9. AdSense Revenue by Page - this table is a great indicator of which content is performing well and how much time you should invest in each topic.

10. AdSense Revenue by Channel - this table shows which acquisition channel is bringing the most profitable visitors.

Closing Thoughts

If you find this dashboard useful, you can download it from this link (make sure to be signed in to your account). If not, you can find dozens of other great dashboards in our Solution Gallery.

You can also take a look at the Google Analytics dashboard guide, which discusses dashboard best practices as well as the features available when creating and formatting dashboards.


Custom Definitions in Universal Analytics: A Case Study

Custom Definitions in Universal Analytics

In this post, I will cover some of the benefits of using Custom Definitions in Universal Analytics, as well as have a look at how they are leveraged in practice by startup The Beta Family, a crowdsourcing platform for beta testing apps. In short, Custom Definitions are a way of sending custom meta data from your website or application to Google Analytics. I will not go into more details about the foundations of Custom Definitions here, except to mention that they are really easy to implement once you get the hang of it. You can read more about them at the help center available here.

Ever got stuck contemplating which Google Analytics methods to use in your implementation? Or how to best present them in a report so that your colleagues will understand the data? An inherent limitation of many analytics tools is, unfortunately, a layer of complexity that is the jargon of the tool itself. The usage of different hit types, how data is surfaced in reports, and all the particular quirks that only the most eager Analytics fanatics learn over time can really stop you in your tracks when communicating data clearly.

The availability of Custom Definitions in Universal Analytics is a big step in the right direction for this particular issue. They enable highly customized reports that will make sense to people on a larger scale. It may sound trivial, but the implications of being able to name your own metrics and dimensions can be huge. It means that you remove an obstacle that limits the effectiveness of data in an organization simply because people don't fully understand it.

For example, instead of knowing what an event or a unique pageview represents in Google Analytics, and how they should be interpreted, what if you could simply call things what they are as you create reports?

Custom Definitions need to be attached to Google Analytics hits (such as Events or Pageviews) when implemented, but by simply including them in these hits you can actually avoid using the actual Google Analytics terminology in reports. The implementation, and the understanding of the methodology, should be the problem of you, the specialist, not of your less savvy colleagues.

With the main goal of creating comprehensible Google Analytics reports, for potential investors as well as for internal use, we decided to rely heavily on Custom Definitions throughout the implementation on The Beta Family.

Case Study: The Beta Family

The Beta Family is a crowdsourcing platform for beta testing and finding testers for iOS and Android applications. Developers can test apps on real people and get an honest opinion on the user experience. Testers can try new apps, and get rewarded if they write a good test report.

The website is a typical showcase for when the ability to slice and dice customized data will be beneficial. Each test, developer, and tester can be categorized in a variety of different ways, and attempting to capture all of these dimensions solely with standard Google Analytics Events and Pageviews would get messy very soon. Perhaps above all, potential investors would likely have a hard time understanding precisely what they are looking at unless presented with simple, straightforward reports that really show how the business is performing in relation to its main KPIs.

It is very important for us, and likely for all SaaS product owners, to get the full picture of the traffic, funnels and conversions. We are using Google Analytics to get a deeper knowledge of how our product is used and what we need to change to make the product better and make more money. The data is also used to motivate our team in making a better product and used in discussions with potential investors.

- Axel Nordenström, CEO & Founder, The Beta Family

Some of the Custom Definitions implemented are demonstrated in this post. Each one serves its purpose: introducing granular segmentation power, satisfying stakeholder requirements in terms of reporting, and allowing us to create highly customized reports, as exemplified in the dashboard below.

Custom Dashboard

Sample Custom Definitions

  • Plan Type (Dimension): When running tests, a developer can choose between the Free option, a Pay-As-You-Go plan, or a Monthly Subscription.
  • Test ID (Dimension): Each test is given an ID, and obviously we want to pass this to GA to be able to drill down and connect the dots between individual tests and the abundance of GA data available to us for deeper analysis.
  • Timestamps (Dimensions): We leverage several types here, such as the user Registration date or the Test creation date, to enable us to perform various cohort analyses without advanced querying.
  • Test created (Metric): This allows us to measure the number of tests created, using a Custom Metric incremented on each occurrence (see the sketch after this list).
  • Test started (Metric): This allows us to measure the number of tests started.
  • Apply to test (Metric): This allows us to measure the number of applications to each test.
  • Accept applicant (Metric): This allows us to measure the number of accepted applicants to each test.
  • Invite to test (Metric): This allows us to measure the number of invites to each test.
  • Invite accepted (Metric): This allows us to measure the number of accepted invites to each test.
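
To illustrate how such definitions travel with ordinary hits, here is a minimal analytics.js sketch; the index numbers and values are hypothetical and must match whatever is configured under the property's Custom Definitions in the Admin section:

// Assume dimension1 = Plan Type, dimension2 = Test ID and
// metric1 = Test created, as configured in the GA Admin interface
ga('send', 'event', 'Tests', 'create', {
  'dimension1': 'Pay-As-You-Go', // Plan Type chosen for this test
  'dimension2': 'test-4711',     // our internal Test ID
  'metric1': 1                   // increment "Test created" by one
});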

Sample Reports

On The Beta Family, users can apply to participate in tests, or developers can invite users. In both cases, the other party needs to accept the invitation or application.

We wanted to surface this for each plan type to be able to carefully monitor the trend for these actions, incrementing Custom Metrics for test creation, invite, accepted invite, application, and accepted application. In combination with setting a Custom Dimension containing the plan type for each test, we are able to, for example, create a report like the one below to monitor the activity around applications for each plan type:

Custom Report

Or, we can surface the number of invites and accepted invites for each test. Since we are capturing an ID for each test, we can also drill down to a test-level for each plan type, as such:

Custom ID report

We can tie our custom KPIs, captured through Custom Definitions, to an essential metric such as advertising spend to see where we get the most bang for our buck:

Custom channels report

Or see how our test applicants are spread out geographically:

GEO location custom report

Drilling down to our tests with a Test Details dimension for each metric, we can segment by demographics, the reward amount, etc.

Demographics custom report

With all these dimensions and metrics available to us, we are also able to set up customized advanced segments to slice and dice our user base further. How about users who have applied to participate in more than five tests, yet have never submitted a report?

Smart Segment

How about our most ambitious developers: how are users who have created more than ten tests since registering in May this year behaving over time?

Multi-visit segment

As you can see, you will be able to create an abundance of combinations to answer whatever business questions you might have, as long as you have the data presented in a way that is easy to understand and, by extension, actionable.

Summary

The above examples exhibit some of the custom data now available to us when working in Google Analytics, reports that would have been much more time consuming to get right without Custom Definitions at hand. And, more importantly, these custom reports are more straightforward than standard reports in that they mirror what is actually going on on the website, using the same terminology.

Custom Definitions enable us to create highly customized reports that will make sense even to the non-Analytics-savvy. For The Beta Family, they allow for the creation of almost any report that potential investors could ask for, a very useful resource for a startup looking for financing. You do not need to be an Analytics wizard to fully grasp the data presented to you if the tool’s jargon is minimized. The use of Custom Definitions will therefore ultimately benefit both the specialist providing the report, as well as its end consumers.

How are you using Custom Definitions to create useful reporting? We’d love to hear your input.


More Context With Google Analytics Benchmarking

Google Analytics Benchmarking Tool

Back in 2008, the Google Analytics team released a feature that was extremely innovative: Benchmarking. Fast forward a few years, and the format changed to a monthly email newsletter, which users would get in their inboxes instead of browsing the Google Analytics user interface. This week the newest version of it has been launched, and it looks way better! (see the evolution at the end of the article)

Before we go into the reports and how you can use them, I would like to dwell a little bit on why Benchmarking is such an extremely important tool. I believe that the word that best describes the value of Benchmarking is 'context'. I often talk about context in my presentations, and how you can use features like Annotations, Cost Data Upload and others to add more context to your data. This will enable you to understand as much around a fact as possible, making conclusions more meaningful and accurate. The word 'context' comes from the Latin contextus, from con- 'together' + texere 'to weave'. And that's what we are trying to do, isn't it? To weave together a meaningful story from the data.

Benchmarking is extremely valuable in that it provides the necessary context to show you whether your increase (or decrease) is not only a result of your actions, but part of a larger trend in the industry. So, for example, if you see a large spike in traffic from the USA, is it something that all your competitors are seeing too or is it a result of your latest campaign? Or if you see an increase in mobile visitors, is it a result of a global trend or a result of your latest mobile-friendly design? Google Analytics Benchmarking will help you answer that.

Enabling Google Analytics Benchmarking

Benchmarking is still not available to 100% of users; it will be rolled out in the coming weeks. But once it is available, you will see a menu named Benchmarking under the Audience section on the left navigation of Google Analytics. If you click on it, you might see the note below, depending on your data sharing settings (if you don't see it, you are good to go).

Enabling Google Analytics Benchmarking

Following the link will lead you to your data sharing settings, where you need to check the box with the text shown below to have access to this feature.

Anonymously with Google and others

Enable benchmarking by sharing your website data in an anonymous form. Google will remove all identifiable information about your website, combine the data with other anonymous sites in comparable industries and report aggregate trends in the benchmarking service.

Example Use: Google Analytics Industry Benchmarking

  • Use Benchmarking to compare your site's performance with those of other websites in your industry.
  • Pinpoint performance problems and estimate how much you can improve your site metrics.

Benchmarking Reports

The Benchmarking tool is composed of three reports: Channels, Location and Devices. We will go over each of them, but before, let's understand how the benchmarks are defined and how can you tweak them. Below is a screenshot from the tool showing the navigation options.

Benchmarking Reports

  1. Industry Vertical: this allows you to choose which vertical (and sub-vertical (and sub-sub-vertical)) you would like to compare your website to. Note that the deeper you go in the vertical hierarchy, the smaller the sample you are comparing your website against.
  2. Country / Region: this allows you to choose which country and region you want to benchmark against. As above, if you benchmark against USA the sample will be much larger than benchmarking against California, so there is a tradeoff between accuracy and precision.
  3. Size by daily sessions: the size of the business in terms of average number of daily sessions.
  4. Benchmark group size: the number of properties that contribute data to establish this benchmark.
  5. Benchmark dimension to chart: you can choose among various dimensions to be shown in the chart. Here are the options: % Benchmark New Sessions, % New Sessions, % New Sessions Benchmark Delta, Benchmark New Users, Benchmark Sessions, New Users, New Users Benchmark Delta, Sessions Benchmark Delta, Avg. Benchmark Session Duration, Avg. Session Duration, Avg. Session Duration Benchmark Delta, Benchmark Bounce Rate, Benchmark Pages / Session, Bounce Rate, Bounce Rate Benchmark Delta, Pages / Session, Pages / Session Benchmark Delta.

Apart from those navigation options, you will also see, just above the chart, two new icons that can be used to show/hide the colors on the table (green/red) and the comparison numbers.

Channels Benchmarks

Channel Benchmarks

This report will help you understand how you are performing against similar businesses when it comes to acquisition channels. So, for example, it will help you answer questions such as:

  • Am I doing well on my Email campaigns?
  • Should I invest more in Display Advertising?
  • Is my website SEO optimized compared to my competitors?

Location Benchmarks

Location Benchmarks

This report will be useful to understand where in the world you are doing well and where you are doing poorly. If you operate only in one country it will be simpler to understand, but if you have a global audience it might be interesting to use this together with the other reports. For example, if you see that you are underperforming in Canada, you might go to the Channels report and see in which channel you are doing worst in Canada.

Devices Benchmarks

Devices Benchmarks

Did you hear about the mobile planet? Yes, we live there! This is an extremely insightful report that will show you if you are missing customers because your competitors are better adapted to mobile.

Benchmarking Evolution

Just for fun, I thought I would add screenshots of the previous Benchmarking features on Google Analytics...

Benchmarking Reports - 2008 Edition

Old Benchmarking Report

Benchmarking Newsletter - 2011 Edition

This was the very first edition!

Benchmarking Newsletter

Benchmarking Tool - 2014 Edition

New Google Analytics Benchmarking


Google Analytics Embed API Highlights

Google Analytics Embed API Highlights

Early this spring the Google Analytics team released the Embed API, "a JavaScript library that allows you to easily create and embed a dashboard on a 3rd party website." Have you had a chance to take a look at it? When I heard they were releasing a new API I took a quick look at it and saw that it was JavaScript based. At the time I assumed it was just a JavaScript connection to the Google APIs mashed together with Google Charts; I was unimpressed and decided I had better things to do with my time.

Earlier this week I ran across it again in my inbox: "create and embed a dashboard" it said. I am the resident Google Analytics addict here at work; if I want to keep my reputation intact, I had better take a closer look at this.

What I found was amazing! The Embed API isn't just a JavaScript library mashed up with Google Charts, or is it? It might be, but what really grabbed my attention is that it is EASY: you can have a dashboard laid out and set up in a matter of minutes, even if you don't know much JavaScript. If you can read JavaScript and copy / paste the examples you can get it working. If you're more advanced, well, I bet you could come up with some really fancy stuff here (if you do, please post it in the comments, I would be interested in seeing it). I think we should create some kind of library of standard Embed API dashboards; maybe Google should expand the Solutions Gallery to include Embed API dashboards.

Embed API

One of the really nice things is that the examples and documentation are very well written; everything is very easy to follow. Because of that, this is not going to be a tutorial on how to use the Embed API; I will just highlight a few key points.

Developer Console

The Embed API is, after all, an API, so we are going to have to register the application with the Google Console. The documentation isn't 100% clear on how to do this, so follow these steps and you will have it set up.

  1. APIs & auth - APIs: Turn off all of the cloud APIs that are turned on by default, then turn on the Analytics API. This will give you access to the Google Analytics API.
  2. APIs & auth - Credentials: Click the Create new Client ID button.
  3. Select Web application.
  4. Set the JAVASCRIPT ORIGINS to either http://localhost or the host of your website (see the picture below).
  5. Remove the text from the REDIRECT URIS field (see the picture below).
  6. APIs & auth - Consent screen: Make sure to select an email address and set a product name.
  7. Back on the APIs & auth - Credentials screen, copy the client_id that was generated for you.

This is all you need in order to get the Authentication working.

Developer Console Authentication

Embed API Demos and Sample Code

On top of the normal documentation for the Embed API, Google has put up an Embed API Demos and sample code page. On this page you can find six different demos of how attractive you can make your dashboards look. Remember to authenticate the API in the upper right hand corner of the demo. Once the demo is loaded and you are all impressed, scroll to the bottom and look for the View source on Github link, where the code for all of the demos can be downloaded. I recommend you check out the Third-Party Visualizations demo; it is one of my favorites, as it even shows current active users using the Real Time API. And it reloads the metric automatically: that's right, no more crashing the real-time page on the website.

Authentication

I don't know if you have used any of the Google Analytics APIs before or not, but I have. One of the hardest things to do when you are first starting to use the API is understanding how to authenticate your script or application. The Embed API makes this very easy.

// Render an authorization button into the element with id="auth-button"
gapi.analytics.auth.authorize({
  container: 'auth-button', // the DOM element that will host the button
  clientid: CLIENT_ID       // the client ID from the Developer Console
});

That's it, that's all you need to create the Authenticate button. Once the user has accepted authentication, it's just a matter of detecting it and then loading the data into the dashboard.

gapi.analytics.auth.on('success', function(response) {
  // ... load data and render the dashboard here ...
});
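
To see how the pieces fit together, here is a minimal sketch of a one-chart dashboard; the view ID and the container element IDs are placeholders you would replace with your own:

gapi.analytics.ready(function() {

  // Render the authorization button into <div id="auth-button">
  gapi.analytics.auth.authorize({
    container: 'auth-button',
    clientid: CLIENT_ID
  });

  // A line chart of sessions over the last 30 days,
  // rendered into <div id="chart-container">
  var sessionsChart = new gapi.analytics.googleCharts.DataChart({
    query: {
      ids: 'ga:XXXXXX', // replace with your view (profile) ID
      metrics: 'ga:sessions',
      dimensions: 'ga:date',
      'start-date': '30daysAgo',
      'end-date': 'yesterday'
    },
    chart: {
      container: 'chart-container',
      type: 'LINE',
      options: { width: '100%' }
    }
  });

  // Draw the chart once the user has authenticated
  gapi.analytics.auth.on('success', function(response) {
    sessionsChart.execute();
  });
});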

Conclusion

Just think of all the things we can do with the new Embed API. Do you currently have a screen someplace in the office just sitting there with the Google Analytics website loaded so that you can monitor activity on your website? Do you check the Google Analytics website from time to time throughout the day and have to go to different reports looking for the information you need?

Why not create your own dashboard to do this instead? Decide what metrics are most important to you and set it as your browser home page.

If these ideas interest you I recommend the Embed API - Getting Started tutorial. It's very easy to follow; you can just copy the code and it will work straight out of the box. Once you have mastered that, check out the Github project for some more ideas on how to create the other graphs.

Remember to come back and show off your dashboards.


Optimize Your Site With Enhanced Ecommerce

Enhanced Ecommerce Product Impressions

Enhanced Ecommerce brings with it a new bag of toys for analysts and marketers. In this series of posts, I’ll cover what I believe to be some of the most beneficial features, how they are best leveraged, and some handy tips to make sure the implementation goes smoothly.

First up: surfacing complete visitor journeys using product impressions and detailed views.

Note that Enhanced Ecommerce is a Universal Analytics-only feature. If you have not yet updated to the new tracking library (analytics.js), this will be a necessary first step before proceeding with implementing these new ecommerce methods.

Frankly, tracking only cart checkout steps and purchases tells us fairly little about the performance of products on a website. Yes, it will tell us crucial things such as where people drop off, which products make it all the way from cart addition to purchase, etc., but it leaves us blind to some of the most essential first steps our visitors take through the jungle that is our website.

By keeping track of complete user journeys in relation to our products we can start answering precisely the following questions.

  • What gave our visitors the idea to buy a product in the first place?
  • Why did they pick one product over the other?
  • Where are the “hot spots” in which to place the products we want to push?
  • Which product copy sucks?

Product Impressions

Enhanced Ecommerce product impression

The impression methodology allows us to surface data about where visitors saw a given product, where it was positioned in a list of several products, what context it was presented in, and so on.

CTR, or click-through rate, is not a metric reserved for ads. This sort of measurement is equally important on our website. Enhanced Ecommerce allows us to surface CTR for categories, brands, products, and internal promotions, giving us a critical data point needed to evaluate performance and optimize against it. To start tracking CTR, complement your impression methods with the click methods. The ratio between clicks and impressions will give us the CTR.

Enhanced Ecommerce product list CTR

By attaching impression data for each visible product to a hit, such as a Pageview or an Event hit, we can pass the following attributes to Google Analytics:

  • Name (required): The name of the product, which will show up under the “Product” dimension in GA.
  • Id (required): Our identifier for the product, which will show up under the “Product SKU” dimension in GA.
  • Price: Yep, it’s what it costs.
  • Brand: The brand name, such as “Apple” or “Samsung”.
  • Category: The product genre, such as “Apples” (if we’re selling fruit online).
  • Variant: The variation of a product, such as "Green" or "Red". Or "Kind of big" if the product is iPhone 6 Plus.
  • List: This one is important and really brings something new to the table. Use the value of this attribute to surface in what context the product was presented on a page. For example “best sellers”, or some internal identifying number for a list of items. If we want to be really fancy, we could append what sorting method is used as the items are presented, such as “bestsellers: a-z” or “bestsellers: relevancy”.
  • Position: This is the good stuff. We can use whatever logic we want (I usually go left to right) to surface what position the product was in on a list of products. The important thing is that we understand what "1", "2", etc. mean in relation to our website.

One benefit of attaching additional data to hits is that we can include everything in one http request to the GA servers. This is particularly important if we have high volumes of traffic and are subject to the lower hit-limitations of a Standard GA account. Note, however, that a single hit may not be larger than 8192 bytes, which is usually fine unless we are getting above 40 products or have very long descriptive values for the attributes presented above.

Boom, we can start the process of optimizing our website:

  • Which product is performing best where?
  • Are we placing our most profitable products in optimal positions?
  • Are there blind spots where important products get overlooked?
  • Are some products cannibalizing others?

Tip: If sending this kind of ecommerce data through a separate hit in parallel with ordinary pageview hits, we probably want to mark the hit non-interactive. This means that it is not considered when calculating Bounce Rate (otherwise the extra hit would skew the metric towards 0). To do this, simply set the value of the "non-interactive" attribute to "true" in GTM or in the analytics.js snippet.
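
As a rough analytics.js sketch of the above (the product values are made up, and the ec plugin must be required before the impression is attached):

// Load the Enhanced Ecommerce plugin
ga('require', 'ec');

// Attach an impression for a product visible in a list
ga('ec:addImpression', {
  'id': 'P12345',             // Product SKU
  'name': 'Granny Smith',     // Product name
  'brand': 'FruitCo',
  'category': 'Apples',
  'variant': 'Green',
  'price': '2.50',
  'list': 'bestsellers: a-z', // the context the product was shown in
  'position': 1               // its slot in the list, left to right
});

// Send the data on a non-interactive event so Bounce Rate is not skewed
ga('send', 'event', 'Ecommerce', 'Impression', { 'nonInteraction': true });

The corresponding click methods follow the same pattern: attach the product with ec:addProduct, set the action to "click" with ec:setAction, and send the hit.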

The Holy Rate: Buy-to-Detail

Ok, so we got the impressions down. How are we doing with presenting individual products? The detail method will give us our next clue in understanding how visitors perceive our products.

By tracking which items were looked at more closely, we can start connecting the dots between browsing and purchasing. From brands or categories to specific products, this gives us a very important relative metric: the Buy-to-Detail rate. It is what it sounds like: the number of purchases of X divided by the numbers of detailed views of X. So, a relatively high Buy-to-Detail rate tells us that visitors are more likely to buy the product after looking at its details than is the case for products with a low Buy-to-Detail rate.

The attributes we attach to this type of tracking are the same as the ones attached to product impressions above, except that "Quantity" is usually included instead of "List" and "Position" (which are less applicable here); a short sketch follows the list below.

  • Name
  • Id
  • Price
  • Brand
  • Category
  • Variant
  • Quantity
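
Here is a minimal sketch of such a detail-view hit in analytics.js, again with hypothetical product values:

// Record a detailed view of a product on its product page
ga('require', 'ec');
ga('ec:addProduct', {
  'id': 'P12345',
  'name': 'Granny Smith',
  'brand': 'FruitCo',
  'category': 'Apples',
  'variant': 'Green',
  'price': '2.50'
});
ga('ec:setAction', 'detail');

// The detail action rides along on the regular pageview hit
ga('send', 'pageview');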

Buy-to-detail Rate

When comparing the winners to the losers, we will start identifying our mistakes as well as our bull’s-eyes:

  • What makes up a killer copy?
  • Which Unique Selling Points are most persuasive?
  • What might deter a visitor from buying a product?
  • What Call-to-Actions are most successful?
  • What other patterns in visitor behavior can we detect?

Based on these insights, we can start optimizing how we present our products, either through A/B-testing or by direct changes if we’re confident or if we lost the login credentials to Optimizely again.

Product page optimization

But wait, can't we just count the number of pageviews for each product page and put that in relation to purchases? Of course we can, but what's the fun in that, party pooper. Besides, such a methodology is dependent on a clear URL structure, including all the aforementioned attributes, that can easily be tied to the sale of each product. It will result in some serious hair tearing trying to match rows in Excel if we're dealing with a large number of products (trust me).

In contrast, the Enhanced Ecommerce methodology will enable us to set up reports that are comprehensible and easy to manage in the long run. Since these new dimensions and metrics will now be available to us throughout the GA interface, we will be able to set up tailor-made custom reports showing us exactly what we need in terms of product performance.

Conclusion

The new Enhanced Ecommerce methods for product impressions and detailed views are extremely useful and will bring new data points to the table which are hard to hack together ourselves. In fact, it will surface data that we might not even be able to get anywhere else, as it ties back to all the segmentation power of Google Analytics (technology, behavior, attribution, etc.). Perhaps most important of all, it will give us actionable insights that we can work with to continuously improve the presentation of products and the performance of our ecommerce business.


Cross Device Behavior: Google Analytics + Mailchimp

Cross-Device Analytics With Email ID

Google Universal Analytics enables marketers and web analysts to move towards user centric measurement using the User ID feature. However, in practice, measuring cross-device behavior is harder than it seems, as in most cases it requires users to log in to a website, something they very often do not do.

As Daniel Waisberg wrote earlier this year, the measurement industry's biggest challenge is to find a way to get users to sign in to websites, which would "allow marketers to understand customers and provide them with amazing experiences."

In this post I will show a method to measure cross-device behavior without requiring users to sign in (it does require registration though). If you are not familiar with Universal Analytics and User ID, I suggest you read Understanding Cross Device Measurement and the User-ID - a brilliant post by Justin Cutroni.

The challenge in cross-device measurement is associating different sessions in different browsers and devices to the same user. This is not a problem when the user voluntarily logs into our websites. Logging in is a common practice in websites such as Software-as-a-Service solutions, social media platforms, content websites restricted with a paywall, online casinos or banks; but for most websites users will simply go in and out without ever thinking about logging in. Therefore we have limited options to control whether users sign in or not. And it would probably be bad for business if we tried to force them to sign in just for the sake of data precision.

But an alternative solution could be to use our bulk email service to identify all of a user's devices and associate each device with its proper owner. Needless to say, this solution would track only those users that have registered with a company at some point in time and provided their email addresses. It still requires registration, but not a log in.

Cross Device Analytics with Emails

Cross Device Measurement with Mailchimp and Universal Analytics

Below I provide an example of how this could be accomplished using Mailchimp. I am going to use a little bit of custom PHP and jQuery, a permanent cookie, and Mailchimp to track cross-device usage without any inconvenience for the user.

First create a mailing list in Mailchimp. I assume this is something every savvy marketer can do. And if you have never used Mailchimp, they offer great getting started tutorials.

In Mailchimp, create a custom signup form field for User ID with field tag USERID (see screenshot below). This is something we need when we send user IDs to Mailchimp and when we generate trackable unique links for subscribers’ emails.

Mailchimp Custom Signup

Now we just need to generate a unique user ID for every subscriber. In my website template I use PHP to generate unique IDs for all subscribers. Before that, I check whether the user already has a user ID assigned; in that case I pass the existing ID to Mailchimp instead of generating a new one. The code below is what I use:

<?php
// Check if the user already has an ID
if (!isset($_COOKIE['click'])) {
  // Generate a unique ID
  $prefix = rand() . "-";
  $assignuserid = uniqid($prefix);
}
else {
  // Assign the user ID from the cookie
  $assignuserid = $_COOKIE['click'];
}
?>

Of course, because we are just starting to track user IDs, every subscriber will get a fresh user ID. In the future, however, we might have subscribers that are already identified with a User ID. They might have subscribed to a different mailing list. Or purchased from our webstore.

Then we include our newsletter signup form on the page (learn more about embedded forms in Mailchimp). In the signup form we add the user ID as a hidden field, populate the field with the user ID, and pass it to Mailchimp along with the user's email address.

<input type="hidden" value="<?php echo $assignuserid; ?>" name="USERID" id="mce-USERID">

From the user's perspective the signup form is just a regular one.

Email Form Analytics

At this point we should also add a permanent cookie to users that contains the freshly generated User ID. I have at least two options: I can add the cookie when the user clicks on "Subscribe" or when the user has actually gone through the double opt-in process.

Here we want to use the former option, so I just add the jQuery.Cookie.js library and a piece of jQuery to the template that contains my signup form.

<script src="//cdnjs.cloudflare.com/ajax/libs/jquery-cookie/1.4.1/jquery.cookie.min.js"></script>
<script>
// When the visitor clicks Subscribe, store the User ID in a long-lived cookie
$( "#mc-embedded-subscribe" ).click(function() {
  $.cookie('click', '<?php echo $assignuserid; ?>', { path: "/", expires: 3600 });
});
</script>

Note: If you set cookies with jQuery you have to specify the path as "/". If you don't, the cookie can only be read under the path that set it.

If you want to add the cookie only after the user confirms his/her subscription, you have to go to Mailchimp, click Signup forms and select General forms. There you have the option to use custom URLs for signup thank-you pages and confirmation thank-you pages. When you create a custom URL for confirmation pages, Mailchimp redirects users to that location after the confirmation email is clicked. On this page you can place the cookie in the user's browser.
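
A minimal sketch of what that confirmation page could run, assuming jQuery and the jquery.cookie plugin are loaded there and that the custom confirmation URL was set up to include the ?click=*|USERID|* parameter:

// On the confirmation "thank you" page, persist the User ID from the URL
var match = window.location.search.match(/[?&]click=([^&]+)/);
if (match) {
  $.cookie('click', decodeURIComponent(match[1]), { path: '/', expires: 3600 });
}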

Now, when people start signing up to our list we can see the user IDs collected on Mailchimp.

Mailchimp Analytics User-ID

Next, go to Google Analytics and enable the User ID tracking feature (if this is not already enabled). Below is a screenshot of the Admin panel where you will find a menu called “User-ID” under the Tracking Info section for each of your properties.

User ID Admin

At this point we need to comply with Google's User ID policy and somehow notify users about the tracking methods in use. In the terms it says that ”You will give your end users proper notice about the implementations and features of Google Analytics you use (e.g. notice about what data you will collect via Google Analytics, and whether this data can be connected to other data you have about the end user).” Perhaps an appropriate place to let them know about this would be to add a notice to the confirmation email.

Now we have our new User ID view in Google Analytics up and running, but it does not collect any data yet; we need to implement the new tracking code on the website.

ga('create', 'UA-XXXX-Y', { 'userId': 'USER_ID' });

In practice we need to find a way to populate the USER_ID value (see code above) with the actual User ID, which will be either in the user's cookie or in the GET parameter of the newsletter subscriber's link. In order for this to work properly we need to add the following Google Analytics tracking script to our website.

<script>
  (function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
  (i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
  m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
  })(window,document,'script','//www.google-analytics.com/analytics.js','ga');

<?php
  // Get the User ID from the cookie, if present
  if (isset($_COOKIE['click'])) {
    $userid = $_COOKIE['click'];
  }
  // Otherwise get the User ID from the GET parameter
  // and persist it in a (roughly ten-year) cookie
  else if (isset($_GET['click'])) {
    $userid = $_GET['click'];
    setcookie("click", $userid, time()+3600*24*360*10);
  }

  if (isset($userid)) {
    // Print the User ID tracking code for identified users
    $gacode = "ga('create', 'UA-44227886-1', { 'userId': '%s' });";
    echo sprintf($gacode, $userid);
  }
  else {
    // Otherwise print the standard tracking code
    echo "ga('create', 'UA-44227886-1', 'auto');";
  }
?>

ga('send', 'pageview');
</script>

Almost there. Only one step to go.

Now we only need to ensure that when we send Mailchimp campaigns, their links contain the unique user ID of each subscriber. Let's suppose we want to send automated campaigns to our mailing list every time we publish a new blog post (read more about RSS driven campaigns).

We go back to Mailchimp, click the Create Campaigns dropdown and select RSS driven campaign. In the Template tab we select Code your own (template, that is).

Mailchimp Template

Here we use Mailchimp's tags to create the automated campaign from our RSS feed. This campaign is set to be sent every Monday and the plan is to publish two posts per week, so the subscriber gets two fresh blog posts once a week.

After every link we have to add ?click=*|USERID|*. Mailchimp will replace the *|USERID|* part with the actual User ID assigned to the subscriber. When the user clicks the link, she will be assigned her unique user ID.
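
For example, a link in the email template would look something like this (the URL itself is hypothetical; Mailchimp substitutes the merge tag at send time):

<a href="http://www.example.com/blog/my-new-post?click=*|USERID|*">Read the post</a>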

Every time she clicks one of my links with a different browser or device, that browser or device will be associated with the user through a permanent cookie.

Concluding Thoughts

Well, of course I might not capture all of a subscriber's sessions with this technique. And a creative mind can come up with a number of ways this will not be an accurate method for user tracking. But there are at least three reasons to assume that this will be a very effective way to track unique users.

Firstly, email tends to be the first application people install on their devices, and people check their emails compulsively. I do that too.

Secondly, the user is likely to use different devices at different times of the day. Mobile during work hours, iPad after dinner. If I alter my mailing schedule, I'll have a chance to track all these devices.

Thirdly, as evidence shows, if emails match people's true interests, a very high percentage of subscribers will actually click the links. Here I am sending – or at least trying to send – interesting content with compelling headlines; spamming people with mass-produced ads will not be an effective method here.


Google Analytics Diagnostics: Actionable Data Quality

A Guide to Google Analytics Diagnostics

A few weeks ago I logged in to my Google Analytics account and saw the little yellow number 1 beside the bell icon in the top-right corner of my screen. As usual, I clicked on it to check what it was about; I like to keep "inbox zero" when it comes to my implementation!

I was surprised by what I saw, as it was a pretty major issue that happened as a result of a recent website upgrade: the redirection from www.online-behavior.com to online-behavior.com had stopped working, meaning that every single page on the website had a duplicate. That's bad in many ways.

Immediately after seeing the notification (which you can see below) I sent an email to my developer asking for his intervention. One day later the problem was solved. This was definitely an actionable and timely implementation insight.

Google Analytics notification

The Power of Google Analytics Diagnostics

As shown through the example above, Diagnostics is a very powerful way to find out about issues with your configurations or settings, right in your face!

We all know about the importance of data quality, but how important is it, really? The answer to this question seems to be approximately US$296,477,289 in the UK alone! At least that's the estimate made by the Experian Data Quality team in a 2014 research report:

86% of companies admitted that their data might be inaccurate in some way. 44% of businesses said that missing or incomplete data is the most common problem, with outdated contact information (41%) being the second most problematic.

Our research also concluded that 75% of businesses are wasting 14% of revenue due to poor data quality. This equates to a staggering £197.788m wasted revenue across UK businesses.

With that in mind, it is easy to make your calculations to understand what would be the ROI of spending 30 seconds checking your Diagnostics every time you log into Google Analytics. And I write "every time" because Diagnostics is a real-time tool; it monitors your data & configuration settings on an ongoing basis, and my example above shows just that. Diagnostics doesn't just tell you what's wrong today, but continually. For example, if a webmaster forgets to tag 50 new pages that launch 3 months from today, Google will keep an eye out for this.

Make no mistake, your data-driven decisions are as good as your data source, and your data source is as good as your implementation; therefore, your decisions will be as good as your data implementation - simple logic :-)

Below I will discuss what you should be looking for and how to navigate the Google Analytics Diagnostics feature to get the most out of it.

It All Starts With The Bell (Mr. Jingles)

Basically, Diagnostics monitors two types of implementation issues: data quality and configuration settings. Each of these has a large number of criteria being checked, some more critical than others. That's why, when you log into Google Analytics, you may see three types of notifications (in order of importance below):

  1. Red notification: high-priority issues including untagged pages, bad filters, abundance of self referrals, (not set) in AdWords Reports, double-tagging, and others.
  2. Yellow notification: unresolved issues including AdWords clicks vs. sessions discrepancies, duplicate campaign parameters, auto-tagging disabled, oversized "Other" channel, and others.
  3. Blue notification: feature recommendations such as creating a goal, excluding internal IPs, linking to Webmaster Tools, using remarketing, using segmentation, using annotations, and others.

Note that the number on the bell will show you only the highest priority notification when you log in, i.e. if you have 6 red, 2 yellow and 2 blue notifications, you will see only a red 6 on the bell notification, but when you click on the bell you will see all of them.

The Anatomy of a Diagnostic

Once you click on the bell you will see an interface with all your notifications similar to the image below. You will note that there are two major sections in the Diagnostics list: Active notifications and Archived notifications. You might look at the active section as your "inbox", or "to-do list"; the archive is the "done" or "worry later" list - so the objective here is to keep inbox zero! We will talk about each of them below.

But before that, I would like to give an honorable mention to a small icon in the top-right of the screenshot below, also known as the "copy to clipboard" button. It so happens that often the person viewing the message is not the one who can act on it. If that's the case with you, this button will make it easy for you to email notification(s) to your webmaster. Or paste them into a spreadsheet to assign various owners and track their course. Or put them into a notes doc, or maybe just copy the notifications for the pure pleasure of doing it!

Analytics Diagnostics

Active Notifications

This section, prominent by the white background, lists all the unresolved notifications. It also shows the priority of the notification (using the colors explained above), a short explanation of what the issue is, and a list of links to help you take action. Since notifications have different natures, the types of action differ between them. Below is a list of the most common actions you can take.

  • Ignore: If you are aware of the issue being reported and it is not important enough for you to implement, you can choose to ignore the notification for 1 week, 1 month, 3 months, 6 months or All time. If you do that, the notification will be archived and visible in the section below, but you will be able to restore it later.
  • Check again: If you fixed the issue you can ask Google to check again in order to confirm that you've done it properly. This option will be available until you resolve the issue.
  • Details: Clicking on details will lead you to another screen where you can read more about the notification and how to solve it. Usually in the new screen you will also have an additional link to Learn More that will often direct you to relevant resources on the Google Analytics Help Center or Developer Documents.
  • Action links: Those links are present in diagnostics where there's a setting that can be adjusted to fix the problem. For example, if you have a bad filter, you will see an "adjust filter" link leading you directly to the filter; if you have a bad view setting, you will see an "adjust view setting" link leading you directly to the view settings.

Archived Notifications

When it comes to archived notifications, you will have three types: Resolved, Ignored and Pending verification.

There isn't much you can do with Resolved notifications; they have been resolved. You can only see the details about them.

Pending verification is shown for those notifications where you clicked on Check again, as explained above. Those issues are in the queue to be verified by Google, and when they are checked again they are either resolved (and stay in the archive) or go back to the active section.

As for Ignored notifications, you will have the option to Restore them into an Active notification or just check more details about them. This option has three major use cases that might be very handy:

  • A teammate ignores something thinking it's not important, but you realize it is.
  • You archive something because it's not important at that point in time. But your business strategy shifts and it becomes important in the future.
  • You ignore something believing it has been implemented, but you notice later it was not (for these cases, it is always better to click on "Check again" instead of "Ignore").

Lesson: it is always better to resolve an issue rather than ignore it!

Concluding Thoughts

In summary, Diagnostics is great; it can save you a lot of headaches and money by constantly monitoring your data health for you and letting you know if anything strange comes up. In addition, if you are not a savvy Google Analytics user, this feature will have huge value, as it will help you prioritize your development, focusing on the most important tasks first. And another helpful touch is that it sends you directly to whatever setting you need to change - a time saver!

Data quality is an issue that affects all companies, regardless of their savviness, size or industry. Even if you are a data jedi, make sure to take a few seconds to check this feature, it may turn you into a hero.

Big shout-out to Matt Matyas, Google Analytics Product Manager responsible for this feature, for his comments and insights on this post.


Google Tag Manager Injector: Painless Tagging

Google Tag Manager Injector: Painless Tagging

Google Tag Manager is truly incredible. The ability to make additions or alterations to your tracking and marketing tags without bothering your developers is a huge game changer in the digital industry. However, one small problem still remains... Google Tag Manager (GTM) requires at least one additional piece of code on your site before you can start to utilise this ground-breaking way of doing things.

This article provides a step-by-step guide to Tag Manager Injector (TMI), a free Chrome extension designed to make the transition to Google Tag Manager as quick and painless as possible. The extension gives you the ability to run/preview a GTM container on your site without needing to add any JavaScript to your pages. In this article I discuss how this could benefit you and your business and provide a real life example deployed at Eurotunnel.
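
How the extension achieves this under the hood is its own business, but conceptually, injecting a container is equivalent to executing the standard GTM loader snippet on the page for you. As a sketch (GTM-XXXXXX stands in for your container ID):

// The standard GTM loader: creates a script element that fetches gtm.js
// for the given container and pushes a start event to the dataLayer
(function(w,d,s,l,i){
  w[l]=w[l]||[];
  w[l].push({'gtm.start': new Date().getTime(), event:'gtm.js'});
  var f=d.getElementsByTagName(s)[0], j=d.createElement(s),
      dl=l!='dataLayer'?'&l='+l:'';
  j.async=true;
  j.src='//www.googletagmanager.com/gtm.js?id='+i+dl;
  f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXX');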

Download the extension here!

To illustrate the way businesses often implement Google Tag Manager, let's take a quick look at a typical GTM implementation process, as shown in the diagram below.

Typical Google Tag Manager Implementation

The longest steps here, by a country mile, are B and C, and you're mostly reliant on the former being completed before you can start to configure your tags in GTM. Imagine the time you'd save if you had the ability to run these two steps in parallel...

Turns out it's your lucky day!

With Tag Manager Injector (TMI), you can begin configuring your tags as soon as you've created your GTM account and container. So let's introduce the Tag Manager Injector and take a look at our new process: Short 'n' sweet!

Google Tag Manager Process

Success Story

When we eventually decided to try this new extension out on a real client project, the results really spoke for themselves. Top of the list was our client Eurotunnel, who had just come to us for help with the implementation of Google Tag Manager on their site.

It was the first case we'd had where a container was fully configured and tested before the GTM container snippet was actually live. This brought the completion of the project forward by a whole month, allowing Eurotunnel to start collecting the data they wanted a lot earlier than we anticipated.

A more efficient transition to GTM isn't the only use for TMI. If you're looking to leave the dark ages and finally join the tag manager club but are undecided about which one to use, this extension will allow you to preview GTM and experience this amazing tool first-hand – no developers required! This is much better than relying on blogs and reviews etc. to tell you which solution is best for you. While we're on the subject though, I'd highly recommend GTM ;-)

Getting Started

In order to use TMI, follow this link to the Google Chrome Web Store and add the extension to Chrome (it's a Chrome extension, so yes, you have to do it in Chrome).

If you haven't done so already, you'll need to create a GTM account and container. You can find instructions on how to do this in this article.

Make sure you make a note of your GTM container ID. To find it, log in to GTM, select an account, then select a container. You should see the container ID next to the container name at the top of the interface (see screenshot below).

Tag Manager Container

Navigate to your website and click the TMI icon in the top right corner of your Chrome browser (just beside the URL field) to open up the interface. You should see a screen similar to the screenshot below:

Tag Manager Injector start

Paste your GTM container ID into the first text field and hit START. You should then get a message telling you that the container is active:

Tag Manager Injector active

You can easily tell whether or not TMI is active, as the extension icon in the top right will show either green for active or red for inactive. To deactivate TMI, simply open the interface again and hit STOP. You'll get a message telling you that the container is now inactive:

Tag Manager Injector inactive

Warning: Once you activate a container in TMI, it's injected onto any sites you visit in your browser session until deactivation. It's recommended that you do any other browsing in a separate Chrome session while TMI is active.

Preview & Debug Your Container Using TMI

GTM's preview and debug mode operates fine when using TMI, allowing you to test out container configurations as normal. Instructions on entering preview and debug mode can be found in the Help Centre.

Once you've entered preview and debug mode, activate TMI for the container that you are debugging then navigate to your site. You should see the quick preview frame appear on the page as shown in the example below:

Debug Google Tag Manager

That's it! If you have any questions or feedback, feel free to drop a comment in the comments section below.


Dashboard for AdSense Performance Analysis


Not long ago I published a post on the Google Analytics blog about the power of using AdSense and Google Analytics together. In that post I went over the integration between both tools, described the reports available after linking your accounts, and also proposed techniques to optimize AdSense revenue using Google Analytics.

In this post I offer a dashboard that can be used to measure your most profitable channels, pages and demographics when it comes to AdSense revenue. You can add the dashboard to your Google Analytics account by following this link (make sure to be signed in to your account).

Below is a screenshot of the dashboard. As you can see by the different colors, each column has a theme; the first column shows overall performance metrics over time (widgets 1-4), the second focuses on demographics (widgets 5-7), and the third shows important information on behavior and acquisition (widgets 8-10). I discuss the widgets in each of the three themes.

AdSense performance dashboard

Overall Performance Trends

1. AdSense Revenue vs. Total Sessions - shows the overall performance of the website. If you see the two lines' trends diverging, something worth checking is happening; drill down into it.

2. Ads Clicked vs. Ads Viewed - shows a trend of the absolute number of ads people are viewing on the website and how many of them are being clicked.

3. AdSense CTR - this widget summarizes the one above: the Click-Through Rate (CTR) is the percentage of page impressions that resulted in a click on an ad (see the sketch below). You definitely want to see an upwards trend.

4. AdSense eCPM - the AdSense eCPM is the estimated cost per thousand page impressions. It is your AdSense Revenue per 1000 page impressions, a great performance metric.
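
To make the arithmetic behind widgets 3 and 4 concrete, here is a quick sketch in Python - the numbers are made up purely for illustration:

    # Illustrative numbers only - not real data
    ad_clicks = 1200
    page_impressions = 150000
    adsense_revenue = 450.00  # in your account currency

    # Widget 3: CTR is the share of page impressions with an ad click
    ctr = 100.0 * ad_clicks / page_impressions        # 0.80%

    # Widget 4: eCPM is revenue per 1000 page impressions
    ecpm = adsense_revenue / page_impressions * 1000  # 3.00

    print('CTR: %.2f%% | eCPM: %.2f' % (ctr, ecpm))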

Demographic Segments

Please note that the widgets below depend on having Demographics enabled for your account; learn more about it in this Help Center article.

5. AdSense Revenue by Country [Male vs. Female] - this stacked bar chart shows the AdSense revenue per country, with each bar divided between Males (blue) and Females (green). As we can see in the example above, Australia and Germany are heavily biased towards men, so a good tactic might be to find content that is particularly appealing to women and promote it on the homepage for those countries.

6. AdSense Revenue by Age [Male vs. Female] - this stacked bar chart shows AdSense revenue per age group, with each bar divided between Males (blue) and Females (green). In the example above we can see that the oldest and youngest visitor groups are heavily biased towards men, while all other age groups are biased towards females, especially 55-64. In the same spirit as above, it might be interesting to run a content analysis and adjust the content strategy based on that.

7. AdSense Revenue by Affinity Category - this bar chart shows the AdSense revenue per affinity category. This information can help you understand which groups are the most valuable in terms of revenue and might help drive the content strategy for the website.

Behavior & Acquisition

8. Revenue and Ads Clicked by Device - we live on a mobile planet, so it is important to check whether your ads are being clicked and generating revenue across all devices at a similar rate.

9. AdSense Revenue by Page - this table is a great indicator of which content is performing well and how much time you should invest in each topic.

10. AdSense Revenue by Channel - this table shows which acquisition channel is bringing the most profitable visitors.

Closing Thoughts

If you find this dashboard useful, you can download it from this link (make sure to be signed in to your account). If not, you can find dozens of other great dashboards in our Solution Gallery.

You can also take a look at the Google Analytics dashboard guide, which discusses dashboard best practices as well as the features available when creating and formatting dashboards.


Testing Statistical Significance On Google Analytics Data


This article walks you through GA Effect, a web application that helps you identify whether events happening on your Google Analytics data are statistically significant or just pure chance; in other words, it separates signal from noise.

I will focus here on how to use GA Effect and interpret the results, but if you are interested in building your own online statistics dashboard using R, take a look at my previous post, where I explain how this application was built.

Is This Event Statistically Significant?

You may recognise the situation: you implemented some SEO improvements 3 months ago and you're now checking if, and by how much, your revenue has improved since then. You log in to Google Analytics, but instead of a huge peak of SEO traffic, you are greeted with a slight hump. You think it may be due to your efforts, but can't say for certain. The marketing team is saying the increase is due to a display campaign they ran 6 weeks ago that influenced your brand search, but you think otherwise, and you need to prove it to justify releasing budget for more SEO work.

Or how about this: you're in charge of a travel website, and your country has just won the Eurovision song contest. Everyone is saying this is great news for the country's tourism and that you should see an increase in bookings. Could you judge whether this is the case, or is any increase you see that year just due to seasonal variation? Maybe it's just the weak currency?

What both these scenarios have in common is a certain event happening, which may or may not have had a positive effect. Wouldn't it be nice to know for sure that the event was significant, backed up by more than just instinct or opinion? That way you could be more data-driven regarding future investments. Further, if you do find that event to be significant, it would be nice to have an idea of how much it moved the needle, so you can judge its overall value and ROI.

The situations discussed above are great examples of what GA Effect attempts to answer using a Bayesian structural time-series method. It provides a Yes or No answer to the question "Is this event significant?", and also estimates the value the event contributed to your metric.

How GA Effect Works

GA Effect is coded in R and takes advantage of the CausalImpact package released in 2014 by Googler Kay H. Brodersen. The package uses Bayesian structural time-series to build an estimate of what your metrics would look like if no event happened, and then compares that with what actually happened (image below from official package page). If the model is correct, then by comparing the differences between its estimate and reality, you can get an educated guess of the event's value.

Causal Inference analysis

If you were looking to do a similar analysis without Bayesian time-series, a sensible approach would be to take the average value before the event and compare it with the average afterwards. The CausalImpact model builds on top of that by using Bayesian statistics to give you a better guess at those averages, as well as accounting for seasonality and trends over time.
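
For intuition, here is a minimal sketch of that naive pre/post comparison in Python with pandas. The DataFrame, column name, and event date are hypothetical, and this simple average ignores exactly the seasonality and trends that CausalImpact accounts for:

    import pandas as pd

    # df is assumed to be a DataFrame indexed by date, with a
    # 'sessions' column exported from Google Analytics
    event_date = pd.Timestamp('2014-08-01')

    pre = df.loc[df.index < event_date, 'sessions']
    post = df.loc[df.index >= event_date, 'sessions']

    # Naive estimate of the event's effect: the difference between
    # average daily sessions after and before the event
    naive_lift = post.mean() - pre.mean()
    print('Naive daily lift estimate: %.1f sessions' % naive_lift)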

The statistical significance tests are set to a 95% confidence level. That is to say, if you do find significance, then were the experiment run 100 times, you would expect to see the impact in about 95 of them. This means that if all your assumptions are correct, you can be a lot more confident in that event's effect. Bear in mind that several assumptions need to hold for this to be the case, which we discuss later - violating those assumptions may give you incorrect results.

Use Case: Impact of SEO Changes

The two examples discussed previously were used to test the app, along with a few others. Below I will discuss in detail how GA Effect was used to analyze the impact of SEO changes on a website.

It is important to keep in mind that you should always have a question in mind before using the app. This may seem obvious, but don't simply go "shopping" for significant events, regardless of their meaning. Plan it like you would plan an experiment - pick a hypothesis, judge what you would expect to see if the hypothesis were true, then use the app to test it. So, here is the hypothesis we will look at below:

Did changing the title tags impact this website's SEO traffic, and if so by how much?

Below are the steps you will need to follow in order to check whether the hypothesis above is statistically significant or not.

1. Go to https://mark.shinyapps.io/ga-effect/ and authenticate with a Google account that has access to the correct Google Analytics View.

2. Pick your View from the selection list.

3. Select a date range. Tips on picking a good date range are below, but a rule of thumb is a 7:3 ratio of pre-event to post-event data.

4. Pick your segment. This will show all your custom segments, but for now we will pick "Organic Traffic" as we want to see SEO impact.

5. Pick your metric. We will just choose sessions, as conversions such as revenue or goals could also be affected by your website's conversion rate performance. You should now see a plot of your selections in the middle of the screen - it shows your metric by date, plotted just as you would see it in the Google Analytics reports themselves. The plot is interactive - try click-dragging to zoom in for fine detail, which helps when choosing when your event was.

Statistically Significant Google Analytics

6. You will also see a vertical line labelled "Event" in the screenshot. Move the line to when the SEO changes occurred; this date will be used to calculate the statistics.

7. The screenshot above shows the setup. In our case, the SEO changes went live on August 01, 2014, so that date has been chosen in the field beneath the graph.

8. Most datasets also have seasonality, especially for e-commerce websites. Two seasonalities are supported in this version, Weekly and Monthly; others, such as Annual, may be added in the future. We're choosing weekly, as we can see weekends are quieter than weekdays.

9. Once you're happy, click over to the Results page in the top left hand corner.

10. The app will think a bit, then show you the results, as seen in the screenshot below.

Google Analytics Statistics

Here is an explanation of the results shown in the screenshot above:

  • Top Left: An overall confirmation on whether the event seemed to have a significant effect or not.
  • Top Middle: An estimated magnitude of how much extra SEO traffic was seen in this example.
  • Top Right: An estimated % change on SEO traffic. In this example the title tags have contributed 16% more SEO traffic over the period.
  • Middle Plot: This interactive graph shows the model's expectation of the SEO traffic vs what actually happened. The light green area is the 95% confidence area of the estimate - a guess range of where the traffic should be. The blue line is reality. If the blue line is consistently over the green, that indicates a lift. It also includes a zoom window at the bottom; the default shows one week before the event so you can focus on the results.
  • Bottom Plot: This covers the same time-range, but shows the difference between what actually happened and the model's estimate, accumulated over time (see the sketch below). For instance, it estimates that by September 07 the SEO changes had added 50,000 more SEO visits. As time goes on, the model gets less and less certain due to natural random variation, so the blue shaded area showing the 95% confidence level gets wider and wider.
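
As a rough, non-Bayesian illustration of how that bottom plot is assembled: subtract the model's expected values from the observed values and accumulate the differences over time. In this sketch, observed and expected are hypothetical pandas Series aligned on date:

    # observed: the SEO sessions actually measured
    # expected: the model's counterfactual estimate (what would have
    # happened without the event) - both hypothetical pandas Series
    daily_effect = observed - expected

    # Accumulating the daily differences gives the cumulative lift
    # shown in the bottom plot (e.g. ~50,000 visits by September 07)
    cumulative_effect = daily_effect.cumsum()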

Underneath the plots is the raw output of the CausalImpact model, so you can examine the details. It includes a nice written report that tries to give some insight into the results, plots that the upper graphs were based upon, and a table of the statistics computed.

That's pretty much it. You can redo the process by going back to the Setup page, or hit Start Again to re-authenticate. Note also that there is a notification log in the top right, so you can keep track of what you have been doing.

This is just one application, and not necessarily the most interesting one for you. Give it a go and explore the applications most relevant to your website using your own data.

Model Assumptions & Best Practices

Just blindly trusting statistics is dangerous, and although GA Effect can be a powerful tool, don't put faith in its results without putting care into its model assumptions. There are several assumptions that CausalImpact makes:

  • The event you are testing must not have influenced the pre-period, only the post-period. There needs to be a definite start date.
  • The metrics before the event must be a good predictor of the metrics after the event. Things like the website hosting or other big influences need to be kept as constant as possible.
  • The event itself must not be correlated with the means of measuring it; e.g. testing conversion rates after your website speed improvements went live would be problematic, as website speed also correlates with conversion rates.
  • Examine how well the outcome can be predicted before the beginning of the intervention. Try running a dummy event before the real one - you should see no significance.
  • The date range needs to be long enough before the event to give the model something to train on, and long enough after the event for the effect to be seen - but not so long that other events start to take hold. Think about what effect you are measuring, and amend accordingly.

Taking it Further

GA Effect is really just a wrapper around the CausalImpact R package, and only touches the surface of what it can achieve. For instance, only one metric is used for predictions, when multiple can be used (say Google trends for a brand term in the SEO example). You can also provide custom seasonality, and customise the model itself via its underlying bsts package.

The GA Effect app itself wouldn't be possible without the incredible work of the RStudio team, with products such as Shiny, shinyapps.io and dygraphs making the app what it is. Last but not least, the GA connection is courtesy of the rga() package by Bror Skardhamar.

If you've not used R before now, it's a great time to start, as all these people and more are working hard to make it more accessible every day. Once you're hooked, you'll never open an Excel sheet again :) I write about R and web analytics and how they help with my daily work at my blog, so join me there to follow my progress and hopefully help with yours.


Guide To The Google Tag Manager API


In October 2014, Google Tag Manager V2 was officially released. In the wake of this major UI and functional overhaul, the developer team also published an API service that allows anyone to create value-adding components on top of Google Tag Manager (GTM).

The API really opens up a whole new world of interaction with your GTM accounts and containers. Tired of waiting for some feature to be developed by Google in GTM? Well, build it yourself!

In this article, I'll walk you through how the API works and what you can do with it, and I'll wrap things up with a simple example, where I use Python to list the GTM accounts and containers your Google ID has access to. Fasten your seat belts!

Available API Services

The API gives you a number of services you can invoke as an authenticated user. Most of the services support typical CRUD (create, read, update and delete) operations, and communication with the API is performed over the HTTP protocol.

Google provides a number of useful client libraries which make it very easy for you to build the service calls.

The services you can access through the GTM API are:

google tag manager api services

  • Accounts: Lets you access GTM accounts the user has access to.
  • Permissions: Allows you to modify permissions for GTM accounts and containers. Also lets you view and modify permissions for a given user.
  • Containers: Gives you full control over container resources in a given GTM account. You can, for example, create a new container, delete a container, or list all containers under an account (that the user has access to).
  • Container Versions: Perform versioning operations on a container, such as restore, publish and delete. Also gives you access to the entire version resource, together with all tags, triggers, and variables within the version.
  • Tags / Triggers / Variables: Gives you full control over the assets in a given container. Also lets you create new assets, delete assets, update them, etc.

In short, you can do everything that's available through the UI, but as an added bonus you can create your own solutions that facilitate things that are not available in the UI, such as mass operations, cloning assets from one container to another, and so on.

For example, to clone a container from one account to another, you would need to write a solution that does the following:

  1. Get the container version you want to clone.
  2. Create a container in the target account, using the container version as the body of the request (so that the new container has the same settings).
  3. For each tag, trigger, and variable in the container version to be cloned, create a new tag, trigger, and variable in the target container, using the respective asset as the body of the request.

Operations like these seem complex, but they are actually quite trivial for a client application communicating with a service endpoint, as the minimal sketch below illustrates.
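
Here is a sketch of those three steps in Python, assuming service is an authorized Tag Manager service object like the one we build later in this article; all the IDs are hypothetical placeholders:

    # A sketch only: service is the authorized Tag Manager service
    # object; all IDs below are hypothetical placeholders
    SOURCE_ACCOUNT = '123456'
    SOURCE_CONTAINER = '7890'
    VERSION_ID = '3'
    TARGET_ACCOUNT = '654321'

    # 1. Get the container version to clone
    version = service.accounts().containers().versions().get(
        accountId=SOURCE_ACCOUNT,
        containerId=SOURCE_CONTAINER,
        containerVersionId=VERSION_ID).execute()

    # 2. Create a container in the target account, using the cloned
    # version's container resource as the body of the request
    new_container = service.accounts().containers().create(
        accountId=TARGET_ACCOUNT,
        body=version['container']).execute()

    # 3. Re-create each tag, trigger, and variable in the new container
    for tag in version.get('tag', []):
        service.accounts().containers().tags().create(
            accountId=TARGET_ACCOUNT,
            containerId=new_container['containerId'],
            body=tag).execute()
    for trigger in version.get('trigger', []):
        service.accounts().containers().triggers().create(
            accountId=TARGET_ACCOUNT,
            containerId=new_container['containerId'],
            body=trigger).execute()
    for variable in version.get('variable', []):
        service.accounts().containers().variables().create(
            accountId=TARGET_ACCOUNT,
            containerId=new_container['containerId'],
            body=variable).execute()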

How The Google Tag Manager API Service Works

Google Tag Manager API is very similar to all the other Google APIs out there. That is, it uses OAuth 2.0 for authentication, and it provides you with a client library that you can use to build the requests to the web service.

If you've never worked with Google APIs before, you might want to follow this guide for the Google Analytics API. It has a very nice walkthrough of all the things you need to do to get started with using Google APIs. The steps you need to take, in a nutshell, are:

1. Register a new project in the Google Developers Console, and make sure you've enabled access to the Google Tag Manager API.

google developer console

2. Create new credentials for the application. The type of credentials you want to create depends on whether you're working with a web application, an installed application (e.g. command-line), or a service account. If you want to try out the command-line example at the end of this article, make sure to create credentials for a native application.

application credentials

3. Download and install the right client library, depending on which programming language you want to use. The example at the end of this article will be using Python.

4. In the application code, you will first need to create a service object, using the authorization credentials the native application requires. Depending on the scopes you choose, the user will need to authorize the application to access their Google Tag Manager data.

google tag manager data access

5. Using this service object, you can then proceed to call all the services the Google Tag Manager API provides. The service object will be valid for as long as the user doesn't revoke authorization from the application. Naturally, you might want to store the access credentials in a file to ensure that the user doesn't need to provide authorization every time they run the command-line application.

The most difficult thing to grasp in this process is authentication and authorization. The OAuth 2.0 protocol isn't difficult to understand, but it has multiple layers, and the process is ambiguous to many. If you follow the steps outlined in the next chapter, you should get a better idea of how authentication works in the Google API universe.

Simple Command-Line Application

Command-Line Application

The application we'll create is a simple Python program, which defines the service object and then proceeds to get the names of all GTM accounts your user has access to. For each account, it also gets the names of all containers you have access to. Everything is output into a text file, which will look like the image on the left.

So, let's get started.

I'm assuming you've read the previous chapter. At this point, you should have a new project created, and you've also created credentials for a native application. You should also have the Python client libraries installed in your development environment.

If you're using Mac OS X, and you've installed the client library following the instructions, you won't even need to set up a complicated development environment. All you'll need is to edit the Python file (.py) directly in a text editor, and then run the application from the command line!

First things first, download the client secret JSON from the Google Developers Console, and store the JSON file as client_secrets.json in the directory you've created for your application. This file will link your application with the project you created in the API console.

JSON File

To kick things off with the code, you will need to create a new text file that will become the Python application. I've used the imaginative name gtm2txt.py, but you can choose whatever name works for you.

To begin, we'll need to import a bunch of modules to help us work with the GTM API:

    import argparse
    import sys

    import httplib2

    from apiclient.discovery import build
    from oauth2client import client
    from oauth2client import file
    from oauth2client import tools

These modules are required for the rest of the code to work. Next, as this is a Python command-line application, we'll need to define the main method and invoke it with any command-line arguments:

    def main(argv):
        # Content coming soon

    if __name__ == '__main__':
        main(sys.argv)

This is required boilerplate for any Python command-line application.

All the rest of the code goes in the main method, so remove the line # Content coming soon, and let's get on with the code!

    # Define variable constants for the application
    CLIENT_SECRETS = 'client_secrets.json'
    SCOPE = ['https://www.googleapis.com/auth/tagmanager.readonly']

First, we'll create some constants. If you followed my instructions, you downloaded the client secret JSON from the Google Developers Console and renamed it to client_secrets.json, saving it in the same directory as your Python application.

Next, we're defining a scope for the application. As this is a very simple app, all we'll need is read-only rights to GTM accounts and containers. You can view all the available scopes behind this link.

    # Parse command-line arguments
    parser = argparse.ArgumentParser(parents=[tools.argparser])
    flags = parser.parse_args()
       
    # Set up a Flow object to be used if we need to authenticate
    flow = client.flow_from_clientsecrets(
        CLIENT_SECRETS,
        scope=SCOPE,
        message=tools.message_if_missing(CLIENT_SECRETS))

These lines set up a Flow object. In the world of Google APIs, a flow is how the credentials are passed from the application to the web service and back, after a (hopefully) successful authentication. The Flow object is built using command-line flags, but since this is a very simple app, we won't actually be using any flags. As you can see, the client_secrets.json file and the scope are passed as arguments to the Flow object.

    # Prepare credentials, and authorize the HTTP object with them.
    # If the credentials don't exist or are invalid, run through the native client
    # flow. The Storage object will ensure that if successful, the good
    # credentials will be written back to a file.
    storage = file.Storage('tagmanager.dat')
    credentials = storage.get()
    if credentials is None or credentials.invalid:
        credentials = tools.run_flow(flow, storage, flags)
    http = credentials.authorize(http=httplib2.Http())
       
    # Build the service object
    service = build('tagmanager', 'v1', http=http)

These are very important lines. First, the application checks if a file called tagmanager.dat is located in the application directory. This is the file where we'll save your credentials after a successful authorization. If the file isn't found, or the credentials within are invalid, the run_flow() method is invoked, which opens a browser window and asks for your authorization for the scopes you've defined. A successful authorization returns a credentials object, which we then use to authorize all API requests.

Finally, the service object is built, using the credentials we got back from the authorization flow.

This is how OAuth 2.0 works. Authorization is requested, and if it's provided, a service object can be built with the credentials.

Now that we've built the service object, we can start calling the API and performing tasks.

    # Get all accounts the user has access to
    accounts = service.accounts().list().execute()
       
    # If the user has access to accounts, open accounts.txt and
    # write the account and container names the user can access
    if len(accounts):
        with open('accounts.txt', 'w') as f:
            for a in accounts['accounts']:
                f.write('Account: ' +
                        unicode(a['name']).encode('utf-8') +
                        '\n')
                # Get all the containers under each account
                containers = service.accounts().containers().list(
                    accountId=a['accountId']).execute()
                if len(containers):
                    for c in containers['containers']:
                        f.write('Container: ' +
                                unicode(c['name']).encode('utf-8') +
                                '\n')

The very first command we're executing shows exactly how Google APIs work. As you can see, we're invoking a number of methods of the service object, which we built around the Google Tag Manager API.

So, to get all the accounts the user has access to, you will have to run the accounts().list() query method against the service object. It takes no parameters. I know this because I've consulted the instructions for invoking the list() method in the Google Tag Manager API Reference Guide. The execute() command at the end runs the actual service call.

Because it's a variable assignment, I'm storing whatever this API method returns in the object accounts. By looking at the reference guide again, I can see that the API returns a JSON list object, with all the accounts I have access to as objects within this list.

Now that I know what the response resource looks like, I can confidently first check if there are any accounts, using Python's built-in len() call to check the length of the list. Next, I can iterate over all the account resources in this list, storing the value of the name property in the text file. The reason I'm re-encoding the value in unicode is that I might have access to accounts that were created with a character set not supported natively by the default encoding.

Look at the call I'm doing on the service object next. I'm invoking the accounts().containers().list() method to get a list of containers the user has access to. This time, I will need to add a parameter to the call, namely the account ID I want to get the containers for. Luckily, I'm in the process of looping through the accounts returned by the first API call, so all I have to do is send the accountId property of the account currently being looped through. Again, I can check the details for the containers().list() method in the reference guide.

And that's the application right there. Short and sweet. Once you have the code in a Python file, you can run it with the command python gtm2txt.py.

You can download this application code from my GitHub repository.

It's not difficult if you have a modest understanding of Python, understand how authorization is passed between your application and the web service, and are prepared to consult the reference guide for the API multiple times while debugging your application.

What to do next?

Well, the world is your oyster. The API opens up a lot of possibilities. I've created a set of tools for accounts created in the new Google Tag Manager interface. The tools are free to use, and can be found at v2.gtmtools.com. The toolset is built on a number of API methods similar to the ones we've explored today. Be sure to check out the user guide I've written as well.

Feel free to use your imagination with the API. Here are a few ideas you might want to try:

  • A branch/merge publishing workflow for containers.
  • A diff for container versions, which shows what changed, what was removed, and what was added between two container versions.
  • A tool which lets you add a trigger condition to multiple tags at once.
  • An app similar to the one we just created, but which outputs the details into a .csv file and also includes account IDs and container IDs (a minimal sketch of this follows below).
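
To illustrate that last idea, here is a minimal sketch of how the account/container loop from the application above could write a .csv file instead, using Python's built-in csv module. It assumes the same Python 2 environment and the same authorized service object as the main example:

    import csv

    # Assumes service is the authorized Tag Manager service object
    # built earlier in this article
    accounts = service.accounts().list().execute()

    with open('accounts.csv', 'wb') as f:
        writer = csv.writer(f)
        writer.writerow(['accountId', 'accountName',
                         'containerId', 'containerName'])
        for a in accounts.get('accounts', []):
            containers = service.accounts().containers().list(
                accountId=a['accountId']).execute()
            for c in containers.get('containers', []):
                writer.writerow([a['accountId'],
                                 unicode(a['name']).encode('utf-8'),
                                 c['containerId'],
                                 unicode(c['name']).encode('utf-8')])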

Best Practices For Table Filters In Google Analytics


Table filters are a very powerful feature in Google Analytics. They allow you to perform deep analysis from within the interface. From experience, I can say that many Google Analytics users don't know how to effectively use this feature, a missed opportunity for sure!

Most reports in Google Analytics contain one dimension and several metrics by default. However, it is easy to add a secondary dimension to your report, as seen in the screenshot below. Table filters really help you slice and dice through your reports and their building blocks - metrics and dimensions - in a more efficient way.

Table Secondary Dimensions

In this article I will guide you through table filters and a few related topics.

View Filters vs. Table Filters

I would like you to understand the basic concepts first; this will make it easier to get the complete picture.

View filters are applied before the Google Analytics data is saved in your account. They are set up in the Admin interface and will apply to all data in the view, forever. I won't go into much detail and examples about view filters here. Read this guide to Google Analytics filters for in-depth information on view filters.

Table filters work in a different way: they are ad-hoc segmentation filters. Contrary to view filters, they don't permanently affect any data in your reporting view. You can think of them as filters in Excel. There are two types of table filters:

  • Standard table filters allow you to filter data for the first dimension in your report, which can sometimes be limiting.
  • Advanced table filters are more powerful as they allow you to filter on all available dimensions and metrics in your report.

By now you understand the difference between the filter types that are available in Google Analytics; here is a quick overview.

Filter Comparison Table

A good knowledge of regular expressions is very handy when working with both filter types. I recommend learning at least the basics.

Filtering Standard Reports with Advanced Table Filters

Below I discuss how to create advanced table filters on Google Analytics.

1. Click on the advanced link to the left of the search field

Advanced Filter link

2. Choose your filter

Advanced Filter Levels

  • First level: include or exclude
  • Second level: dimension or metric
  • Third level: matching type
  • Fourth level: filter field

3. (optional) Filter two dimensions simultaneously
You have the option to filter on two different dimensions at the same time if you add a secondary dimension to your report, as shown below. Here both Browser and Source / Medium are visible in the Google Analytics report and advanced filter field.

Filtering Google Analytics using two dimensions

Note: the advanced table filter is limited to the dimensions and metrics that are included in your report. Also, by default the conditions in an advanced table filter are combined with AND. By working smartly with regular expressions you can build an OR equation as well (see the sketch below). Google Analytics accepts RegEx in both standard and advanced table filters.
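
As a quick illustration of the OR trick, the pipe character in a RegEx matches either alternative. The Python sketch below mimics what the filter field does; the dimension values are hypothetical:

    import re

    # In a table filter, the pattern google|bing would include rows
    # whose Source / Medium contains either 'google' or 'bing'
    pattern = re.compile(r'google|bing')

    for source in ['google / organic', 'bing / cpc', 'direct / (none)']:
        matched = bool(pattern.search(source))
        print('%s -> %s' % (source, matched))
    # google / organic -> True
    # bing / cpc -> True
    # direct / (none) -> False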

Concluding Thoughts

  • Make sure to understand the difference between view filters and table filters
  • Start every data deep dive with a business question; what are you trying to solve?
  • Advanced table filters provide you with a great option to filter out anomalies before exporting or presenting a report
  • Start with one or two advanced filters and expand if you need to
  • Make sure enough data remains at the dimension level to draw statistically significant conclusions.

Using Google Analytics to Understand Your Customers


There are multiple analytics tools that excel at Customer Analytics and fill gaps in areas where Google Analytics has historically fallen short. But over the last year or so, Google Analytics has been consistently pumping out new updates and now has some solid offerings to help you understand and analyze your customers more effectively, closing some of those gaps. Some highlights from these offerings are Flow reporting, Enhanced E-commerce, User ID, Data Import, and improved Audience Reporting.

In this post I will discuss each of those separately and provide a short summary of what you should know about them and why you should be using each of those features.

Behavioral Flow Reporting

Behavioral Flow reporting has been around for quite some time, and is incredibly helpful when trying to uncover how your customers are behaving on your site. Flow reports display how a customer moves through your site one interaction at a time. What makes flow reporting especially powerful is the ability to alter your starting dimension. You can choose from any number of dimensions to see how users are traveling through your site from specific sources, mediums, campaigns, geographical locations and more!

Behavioral Flow reporting

While studying the paths your customers may take, you can uncover more detail by highlighting specific lines of traffic or viewing segments of the dimension you're investigating. Combine this with an advanced segment and the sky's the limit.

Enhanced E-commerce

One of the largest announcements to come out of Google Analytics last year is Enhanced Ecommerce. Moving past basic transactional detail, Enhanced Ecommerce provides analysts with even deeper insights surrounding the customer journey. Enhanced Ecommerce includes the ability to track all phases of the purchase process, to upload product and refund data, and a slew of new reporting dimensions and metrics. With this new functionality, analysts can easily answer questions like "Where are my customers falling off in the transaction process?", "Which of my products are viewed most frequently?" and "What products are most frequently purchased or abandoned?"

Enhanced E-Commerce

Similar to the Behavioral Flow reports, Shopping Behavior Analysis provides an overarching view of your customer's journey from site entrance to transaction completion. Using the visual above, analysts can quickly identify where the most fall-off occurs during a site session. The steps within this report are customizable to best fit your website's needs, and are based on your site's implementation.

User ID

It's no secret that consumers have overwhelmingly transitioned to a multi-device lifestyle. Home computers, work computers, smartphones, tablets and even gaming systems all provide individuals with a means to view online content. Historically, visiting sites from these different devices resulted in a unique user being counted for each device. To create a more complete picture of the user, Google announced User IDs with its roll-out of Universal Analytics earlier this year.

User ID report

This is a big deal. User ID functionality will provide the ability to tie together how consumers interact with brands across devices and answer questions like "Do my products sell more frequently on smartphones or desktops?" and "Which devices are used primarily for research?" The User ID can also be associated with authentication systems, providing the ability to create custom segments based on attributes specific to your organization. The User ID gives you a more complete picture of a customer's online journey, allowing you to promote and optimize your site more effectively.

Data Import

If you haven't already migrated to Universal Analytics, another reason to do so is Data Import. By leveraging either a customer ID or transaction ID, you can upload corresponding data directly into Google Analytics. This could include information such as:

  • age
  • gender
  • customer lifetime value (total purchases)
  • # of transactions
  • loyalty card holder
  • and much more!

Data Import

With these added dimensions, you can discover new trends among your customers. Just remember not to upload Personally Identifiable Information (PII). Uploading PII is against Google's terms and conditions and, in any case, avoiding it is a best practice that protects both your data and your customers' personal information.
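
As an illustration, a Data Import upload is simply a CSV file whose header row maps to the schema you define under Admin. The example below is hypothetical - it assumes a custom data set keyed on a customer ID stored in custom dimension 1, with two imported attributes in dimensions 2 and 3; your own header names will depend on the schema you configure:

    ga:dimension1,ga:dimension2,ga:dimension3
    CUST-001,loyalty-card-holder,25-34
    CUST-002,no-card,35-44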

Whether you're investigating how customers move through your site, where they fall off before making a purchase, or how they're interacting across devices, or you want to include additional information in your reporting, Google Analytics provides solutions to answer all of these questions. We've only scratched the surface of the capabilities of Flow Reporting, Enhanced Ecommerce, User IDs and Data Import, but the case is clear: Google Analytics paints a dynamic picture of how your customers are behaving online.
