Channel: Online Behavior - Guide

The Ultimate Guide to Google Analytics Profile Filters

Google Analytics Filters

Segmentation is the key to a greater understanding of your Web Analytics data. If you only look at the overall bounce rate, e-commerce conversion rate, or any other metric in isolation, you miss out on a lot!

Google Analytics offers three main ways to segment your data: custom variables, advanced segments and profile filters (there are other ways of segmenting the data, like secondary dimensions or other filter types that can be applied to the data, but they are less powerful). As the title suggests, this article aims to uncover the exciting world of profile filters. But first, below is a very quick comparison between the three most important ways to segment the data:

Profile filters belong to a long term segmentation strategy. The data collected in a specific profile in the past cannot be changed or removed, so please be careful when applying filters.

Advanced segments, on the other hand, allow you to apply and remove segments without removing any data. They are considered less effective for long-term segmentation purposes.

Custom Variables can be used to define additional segments to apply to your visitors other than the ones already provided by Analytics. It is a powerful technique as it can be tailored to a website and its idiosyncrasies but, on the downside, it depends on additional website coding (here is a detailed explanation).

Important Facts to Know About Profile Filters

Probably the most important thing to keep in mind: the effects of applying Google Analytics profile filters cannot be undone. That's why I always advise keeping a raw data profile within your Google Analytics account; in case something goes wrong, you always have a backup profile. In addition, a new feature, the Google Analytics Change History, will help you monitor changes made by any user to your profile filters.

If you are unsure about how a profile filter works, it is a good idea to test it in a secondary profile first. As Daniel Waisberg wrote in this article about implementation: "The best way to learn how filters affect your Google Analytics data would be to have two profiles with the exact same settings (the real profile, and the test profile) and apply a new filter only to the test profile. Once it is applied, you can check the data and compare the number to learn if anything went wrong. Here is an article from the Google Analytics Help Center on how to add profiles."

Since last year, profile filters have also been applied to Google Analytics real-time reports, which is a significant help when testing new filters: when you create a new profile filter you can watch the results in real time and, if you made a mistake, correct it immediately.

Two Types of Profile Filters

Over the last couple of years I have been amazed by how creative people are when applying profile filters. In general there are two different types of filters: predefined and custom profile filters.

Predefined vs. Custom Filters

A. Predefined filter types

  • Exclude/Include only traffic from a specific ISP domain
  • Exclude/Include only traffic from a specific IP address
  • Exclude/Include only traffic from a specific hostname
  • Exclude/Include only traffic from a specific subdirectory

B. Custom filter types

  • Exclude/Include only traffic based on a specific dimension: related to content and traffic, campaign or ad group, e-commerce, audience/users, location, event, application or custom field.
  • Lowercase/Uppercase: converts the contents of the field to all lowercase or uppercase characters; only affects letters.
  • Search & Replace: search for a pattern in a specific field and replace with an alternate form.
  • Advanced: build a new field from one or two other fields; advanced filters further explained.

By now you have developed a solid understanding of the possibilities of profile filters. It's time to put things into practice!

10 Useful Google Analytics Profile Filters

I have selected 10 filters that, in my opinion, are very useful for a wide range of web businesses, and I encourage you to use them if they fit your specific situation. This is by no means a complete list; unfortunately, no one can offer you a complete list. But at least it will give you an idea of what is possible.

1. Include Your IP Address

It can be very useful to create a profile with an include filter on your IP address. Actually, in all of the Google Analytics accounts I have access to, this filter is present. For me this is a great way to test goals, filters and even a complete implementation.

Take into account that if you work at a large company there is a huge chance that others share the same IP address as you. The filter looks as follows:

Include Only Traffic from IP Address

2. Exclude IP Addresses

The exact opposite of the include IP filter is a great one as well. Excluding visits from your own company or any known third party is very important when setting up your profiles. These visitors generate a huge number of pageviews and their behavior is totally different from that of the "normal" website visitor for whom you optimize the site experience.

Think about a large company where everyone sets the company homepage as the start page in their browser. This can easily distort key metrics like conversion rate, bounce rate and several others, which in the end may lead to bad online marketing decisions. Just apply the filter shown below to overcome this:

Exclude IP Addresses

If you need to filter out a range of IP addresses, this IP address range tool may be useful.

One important thing to understand is that adding two include IP address filters to the same profile doesn't work: a visit would have to match both filters to be included, so it would be filtered out. If you need to include more than one IP address, use a regular expression to set it up as one filter.

Here is a free cartoon eBook (link to PDF) explaining the most useful Google Analytics regular expressions out there to help you further on this topic (by Lunametrics).
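If you like to tinker, the matching logic behind such a combined include filter is easy to simulate. Below is a minimal Python sketch (the IP addresses are made up for illustration); note that the dots are escaped, since an unescaped "." matches any character in a regular expression:

```python
import re

# Hypothetical office IP addresses to include; in Google Analytics
# this whole pattern would go into a single include filter.
INCLUDE_PATTERN = r"^(62\.133\.190\.10|86\.92\.150\.2)$"

def is_included(ip):
    """Return True if a visit from this IP would pass the include filter."""
    return re.match(INCLUDE_PATTERN, ip) is not None

print(is_included("62.133.190.10"))  # True
print(is_included("62.133.190.99"))  # False
```

Google Analytics applies the pattern server-side, of course; the function above only illustrates how the regular expression decides which visits stay in the profile.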

3. Include/Exclude Specific Campaign

There are many reasons to use filters on specific campaign attributes. Let's assume you are running a large cpc campaign and an agency you work with should not have access to this information. You can use this filter to exclude the cpc data from the profile:

Exclude cpc visitors

In the same way you can apply filters to the campaign source, content, term and campaign name.

4. Lowercase on Campaign Attributes

The larger the company you are working at or dealing with, the more people are probably involved in the campaign tagging process.

First of all, it is very important to have a document with strict guidelines on how campaigns should be named. Very often I come across a Google Analytics account with 50 campaign media; a closer look reveals there are actually 10 or fewer, but the naming process went totally wrong.

One thing you can do to overcome part of this problem is to simply add five lowercase filters on the UTM campaign parameters:

  • Campaign Medium
  • Campaign Source
  • Campaign Content
  • Campaign Term
  • Campaign Name

How to create a lowercase filter on campaign medium is shown below:

Lowercase on Campaign Medium

From now on it doesn't matter whether the campaign is tagged as "cpc", "CPC" or "Cpc": in all cases the campaign medium is registered as "cpc" in Google Analytics. By adding these filters your data becomes cleaner and easier to analyze and derive insights from.
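The transformation these filters perform is simple to picture; this little Python sketch mirrors what a lowercase filter does to the campaign medium field:

```python
def lowercase_filter(field_value):
    # A lowercase filter maps the chosen field to all-lowercase,
    # so "CPC", "Cpc" and "cpc" all register as "cpc".
    return field_value.lower()

print(lowercase_filter("CPC"))  # cpc
print(lowercase_filter("Cpc"))  # cpc
```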

5. Lowercase on Request URI

Quite often a website can be reached in more than one way. What I mean is that URLs can be written with both lowercase and uppercase characters without the webserver performing a redirect.

Take as an example the following two pages of a website: /about-Us/ and /about-us/. The URLs look almost the same and lead to the same content, but they cause two different pageviews to be recorded in Google Analytics. With the following filter Google Analytics will record both pages as /about-us/:

Lowercase on Request URI

If you have a technical background or are surrounded by technical people that are willing to help, you can solve it in a different way. Unfortunately this is not always the case.

6. Attach Hostname to Request URI

If you have a multidomain implementation of Google Analytics running and collect the data of both domains in one profile, there is no easy way to distinguish identical page names (Request URIs) in Google Analytics (drilling down in a report or adding a secondary dimension on hostname will help, though).

An example: siteA.com/index.php and siteB.com/index.php. In your "All Pages" report in Google Analytics both pages would be registered as /index.php. So in order to distinguish between them you need to rewrite the Request URI and attach the hostname. The filter to accomplish this is shown here:

Attach Hostname to Request URI

A disadvantage of using this filter is that it breaks the links to your website in the content reports, i.e. you won't be able to click on a link that leads from Google Analytics to your website. But since an implementation with multiple domains would force you to choose only one domain for those links anyway, this feature would be less valuable in that case.
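For the curious, this kind of Advanced filter is usually configured with Field A set to Hostname with pattern (.*), Field B set to Request URI with pattern (.*), and "Output To" writing $A1$B1 back into the Request URI. In Python terms it boils down to a simple concatenation:

```python
def attach_hostname(hostname, request_uri):
    # Mirrors an Advanced filter that captures Hostname into $A1 and
    # Request URI into $B1, then outputs "$A1$B1" as the new Request URI.
    return hostname + request_uri

print(attach_hostname("siteA.com", "/index.php"))  # siteA.com/index.php
print(attach_hostname("siteB.com", "/index.php"))  # siteB.com/index.php
```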

7. Include Specific Region(s)

Do you run an internationally appealing website and are you interested in filtering out specific regions? Google Analytics profile filters make this a very easy task:

Include Only Traffic from Three Countries

In this case I use a regular expression to include only traffic from The Netherlands, Belgium and Germany. As you might guess, I live in one of these three countries ;-)
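The country filter relies on simple regular expression alternation. Here is a quick Python sketch of the matching logic, assuming the country names appear exactly as Google Analytics reports them (check your own Location reports for the precise values):

```python
import re

# Alternation: a visit matches if its country is any of the three.
COUNTRY_PATTERN = r"Netherlands|Belgium|Germany"

def is_included(country):
    return re.search(COUNTRY_PATTERN, country) is not None

print(is_included("Belgium"))  # True
print(is_included("France"))   # False
```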

8. Include Only Mobile Visitors

We both know there is a strongly growing trend in the percentage of people using mobile devices to browse the web. A lot of companies are not yet ready for this and see relatively low conversion rates in this segment.

Use this filter if you want to take a closer look at the mobile visitors segment:

Include Only Mobile Visitors

9. Include Only Traffic to Specific Subdirectory

Let's assume you work at a company whose website includes a blog section, e.g. /blog. You have hired three content writers who contribute to your website by adding posts to the blog directory.

There are many reasons why you would limit their access to the blog directory. The filter you need to handle this situation is given below:

Include Only Traffic to Blog

10. Include Only Traffic to Specific Hostname

Two reasons why you might place an include filter on your hostname:

  • You prevent someone from hijacking your Google Analytics profile number and placing it on other domains
  • You can filter out a staging / test domain where the same Google Analytics profile number is running

Here is an example of how you can include only traffic to a specific domain, Exampledomain.com:

Include hostname filter

BONUS! 11. Exclude All Query Parameters

I like to be surprised and to surprise others :-) That's why I am throwing in one more useful Google Analytics filter.

If you are running a website with a lot of technical query parameters I strongly suggest filtering them out of your data in Google Analytics.

This way you can enormously reduce the number of pages that show up in Google Analytics and give the data more meaning. If you don't remove these query parameters, the same page will show up many times. Note: if you are in this situation you also need canonical link tags for SEO purposes.

For example: siteA.com/order.aspx?id=100012 and siteA.com/order.aspx?id=100013 are exactly the same pages and you should measure them as one: siteA.com/order.aspx (without the query parameter). Here you can review the filter:

Exclude All Query Parameters

It is important to note that this filter eliminates all query parameters in the profile to which it is applied. If there are just a few query parameters you would like to eliminate, you can add them to your profile settings instead, as shown below:

Exclude Specific Query Parameters
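Under the hood, the all-parameters version is a Search & Replace filter on the Request URI: search for the pattern \?.* and replace it with nothing. The equivalent logic in Python:

```python
import re

def strip_query_parameters(request_uri):
    # Remove everything from the "?" onwards, mirroring a Search &
    # Replace filter with pattern "\?.*" and an empty replace string.
    return re.sub(r"\?.*", "", request_uri)

print(strip_query_parameters("/order.aspx?id=100012"))  # /order.aspx
```

Both /order.aspx?id=100012 and /order.aspx?id=100013 then collapse into the single page /order.aspx.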

Assigning a Filter Order

By default profile filters are processed in the order in which they were added. However, you can easily modify the filter order from the profile settings page in your administrator dashboard.

This is very important, since the outcome can be influenced by a combination of two or three filters, or by one filter influencing the next one. Here you can see where to change the filter order:

Assign Filter Order

Final Thoughts and Tips

Google Analytics profile filters can be very powerful if applied in the correct way. Here are 10 final things to keep in mind:

  1. Always keep an unfiltered profile (for safety reasons)
  2. Always keep a profile with an include filter on only your IP address (for testing purposes)
  3. Profile filters belong to a long term segmentation strategy
  4. Advanced segments are useful for ad hoc segmentation purposes and can set certain limits (more than, less than, equal to) where filters can't
  5. A good strategy is to first set up an advanced segment and, when you find yourself returning to it often, create a separate profile for that specific segment
  6. Apply new filters to a secondary profile first before adding them to your main profile
  7. Profile filters help you to give limited access to specific stakeholders
  8. Filters are useful to any type of website (lead generation, e-commerce, service, blogs etc.)
  9. New profile filters don't take effect immediately; in my experience it can take up to 1 or 2 hours before they actually do their job
  10. Take into account the order in which filters are processed

If you have read the whole post, well done! You are on your way to becoming a Google Analytics profile filter Ninja! :-)

What's your experience with profile filters? Any great filters to share? If you like the article, we very much appreciate a comment or share!

Related Content

  1. Understand Google Analytics Advanced Segments [video]
  2. Google Analytics Custom Variables: Segmentation Power
  3. The Choice Is Stark: Segment Or Die!

Google Universal Analytics For Offline Behavior

Google Universal Analytics

I have been a very assiduous visitor to the Weizmann Institute of Science over the last four years, while my wife studied for her PhD. Besides passing through the Institute on weekdays on my way to work, I also visit the various green areas and playgrounds with my kids on weekends.

One of the rules that always impressed me at the Institute is that you can't go in or out without either swiping an ID card at a turnstile or driving a registered car. This is a very strict policy, similar to a website that requires a login to access it.

During all my visits to the Institute I never really thought about it with my Web Analyst hat on, but that changed last week. While swiping my ID card to enter the Institute, I suddenly remembered a demo of Universal Analytics by Nick Mihailovski, Senior Developer Programs Engineer on the Google Analytics team, in which he showed how it can be used to collect offline data for things like swiping an ID card.

So I thought: "the Weizmann Institute could do some pretty interesting analyses if they used Google Analytics to track offline behavior..." This article shows how I would approach such an implementation. I focus more on the Google Analytics account design than on the code necessary to make it happen; I will provide links to relevant code explanations whenever necessary.

But before that, let's go over a quick overview of what exactly Universal Analytics is and why it is different from the Google Analytics everyone is acquainted with.

What Is Universal Analytics?

Universal Analytics (UA) is the new generation when it comes to measurement. It is a new way of collecting data with Google Analytics that enables a myriad of capabilities using this platform. Here are some of the highlights:

  • User level data: probably the most significant change from standard Google Analytics to Universal is that data will no longer be aggregated at the visit level, but at the visitor level. While it is still forbidden to send Personally Identifiable Information to Google Analytics servers, this will enable tracking user level behavior using a unique identifier.
  • Multi environment tracking: with the Measurement Protocol, Google introduced a multitude of additional collection capabilities, allowing its users to collect and send incoming data from any digital device to Google Analytics. This means that we can track more than just websites now, which will be the focus of this article.
  • Custom dimensions & custom metrics: these behave like the default dimensions and metrics, but can be created for your own specific needs using information you collect on the website (e.g. information provided by visitors on a registration form).
  • Simplified configuration: Universal Analytics provides more capabilities to change your data collection from within the admin interface, changes that were previously accessible only through the development environment.

There is more to Universal Analytics than the highlights above; you can find a great overview of it in Justin Cutroni's post, where he provides some interesting explanations and examples. And if you are wondering how to go about migrating your current Google Analytics account from the standard code (ga.js) to Universal (analytics.js), here is another useful resource by Justin. Hint: you should do that using the excellent Google Tag Manager.

Another useful resource for learning about Universal Analytics on a more strategic level is Feras Alhlou's article, where he discusses the extremely valuable new approach of user-centric analytics. This approach can tell us a lot about our visitors, but it can go beyond that to reveal more in-depth information about their visits and their habits. It essentially gives us an overview of their behavior, which in turn provides us with a better understanding of our customers.

Google Analytics Account Design - Translating Offline Actions

So, let's go back to our example: monitoring the traffic/behavior of people going in and out of a closed establishment. The emphasis on "closed" is important, as it enables much more accurate tracking; the model proposed below would not be possible in a place where people can go in and out unidentified, like a regular store (you will understand why later).

When you set sail for a Google Analytics implementation to track offline behavior, you will be boarding an adventurous ship!

Defining the implementation objective

The first step, as always, should be to define the objective of such an implementation. Here are the three main insights I would like to learn if I worked at the Weizmann Institute:

  1. Flow Visualization: which of the 10+ gates people use the most, and do they come in by car or walking? Do they leave through the same gate or through another gate? This information can be used to plan opening hours of specific gates and whether some gates should be shut down or created.
  2. Rush Hours: on which days of the week and at which times of day does the Institute have the highest number of people inside? This information can be used to plan maintenance work and events.
  3. Facilities usage: different kinds of people come in (e.g. students, subscribers to the pool, walkers); when does each "segment" visit the Institute? This information can be used to understand the flow of different people through the campus and make changes to streets or pavements.

Visitors, Visits, Devices, Pages, Custom Dimensions

With the goals in mind we can start to define which Google Analytics fields we will use to collect the necessary data. After some thinking, here is how I would like to collect the data:

  • Visitor: when a person gets an ID card (which must always be the first action), he or she will become a new visitor, and will be counted as a unique visitor in future visits to the Institute.
  • Visit: every time a visitor enters the Institute they start a new visit, which will last until the person leaves it.
  • Device: since we don't have a browser, we consider a device to be the transportation mode, i.e. a car, a motorcycle or legs.
  • Custom Dimensions: in order to segment the visitors and learn more about their behavior, we will create a series of custom dimensions (visitor level) for each visitor:
    • ID: ID number
    • Type: Masters_Student, PhD_Student, Professor, Employee, Spouse...
    • Department: Physics, Biology, Chemistry...
    • Building: the name of the building.
  • Pages: every time a person swipes his/her ID card at a turnstile or drives a car into the Institute we will send a pageview with the gate name, something along the lines of '/walk/main-gate/'. At night and on weekends a card is also necessary to enter a building, so we will also send a pageview there to gather more accurate flow navigation information. Note: I decided to include the transportation mode in the page name as it is an intrinsic characteristic of the gate.
  • Events: we should use events for less "important" actions, things like swiping the card in a photocopier or other in-building activity.

This is basically what I would send to Google Analytics. The resulting information wouldn't be as rich as in a website, but it would be actionable and organized.

The important thing is to collect the data you need to help you make decisions, not to collect as much data as possible.

Implementing Google Analytics Offline With The Measurement Protocol

But how do you go about implementing such a thing? It's not simple, and it will require quite a bit of development. If you work for the Weizmann Institute, the specifications are half done, but if not (which I guess is probably true), you first have to define the objectives and try to map your offline actions to Google Analytics fields (as I did above).

Then, check the Measurement Protocol Developer Guide; it provides an overview and some examples of how to send hits to Google Analytics using the Measurement Protocol.

Then, visit the Measurement Protocol Reference, which describes where and how to send data to Google Analytics; it also describes the required values and the supported data types.
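To make this concrete, here is a small Python sketch of what the turnstile system could send for one card swipe. The tracking ID, the client ID and the custom dimension index (cd2 for visitor type) are placeholders; adapt them to your own account configuration:

```python
from urllib.parse import urlencode

# Collection endpoint defined by the Measurement Protocol.
COLLECT_URL = "https://www.google-analytics.com/collect"

def build_gate_hit(property_id, visitor_id, gate_page, visitor_type):
    """Build the Measurement Protocol payload for one card swipe."""
    return urlencode({
        "v": "1",             # protocol version
        "tid": property_id,   # tracking ID, e.g. "UA-XXXXX-Y" (placeholder)
        "cid": visitor_id,    # anonymous client (visitor) ID
        "t": "pageview",      # hit type
        "dp": gate_page,      # page path, e.g. "/walk/main-gate/"
        "cd2": visitor_type,  # custom dimension 2: visitor type (assumed index)
    })

payload = build_gate_hit("UA-XXXXX-Y", "555", "/walk/main-gate/", "PhD_Student")
# To actually send the hit:
#   from urllib.request import urlopen
#   urlopen(COLLECT_URL, data=payload.encode("utf-8"))
```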

Closing Thoughts

In summary, Universal Analytics opens up so many new possibilities! As you can see above, it can be used for almost anything we can imagine.

But the example above is an exception: very few establishments require identification, and usually you will want to track both online and offline behavior. Things like:

  • How successful was an incentive to visit a brick-and-mortar store on the website?
  • How did people who ended up buying a product offline with a coupon get to my website?
  • How many of my AdWords leads ended up closing a deal through a call center?

The list could go on and on, so I hope you enjoy the Analytics journey, it will never end!

Guide To Google Analytics User Permissions

Google Analytics User Permissions

Cardinal Path article

This month, Google pushed an update to Google Analytics Premium users that changed the face of account administration (it will also be available in the standard version soon). These changes have made user access management more granular and flexible than it was before. In this article, we'll look at how you can wrap your head around the new user permissions options and offer suggestions on how to make them work for you.

Gone are the old and generic 'Administrator' and 'User' roles that could only be applied at certain levels. Previously, Administrator permissions could only be bestowed at the 'web property' level, while User permissions were granted on a profile-by-profile basis. Furthermore, the ability to manage user access and the authority to grant or revoke administrative permissions was wrapped up with the ability to create profiles and manage goals or filters. If you wanted individual Google Analytics users under your account to be able to set up new profiles, goals, or filters, you would also have to entrust them with the ability to delete the whole account - a sticky situation any way you look at it.

In the new account management scheme, there are three different types of permissions that can be given to users: Manage Users, Edit, and View. These permissions can be applied at any level: account-wide, a single web property, or just a single profile.

For instance, if you wanted to give a new employee a sandbox where they can figure out how to use Google Analytics (yes, even goals, filters, and user management features), you could create a profile, then give them all three types of permissions at the profile level only. This way, they have full access to do whatever they want with that profile, but no access to the rest of the Google Analytics account.

In another case, you might have a human resources or information technology czar in your organization who demands the ability to regulate all user access to all accounts, but has no interest in (or clue about) Google Analytics or the data that's in the account. To satisfy their no doubt very useful need to secure all the doors and windows, you can give them the Manage Users permission for the entire account, and leave it at that. That way, they can't muck about with profiles, web properties, and all that other crazy stuff, but they can control what really matters to them: which brains can see and manipulate which sets of data.

One drawback I've seen in the new permissions setup is that it's somewhat difficult to know which level you're granting permissions at. Here's a bit of an explanation.

Google Analytics accounts

Log into Google Analytics as usual, click 'Admin', then click on whichever GA account you want to manage. When you arrive at the web property list for that account, you're at the Account level. Any changes to user permissions you make in the User tab are applied at the Account level. This means that a user given Edit access will be able to edit all of the web properties and profiles beneath (i.e. 'children' of) the account you're editing the settings for.

Google Analytics Web Properties

If you select a Web Property from the Account list, you will have descended to the Web Property level. Any changes to user permissions you make in the User tab here are applied at the Web Property level. A user given Edit access will be able to edit this web property and the profiles under it, but will not be able to edit neighboring Web Properties or the Google Analytics Account as a whole.

Google Analytics Profiles

Finally, if you choose an individual Profile from the Web Property list, you will find yourself at the lowest level at which Google Analytics accounts are divided - the Profile level. Any user permissions changes here apply only to the individual Profile you are currently in. A user given Edit access at this level will only be able to edit this specific profile, but not neighboring profiles.

Profile Access

To navigate between the levels, it's much easier to use the breadcrumb navigation shown here:

Google Analytics breadcrumb

To change user settings for an individual existing user on your chosen account level, select the 'Users' tab and click on the dropdown menu next to the e-mail address of the user you want to change permissions for.

Google Analytics profile permissions

Then, select the drop-down menu under 'Profile Permissions', and select or deselect the checkboxes corresponding to the permissions you want or don't want that user to have.

Editing profile permissions

You can also set permissions when adding a new user to your Google Analytics account - make sure you're on the right 'level' (Account, Web Property, or Profile) before you start adding a user. Note: a good practice is to also make sure that the user you're adding is notified by e-mail that they've been given access by checking the box 'Notify this user by email'.

Google Analytics notify this user by email

While a number of further user experience and interface design tweaks can and should still be made, the increased granularity in user management functionality is certainly a welcome development for Google Analytics users concerned with account governance issues.


51 Tips To Succeed With Web Analytics

Tips To Succeed With Web Analytics

Investments in Web Analytics can pay off big, but the degree of success depends on many factors. Simply put, you need the right people, the right measurement tools and a clearly defined process to succeed.

It is impossible to provide all the advice you need in one article. In my experience, success comes down to consuming and sharing as much relevant information as you can, combined with many years of practical learning in this magnificent field.

One tip beforehand: invest a few bucks and build a website yourself. Write about something you love. This will greatly enhance your learning curve! It is not limited to improving your Web Analytics skills, but touches on all the other online marketing disciplines as well.

Many different Web Analytics frameworks are out there; for the purpose of this article, I use the following 'five phases' framework:

Web Analytics Success Framework

Sit down and relax, here we go!

Phase 1: Pre-Implementation

You can't just start implementing web analytics tags on your website and assume great success lies ahead. A lot of things need to be in place first. Here are 10 tips to take into account in this first phase.

Tip 1: Define clear objectives for your specific web business first; without a clear set of online business objectives you are doomed to fail.

Tip 2: Set the scope of the project; does it involve one domain or 50 different domains?

Tip 3: Build a roadmap; what do you want to achieve, and when? Place milestones where needed.

Tip 4: Identify the stakeholders in the process; usually a lot of people need to get involved. Work on a clear overview of all the stakeholders.

Tip 5: Divide the responsibilities; who is responsible for what and when? It is crucially important to get this right.

Tip 6: Define your goals, KPI's, segments and targets right from the start; remember, this is an important, evolutionary process. It is never perfect, but you need to know what to focus on.

Tip 7: Define specific reporting requirements; don't become a reporting squirrel, but know right from the start what kind of data people in your organization are looking for.

Tip 8: Make wise budget decisions; spend enough money on people and not only on tools. Great, expensive tools won't do if there is no one there to make the data actionable. Start with free or cheap tools first.

Tip 9: Define micro and macro conversions; you may have one major conversion on your website, but there are more conversions tied to your business goals.

Tip 10: Decide together on the KPI's; your KPI's need to be widely agreed upon.

Let's continue with the second phase: the implementation.

Phase 2: Implementation

The technical phase is probably not the sexiest one for you, but you don't want to rush through it. These tips will guide you through the implementation phase.

Tip 11: Always reserve extra time; things never go exactly as planned.

Tip 12: Take release planning into account; know whether tags can be deployed within a few days or only after a few months.

Tip 13: Reserve a small budget for implementation testing tools; especially on larger sites tooling can be very useful.

Tip 14: Don't just hand over a document; stay close to the implementation process. IT and marketing need to work closely together.

Tip 15: Tag all your pages; you can't measure what you don't tag. "Is it OK if I only tag this part of my website?" It won't be the first time someone asks this question.

Tip 16: Customers are more important than tags; tagging is crucial, but make sure it doesn't negatively impact user experiences on your website.

Tip 17: Don't forget rich media experiences (Flash, Flex, RSS, Videos etc.); measuring in-page interactions is increasingly important.

Tip 18: Set up a testing environment; verify outside of production whether the tags are working or not.

Tip 19: Always triple-test implemented codes; it's better to take one extra day for testing than to go live with a bug.

Tip 20: Schedule maintenance periods; tags can easily disappear from your pages. Make sure you monitor this automatically or schedule periodic manual checks.

Phase 3: Configuration

By now you have clearly identified what you want to measure and all the tags are in place. Now it's time to set up your Web Analytics package on the admin side. What is important to consider? Phases three and four mainly focus on Google Analytics, since it is the most widely used tool.

Tip 21: Keep the number of administrators to a minimum; set up your Google Analytics permissions in the right way.

Tip 22: Set up a master profile with raw, unfiltered data; in case something goes wrong you always have a backup profile.

Tip 23: Use Google Analytics profile filters for long term segmentation purposes; it helps you to optimize user experiences and conversions for different segments.

Tip 24: Build advanced segments for ad hoc data analysis; it helps you to uncover great segments that need to be targeted in a unique way.

Tip 25: Set up goals and funnels; essential to optimize your traffic sources, campaigns, keywords etc. toward a certain outcome.

Tip 26: Set up goal values for non-transactional conversions; this will help you get a complete picture of the value per visitor for a specific segment.

Tip 27: Tag your marketing campaigns carefully; ignore this step and your data becomes meaningless or, even worse, leads you to the wrong decisions.
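As a sketch of what careful campaign tagging looks like in practice, here is a hypothetical helper that appends the standard Google Analytics UTM parameters to a landing page URL. The parameter names (utm_source, utm_medium, utm_campaign) are Google's standard campaign parameters; the helper function and the example values are illustrative.

```javascript
// Sketch: build a UTM-tagged landing page URL for consistent campaign tracking.
// utm_source/utm_medium/utm_campaign are the standard GA campaign parameters;
// the helper itself is hypothetical.
function buildCampaignUrl(baseUrl, source, medium, campaign) {
  var params = [
    'utm_source='   + encodeURIComponent(source),
    'utm_medium='   + encodeURIComponent(medium),
    'utm_campaign=' + encodeURIComponent(campaign)
  ];
  var separator = baseUrl.indexOf('?') === -1 ? '?' : '&';
  return baseUrl + separator + params.join('&');
}

var url = buildCampaignUrl('http://www.example.com/landing', 'newsletter', 'email', 'spring-sale');
// e.g. http://www.example.com/landing?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale
```

Using one consistent naming scheme for source, medium and campaign across the whole team is what keeps the resulting reports meaningful.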

Tip 28: Integrate a set of KPIs in a custom report; combine acquisition, behavior and outcome metrics.

Tip 29: Connect Google Analytics to external tools; this is very useful to derive insights more easily. For example, think about connecting Google Analytics to Next Analytics to automate reporting and free your time for analysis and optimization.

Tip 30: Set up intelligence alerts; uncover hidden correlations and causal relationships.

Phase 4: Analysis

Great, you have everything in place and two months of data are right there. It's time to earn some money. Start doing great analysis on your data!

Tip 31: Start every analysis with a question; know what you want to solve or improve first.

Tip 32: Don't focus on aggregates, always segment the data; one solution doesn't fit the need of all your visitors.

Tip 33: Don't focus on averages, always look at distributions; what if the average customer satisfaction is rated with a seven, but 25% of your customers are highly unsatisfied?

Tip 34: If you have time for only one analysis, focus on the "All Traffic Sources" report; you will immediately get a clear overview of how your website and channels are performing.

Tip 35: Enhance quantitative analysis with qualitative analysis; solve for the 'what', the 'why' and the 'how'. The 'what' is not telling the complete story.

Tip 36: Visualize your Web Analytics data; it may help you to get your message across.

Tip 37: Use custom variables for multi-session analysis; how does a brochure download impact my hotel booking rate?

Tip 38: Know the difference between goal conversions and transactions; goal conversions can only happen once in a visit, transactions can happen multiple times.

Tip 39: Don't focus on raw, absolute numbers; put your data in perspective.

Tip 40: Accept that data is never perfect; the truth lies in your back-office.

Phase 5: Testing and Optimization

You have identified a few major issues on your website and some of your landing pages have a very low conversion rate. You might wonder, how can I improve? Read on and apply the following 10 tips to your specific situation.

Tip 41: Don't think you know better than your website visitors; the best optimization specialist in the world never beats your visitors.

Tip 42: Set up a world-class testing team; you may need developers, designers, analysts and usability consultants to succeed.

Tip 43: Identify the most crucial pages to start testing with; optimize highly trafficked landing pages and funnel pages first.

Tip 44: Clearly define your test, hypothesis and goals; what are you testing, what is the expected outcome and what needs to be improved? Answer those questions first.

Tip 45: Spend healthy budgets on conversion optimization as compared to acquisition; driving lots of untargeted traffic doesn't make sense.

Tip 46: Select a testing idea on expected effect, duration and available resources; this will help you to get the highest ROI on your testing efforts.

Tip 47: Add the HiPPO's version as one of the testing variations; beat the Highest Paid Person's Opinion with numbers.

Tip 48: Apply the Conversion Trinity rules to your landing pages; think about relevance, value and the right call to action.

Tip 49: Use the right set of tools; automate what you can automate.

Tip 50: Set your statistical confidence level at around 95%; don't declare a winner too early!
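For readers who want to check significance by hand, here is a sketch of the standard two-proportion z-test behind that 95% threshold (|z| >= 1.96 for a two-sided test). All sample numbers are made up.

```javascript
// Sketch: two-proportion z-test to decide whether a test variation beat the
// control at roughly 95% confidence (|z| >= 1.96). Inputs are illustrative.
function zScore(convA, visitsA, convB, visitsB) {
  var pA = convA / visitsA;
  var pB = convB / visitsB;
  var pPooled = (convA + convB) / (visitsA + visitsB);
  var se = Math.sqrt(pPooled * (1 - pPooled) * (1 / visitsA + 1 / visitsB));
  return (pB - pA) / se;
}

var z = zScore(200, 10000, 260, 10000); // control: 2.0% CR, variation: 2.6% CR
var significant = Math.abs(z) >= 1.96;  // true here: the lift clears the ~95% bar
```

With fewer visits, the same 0.6-point lift would not clear the bar, which is exactly why declaring a winner too early is dangerous.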

Closing Thoughts

One more tip, both my first and my last: start a website and play in the real world! Apply these tips in your daily activities and I am 100% sure you will grow in the Web Analytics field.

Any great tips or experiences to share? We are happy to publish them. If you like the article, we very much appreciate a comment or share!


Paid Search Bid Optimization With Google Analytics


A little over a year ago Daniel invited me to contribute to the Adwords chapter of his eBook on Google Analytics Integrations. I prepared a lot of content at that time, not all of which made the final edit :-)

The following blog post covers one area that can be considered an addendum to that chapter; if you are interested in AdWords and GA, you can find a copy of the PPC chapter here. (*Disclaimer: I do not receive compensation for sales of the eBook, though Daniel did take me out for a cup of coffee.)

For now, I am going to jump right in and share some of my processes and approaches to optimizing paid search campaigns using Google Analytics.

Bid Optimization Based On Ad Position

The location of an ad in the Search Engine Results Page (SERP) greatly impacts its CTR. The following is not a formal CTR study, but I do have access to a small pool of Adwords accounts which I believe provides a good statistical sample.

I ran a quick report (screenshot below) measuring CTRs for over 37 million impressions across 33 different Adwords accounts spanning multiple industries (excluding branded keywords). The report itself is a "keywords report" run in Adwords where I make sure to add the "Top vs. Other" segment as well as some sort of segmentation by time. The resulting report will be quite large in many cases, so you want to put it into a pivot table and then graph the results.

AdWords Keywords CTR report

For searches on google.com, CTRs were more than 20 times higher, on average, for ads that showed above the organic results compared to ads that showed to the right of or below organic results. Again, while this is not a scientific study per se, the sample size is large enough to support my main point: there is a huge difference in CTR when ads are above the organic results as opposed to ads shown elsewhere on the SERP.

CTR by keyword position

The calculated metric you see on the right of the screenshot above, % of Top Impressions, is what interests me the most. Since CTR is correlated with ad position (especially for Top vs. Right Hand Side), I want to look for changes in position that would be impacting my overall share of traffic. This can be done on the Campaign, Ad Group, or Keyword level by using a pivot chart to graph the percentage of impressions above the SERPs compared to the right hand side (RHS).
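The "% of Top Impressions" metric described above can be computed directly from an export of the "Top vs. Other" segmented report. A minimal sketch, assuming rows with a keyword, an ad slot label ("Top" vs. "Other") and an impression count (the row shape and values are illustrative):

```javascript
// Sketch: compute "% of Top Impressions" per keyword from exported rows of the
// AdWords "Top vs. Other" segmented keywords report. Row shape is assumed.
var rows = [
  { keyword: 'blue widgets', slot: 'Top',   impressions: 1200 },
  { keyword: 'blue widgets', slot: 'Other', impressions: 4800 },
  { keyword: 'red widgets',  slot: 'Top',   impressions: 3000 },
  { keyword: 'red widgets',  slot: 'Other', impressions: 1000 }
];

function topImpressionShare(rows) {
  var totals = {};
  rows.forEach(function (r) {
    totals[r.keyword] = totals[r.keyword] || { top: 0, all: 0 };
    if (r.slot === 'Top') totals[r.keyword].top += r.impressions;
    totals[r.keyword].all += r.impressions;
  });
  var share = {};
  for (var kw in totals) share[kw] = totals[kw].top / totals[kw].all;
  return share;
}

var share = topImpressionShare(rows);
// 'blue widgets' shows above the organic results only 20% of the time;
// 'red widgets' 75% of the time.
```

This is the same calculation a pivot table performs; scripting it just makes it easy to re-run across many campaigns.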

Recently, Google has released keyword level impression share data. This is a big development as advertisers can now see their true Percent Share of Paid Search down to the keyword level, and relate their share of search to their click through rates in order to understand fully how much traffic they are driving relative to the available traffic.

Of course, driving the highest share of traffic possible to one's site is not the goal of paid search (at least for most advertisers). With that in mind, I created the following custom report to determine which keywords are the "winners" worth bidding on more aggressively (click on the image for a larger version).

Google Analytics Bid Optimization Custom Reports

The GA report above is for a business to consumer ecommerce store. First, I sorted the Ad Groups in a particular campaign by ROI (descending) and then added an advanced filter to only include Ad Groups that had more than 10,000 impressions in the given date range (to weed out Ad Groups with high calculated ROI but low volume). For this analysis, my goal is to find keywords that perform strongly (as measured by ROI), but don't garner as many clicks as they could due to low bidding. The Ad Group highlighted in row 6 has a fairly low CTR, gets a decent number of impressions, and good conversion metrics (click on the image for a larger version).

Positions report in Adgroup

When drilling down into the Ad Group, we find that almost all of the impressions are on the Right Hand Side (RHS). The CTRs for these keywords are 16 times higher when ads display above the organic results. Since this Ad Group is profitable, we have room to increase the bid to drive more traffic and thereby more gross revenue. It is worth noting that maximizing gross revenue and maximizing profits are far from the same thing.

For a great tutorial on how to adjust keyword bids in Adwords to maximize profitability, see this video.

Closing Thoughts

In short, when increasing your bids, as long as your return per click is greater than your incremental cost per click, net profit will continue to rise. (You will need to keep your profit margins in mind too, of course.)
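As a worked sketch of the underlying bid economics: keep raising the bid while the profit earned per incremental click exceeds the incremental cost per click. All numbers below are illustrative.

```javascript
// Sketch: decide whether a bid increase adds net profit. The incremental CPC
// is the extra spend divided by the extra clicks the higher bid buys.
function incrementalCpc(oldClicks, oldCost, newClicks, newCost) {
  return (newCost - oldCost) / (newClicks - oldClicks);
}

// At the higher bid we buy 100 extra clicks for $250 of extra spend:
var extraCpc = incrementalCpc(400, 600, 500, 850); // $2.50 per incremental click

// Each click yields $5.00 of revenue at a 60% margin => $3.00 profit per click:
var profitPerClick = 5.0 * 0.6;

var raiseBid = profitPerClick > extraCpc; // true: the bid increase adds net profit
```

Note that the incremental CPC ($2.50 here) is higher than the blended average CPC would suggest, which is why comparing profit per click against the *incremental* cost is the right test.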

I also want to point out that by having your Adwords account integrated with Google Analytics you will be able to automatically see "Ad Slot" as a dimension in your reports. This is very important, because while the initial analysis of percent of search share provided insights into how ads may be performing in a competitive landscape, Google Analytics will provide full visibility into how ads in different slots perform on your site. This is key as it will directly impact your bidding practices.

In the report above I set Product Page Views, Add to Shopping Cart actions, and E-commerce Transactions as the goals that I use in the report. As a result, there is a "horizontal funnel" where I can see the total flow of the user through the shopping cart / checkout process AND segment it by where on the Google SERPs they clicked my ad.

In general, I find the Ad Slot dimension in Google Analytics to be much more useful than the Average Position metric reported in Adwords. I am interested in hearing your experiences with this sort of analysis. If you haven't tried this before, please give it a go and share your insights in the comments below.


Google Analytics Dashboards: A Step-By-Step Guide


Dashboards are used across the world by most people (and organizations), either in the form of a car dashboard or in a business context. Every marketing professional talks about dashboards and why he or she needs one; but what exactly is a dashboard?

In an article named Dashboard Confusion Revisited (link to pdf), Stephen Few discusses the misunderstandings around dashboards and clarifies the subject using the following definition:

A dashboard is a visual display of the most important information needed to achieve one or more objectives; consolidated and arranged on a single screen so the information can be monitored at a glance.

With that in mind, I will discuss the options available inside Google Analytics that will enable analysts, marketers and website owners to build the most effective dashboard to monitor their objectives.

I will focus on the tool capabilities and try to describe use cases for certain widget types, chart types and other functionalities. It is up to you to define your goals and the metrics you will need to use in order to monitor them.

One important comment before we dig deeper... I will write quite often about Metrics and Dimensions, so if you are not acquainted with them, check this help center article.

Finding & Creating Your Dashboards

When you log into Google Analytics you are automatically directed to your Audience Overview report. Right below the orange navigation bar, on the left navigation, you will see a Dashboards tab where you can see your existing dashboards or create new dashboards.

There are many great dashboards ready for you to import (see examples in the official Solutions Gallery), and I warmly recommend trying those solutions. However, I believe an understanding of how dashboard creation works will help you customize them for your own needs.

So let's start! When you click the dashboard tab shown above you will have an option to + New Dashboard. Click on it:

Creating Google Analytics Dashboards

Google Analytics went the extra mile to create a "Starter Dashboard" for people that might not enjoy the freedom of the blank page (to paraphrase Irvine Welsh). But I assume you do like the blank page :-) So go for the blank canvas.

Dashboard Widget Types And Their Strengths

The dashboards on Google Analytics are composed of widgets, components on the interface that enable you to see a specific chart. There are currently 6 types of dashboard widgets, and 4 of them can be used both for standard and real-time data. Below is a description of each.

Dashboard Widgets

Note: You can create up to 20 dashboards, and each dashboard can contain up to 12 widgets.

1. Metric Widget

Starting with the standard metric widget (i.e. not the real time data)... this widget presents a single metric, no charts. It is ideal if you need to check one piece of information every day. For example, you are responsible for an online store and want to know quickly what the store revenue was in a specific time range; or you want to know how many times your mobile app crashed during the same period. Here you go, that's one widget you should add to your dashboard, but make sure it contains a really important number!

In the screenshot on the left you can see an example. Note that the title of the widget (which is customizable) is "USA Revenues". In addition to choosing a widget title and a metric (in this case revenue), you have two additional customizations:

  • Filter this data: this enables you to filter the data using any available dimension. So, for example, if you want to see only USA data, you will use dimension "Country / Territory" and include only United States. One nice touch is that you will still see the unfiltered metric, which gives you context. See the grey line just below the metric telling you which percentage of the total revenue is the United States revenue.
  • Link to Report or URL: if you would like to have the title of the widget link to a specific report you would have to add it to this field. In our example, I would link the widget to the Product Performance report, so if I want to drill down into which products brought revenue I would be able to see that with one click. Just search the report you want to link to and grab the URL to paste here. Please note that when you create a widget from within a report (by clicking on "Add to Dashboard" in the top-left of any report) this field will be automatically populated by Google to link to that specific report.

The counter widget is the real time version of the metric widget: it shows the number of active visitors currently browsing the website. It can be grouped by different dimensions, which means you will see a bar below the counter showing the distribution of the active visitors by, for example, medium, new vs. returning visitors, or mobile vs. web visitors. It will certainly make the dashboard live and cool :-)

2. Timeline Widget

That's the widget I personally use the most, since it shows a trend; usually I am looking for a chart that tells me a story, and the timeline tells me what has been going on over the last x days. Especially as I can show two lines in it, usually one related to quantity and the other to quality.

As you can see in the screenshot on the left, I used two metrics in the chart: daily visits and revenue. The first shows me how many people I brought to the website, which is good to know but doesn't really pay my bills; the second tells me how much money I brought in during the same days. So, as you see, there was a day during that period when I brought very few additional visitors but a lot of additional revenue. That's great; now it is time to leave the dashboard, go see what I did that day, and repeat it every week!

The Timeline widget can also be used with real time data, and it will show the real time activity either for the last 30 minutes or for the last 60 seconds of activity.

As with the Metric widget, you can also filter the data and add a link to it.

3. GEO Map Widget

The GEO Map Widget is a very handy tool if your audience is distributed around the globe. It allows you to see at a glance how a specific country is contributing to your website. You can see, for example, what the ecommerce conversion rate is in a specific country as compared to the whole continent (or the world); and more interestingly, you can see how this rate changed across two different time ranges (like in the screenshot below).

GEO Map Widget

The GEO Map widget can also be used with real time data, and it will show the real time activity either for countries or for cities. Again, you can group those visitors by regions as well.

As with the Metric widget, we can also filter the data and add a link to it.

4. Table Widget

Table widgets are great for monitoring landing page, content, product and PPC campaign performance. You will see a list of the chosen dimension along with two metrics of your choice. So, for example, you can look at your mediums and how well each is performing. In the screenshot on the left it is crystal clear that the email marketing manager should get a promotion!

The real time version of the table widget is an interesting one, though it is complex to use. You will be able to choose up to three different dimensions and the widget will present a pivot table showing the combinations between these dimensions in real time. (Give it a try and share your thoughts in the comments below.)

In both versions you will be able to show the table with 5 to 10 rows of data, and you will also be able to filter the data and link the report to a URL.

5. Pie Widget

I am not a big fan of pie charts in general; I think they are often misleading. If you want to know more about my general opinions on data visualization, take a look at Nuts and Bolts of Chart & Graph Types.

Pie charts can be used to visualize data such as percentages out of a whole. Their main advantage is that they are widely used in business contexts, so professionals might feel more comfortable using them. However, it is sometimes hard to compare the different sections of a pie chart. As we will see in the next section, they can be replaced by the bar chart.

In Google Analytics, you will be able to choose one metric and one dimension for it. So, as you can see in the example on the left, you can show revenue by medium. You can choose either the regular pie chart or a donut chart, which has the nice touch of including the total for the metric being shown. You can also decide to show from 2 to 6 slices on the chart, and you can filter the data and add a link to it.

6. Bar Widget

The bar widget is probably the most powerful of all widgets; it offers a multitude of customization options. As you can see in the screenshot on the left, we are able to create charts where we show a metric (in this case revenue), group it by a dimension (in this case medium), and pivot it by a second dimension (in this case sub-continent).

Going further into this example, we can see that while the blue bar (Northern America revenue) is slowly decreasing across the different mediums, the green bar (Australasia revenue) disappears for the affiliate and paid mediums, meaning there is no revenue coming from Australasia for affiliate and paid campaigns. On the contrary, the red bar (Northern Europe revenue) is almost non-existent for organic, "(none)" and email, and grows for affiliate and paid traffic. The point is: this chart tells a story.

Additional customizations that you will be able to apply using this widget:

  • Show up to x bars: you can decide to show from 2 to 9 bars in the chart.
  • Use a horizontal version of this chart.
  • Stack series elements such as pivoting, segmentation or date comparison: I personally think this creates a confusing chart; I prefer the style shown in the screenshot, where the pivot is shown in different bars instead of stacking them up.
  • Show values of the vertical axis.
  • Show values of the horizontal axis.
  • Show title of the vertical axis.
  • Show title of the horizontal axis.
  • Show up to x gridlines: you can either decide to show from 2 to 4 gridlines, not to show any, or to let Google Analytics decide (Auto).
  • Filter this data: the data on the widget can be shown for any particular set of data, e.g. you can see the widget only for PPC traffic.
  • Link to report or URL: you can link the chart to a specific report.

Dashboard Functionalities

Above, I reviewed all the widget types that can be used to create charts for your dashboards. Now I will go over the dashboard functionalities, i.e. what can be done on top of the widgets to monitor your data and to share, export and customize the dashboards. These options appear right below the dashboard name, above the widget section.

1. Advanced Segments / Unified Segments

Very often you will create a dashboard to monitor your overall objectives, but not one that can be used to monitor each segment of your visitors. For example, if you see a strange drop in conversion rates on a particular day, you might quickly apply your segments to see where the drop came from; then you could move on to a more detailed analysis using the standard reports.

For this purpose you can use Segments to isolate and analyze specific parts of your traffic. You can either click "+ Create New Segment" to build a new one or click the name of an existing segment to apply it to the dashboard. Learn more about the Segment Builder in this help center article.

2. Share

Sharing is a critical functionality and you should be using it! If you create an awesome dashboard you should share it with your teammates so that they can take advantage of it too; and maybe your manager should have access to it so that they can see how well you are doing your job :-) You will see that there are currently two ways of sharing a dashboard (to see them, click the share button just below the dashboard title):

  • Share a Dashboard: the dashboard will be available to all other users in that profile / view.
  • Share a Template Link: this will generate a URL you can send to other users.

If you have a great dashboard please share the link in the comments!

3. Email

Oftentimes you will need to send the dashboard as a PDF file instead of sharing it. This might be the case if the person who needs it is less comfortable with Google Analytics, or if you would prefer to send it to yourself as a reminder every day, week, month or quarter. In this case you would use the email functionality. You can configure the following for your email:

  • Destination: enter any emails that should receive the dashboard.
  • Subject: the subject of the email.
  • Frequency: can be once, daily, weekly, monthly, quarterly.
  • Day of the week/month: for weekly and monthly emails you will be able to choose the day of the week/month that the dashboard will be sent to you.
  • Advanced Options - Active: allows you to set an "expiration date", i.e. you can schedule this email for a period of x months, after which it will stop automatically.
  • Content: you can write any content to be added to the email with the file.

4. Export

This option enables you to download a PDF version of the Dashboard into your computer instantly.

5. Customize Dashboard

I love this! It allows you to decide the layout of the dashboard. So if you want some of the graphs to be larger than others you can use different layouts. Below are the options available to you.

Google Analytics dashboard layouts

Closing Thoughts

Dashboards are a great way to keep track of your website. They can save a lot of time in your (and your colleagues') day-to-day work, so it is certainly worth investing time to get the right data in there.

And here is some wise advice for making the most of Marketing dashboards:

"The number one thing in making your dashboards great is to make sure you know its purpose and that you are very focused. Your dashboards should be business changing and action inducing."


Universal Analytics Form & The Measurement Protocol


A few months ago the Google Analytics team launched the Measurement Protocol, a powerful way for developers to send requests to Google Analytics from anywhere. This brings measurement to the next level, from websites and Apps to any customer touch point.

In this article I present the Universal Analytics Form, a solution I developed with my colleague Eduardo Cereto Carvalho, Web Analytics Specialist at Google. The form is an easy-to-use solution to implement the Measurement Protocol to upgrade your current tracking capabilities. Basically, we will use a Google Form and an App Script in order to send requests to Google Analytics using the Measurement Protocol. Sounds cool, doesn’t it? I think it does!

Use Cases

Here are a few cases where you might want to use this solution.

Integrating Offline Transactions into Google Analytics

Suppose you distribute coupons for in-store purchases on your website, and suppose you make some investment in online advertising. If that's the case, you should be really eager to understand how much store revenue your ad spending brought in.

Using this solution, your store cashiers will be able to use a Google Form to enter the coupon number and other information into Google Analytics. This way you will be able to link the offline spending to acquisition channels (and other info) for any customer. It is extremely important to keep in mind that the Google Analytics Terms of Service strictly forbid adding personally identifiable information (PII) to your data collection, so make sure not to add information like name, surname, username, social security number, etc. (see section 7 of the link above).

While it is not very difficult to add a coupon ID to the form and get information about the purchase (as you will see in the video below), it might be a bit more challenging to have an ID that can link the offline to online information. The approach I would recommend is to use the Client ID command to retrieve a user Client ID and add it to the coupon. This way, the cashier will be able to fill it in the form too and the cycle will be closed.
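To make the Client ID approach concrete, here is a minimal sketch of stamping the Client ID into a printable coupon code so the cashier can key it back in via the Google Form. The `tracker.get('clientId')` call shown in the comment is the documented analytics.js API; the `makeCoupon` helper and the coupon format are hypothetical.

```javascript
// Sketch: embed the GA Client ID in a coupon code for offline-to-online linking.
// In the browser, the Client ID is retrieved with the documented analytics.js call:
//   ga(function (tracker) { var cid = tracker.get('clientId'); });
// makeCoupon below is a hypothetical helper; the format is illustrative.
function makeCoupon(clientId) {
  // Replace the dot so the code is unambiguous when keyed in manually.
  return 'SAVE10-' + clientId.replace('.', '-');
}

var coupon = makeCoupon('1033501218.1368477899');
// coupon === 'SAVE10-1033501218-1368477899'
```

The cashier then enters this coupon code in the form, and the Client ID portion closes the loop between the online session and the in-store transaction.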

Integrating Call Center into Google Analytics

Integrating call center data into your analytics data would be somewhat similar to the approach described above. However, it would require an additional step: during a call to complete a transaction, the call center will need an ID number in order to link the transaction to online activity. This can be done by asking the visitor to click a button on the site that retrieves the Client ID from Google Analytics (see link above) and shows a popup with that number.

Integrating anything into Google Analytics

Logging expense details into Google Analytics may also be a useful (and interesting) activity. Using this method, you can use a form to log all transactions you make during the day into Google Analytics. Then you can analyze your personal and/or business finances using Google Analytics' powerful reporting and analysis UI. Remember: no PII.

Implementations Details

Below is a video explaining the method proposed in this article. In order to make the explanation clearer, we used the example of a website that advertises online and directs visitors to a page where a discount coupon is offered (for purchases in a brick-and-mortar store).

Here are the steps required in order to implement the Universal Analytics solution (as explained in the video):

  1. Think profoundly about what information you need to collect in order to measure your goals. Read through the Measurement Protocol parameter reference to learn more about all the options available to you.
  2. Create a Google form and add one question to collect each data point that you will need. This will define which data will be collected and where it will be displayed on your Google Analytics reports.
  3. Copy the Script below and paste it into your Script Editor on Google Forms (find it on the "Tools" menu). Click Save.
  4. Edit the Script to match your data collection details (read the comments we left on the Script below for you). Click Save.
  5. Add a trigger for the Script to run every time a form is submitted. On the Script page, click on Resources and then on All your triggers. Click on "No triggers set up. Click here to add one now" and then click Save for the default trigger. You will need to provide permissions for the App; click OK.
  6. Set up Custom Dimensions on Google Analytics interface to work with the dimensions you added (if you decide to use Custom Dimensions). It might be helpful for you to read this article explaining what are custom dimensions.
  7. Make sure the relevant people fill the form every time data becomes available.
  8. Login to Google Analytics and have fun.

The Script

Below is the Script that you will need to copy to your Form. We tried to add as many comments as possible to guide you, but feel free to ask questions in the comments below if you have any.

var GA_TRACKING_ID = 'UA-xxxxxxxx-y';

// Maps each form question (by its zero-based index) to a Measurement Protocol parameter.
var data_mapping = {
  0: 'cid',   // User ID
  1: 'tr',    // Transaction Revenue
  2: 'ta'     // Transaction Affiliation
};

function trackForm(e) {
  var data = [],
      item,
      res = e.response.getItemResponses();
 
 
  for (var i = 0; i < res.length; i++) {
    item = res[i].getItem();
    if(data_mapping[item.getIndex()]) {
      data.push([
        data_mapping[item.getIndex()],
        res[i].getResponse()
      ]);
    }
  }

 
  data.push(

    ['tid', GA_TRACKING_ID],   // Uses the ID you provided in the beginning of the Script.

    ['v'  , '1'],   // The protocol version.

//  ['cid', Math.floor(Math.random()*10E7)],   // Remove the two slashes at the start of this line if you don't have a User ID; this will set a random value.

    ['t'  , 'transaction'],   // The Hit type, must be one of the following: 'pageview', 'appview', 'event', 'transaction', 'item', 'social', 'exception', 'timing'.

    ['ti', Math.floor(Math.random()*10E7)],   // Assigns a random value to the Transaction ID.

    ['z'  , Math.floor(Math.random()*10E7)]   // Cache Buster.

  );
 
  var payload = data.map(function(el){ return el[0] + '=' + encodeURIComponent(el[1]); }).join('&');   // URL-encode values so spaces and ampersands don't break the hit.
 
  var options = {
    'method' : 'post',
    'contentType' : 'application/x-www-form-urlencoded',   // the payload is URL-encoded key=value pairs
    'payload' : payload
  };
 
  UrlFetchApp.fetch('https://www.google-analytics.com/collect', options);
}
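To make the request format concrete, here is a sketch of roughly what trackForm() assembles for a sample submission. All values below are made up for illustration; on a real form, 'cid', 'tr' and 'ta' come from the answers, and 'ti' and 'z' are the random numbers the script generates.

```javascript
// Illustrative key/value pairs, in the same order the script pushes them.
var data = [
  ['cid', '5558'],          // User ID (from the form)
  ['tr',  '25.90'],         // Transaction Revenue (from the form)
  ['ta',  'StoreCoupon'],   // Transaction Affiliation (from the form)
  ['tid', 'UA-xxxxxxxx-y'], // Your property ID
  ['v',   '1'],             // Protocol version
  ['t',   'transaction'],   // Hit type
  ['ti',  '4815162342'],    // Transaction ID (random in the script)
  ['z',   '4223']           // Cache buster
];

// Join each pair with '=' and the pairs with '&' to form the POST body.
var payload = data.map(function (el) { return el.join('='); }).join('&');
// payload:
// "cid=5558&tr=25.90&ta=StoreCoupon&tid=UA-xxxxxxxx-y&v=1&t=transaction&ti=4815162342&z=4223"
```

This is exactly the kind of body the Measurement Protocol's /collect endpoint expects in a POST request.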

Happy Analyzing! And let us know if you have any comments, questions or suggestions. Also, we would love to know how you are using the Form; leave a comment below.

Related Content

  1. Google Universal Analytics For Offline Behavior
  2. Google Universal Analytics: A User-Centric Approach
  3. Collect & Process Data Wisely [cartoon]

Google Analytics Custom Dimensions - 5 Questions

Google Analytics Custom Dimensions

Creating audience segments is one of the most important things you can do with a web analytics tool. When your segmentation rules do a good job of highlighting the different motivations of your site visitors, the behavior of each segment can help you make your site more relevant to each group, and thus increase the likelihood that your site will deliver on your goals.

Google Analytics already lets you segment on more than 130 default dimensions, so it can seem a little intimidating to consider adding even more breakouts with custom dimensions – additional attributes of a user, session, or action that you collect via your tracking code.

And the opportunity gets bigger with Universal Analytics: if you use that newer version of the platform underlying Google Analytics, you can now track up to 20 custom dimensions (or 200 if you're using Google Analytics Premium with Universal Analytics). So even if you've mastered the concept of Custom Variables in "classic" Google Analytics, when you plan a migration to Universal Analytics, you suddenly have to consider four times as many custom fields!

But don't get paralyzed by considering every fancy data point you could track. Ultimately, you want custom dimensions to give you new perspectives on which visitors and elements are driving results on your site. With that in mind, take a step back and consider these 5 questions.

1. Which fields can you use to join web analytics data with offline data?

A big dream of the Universal Analytics platform is to unite online behavioral data with other data sources in your organization, and setting up custom dimensions related to form fields is a great way to make those connections. Most CRM records begin with web form data, so collecting the same (non-personally-identifiable) form fields in your CRM and your web analytics tool can provide a lot of shared data points to help join up your data.

For example, higher education admissions departments commonly analyze offline data to predict a given applicant's likelihood to enroll or graduate based on attributes like previous education, program of interest, and years of work experience.

Since those data points are also often collected in inquiry forms, those schools could collect those attributes as custom dimensions in Google Analytics, then break out their standard acquisition, behavior, and conversion reports by previous education, program of interest, and experience to compare the browsing history of the most promising visitors. Does the same traffic source attract the likely superstars and the less-promising candidates in equal amounts? And how much research does each type of prospect do before submitting an application? Without using custom dimensions to connect applicant data with browsing behavior, you may never know.

And don't forget about the low-hanging fruit: ZIP codes are commonly collected on forms and are easily actionable (by matching to offline spending, for example). But they aren't collected by default in Google Analytics, so capture them via a custom dimension if you're using a form.
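As a sketch of how capturing a ZIP code might look with Universal Analytics (analytics.js): the dimension index, field name and helper function below are assumptions for illustration, and the dimension itself must first be registered in the Admin interface. The first two lines are a minimal stand-in for the analytics.js command queue so the sketch is self-contained; on a real page the GA snippet defines window.ga for you.

```javascript
// Stand-in for the analytics.js command queue (defined by the real GA snippet).
var gaQueue = [];
function ga() { gaQueue.push([].slice.call(arguments)); }

// Hypothetical helper: read the ZIP field when the form is submitted and
// attach it to the event hit that records the submission.
// dimension1 = "ZIP Code" (assumed index).
function trackZipOnSubmit(zipValue) {
  ga('send', 'event', 'form', 'submit', {
    dimension1: zipValue
  });
}

trackZipOnSubmit('94043');
```

Because the value rides along on the submission hit, the same ZIP the visitor typed into your CRM-bound form now exists in both systems, ready to be joined.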

2. Which content attributes can't be implied from URLs and page titles?

Logical URL structures and title tags aren't just good for user experience – their appearance in Google Analytics can also tell you a lot about user intention. For a basic example, people who visit a URL containing "phones" are probably interested in phones (to paraphrase a famous ad by Google). Since you could create an advanced segment of "phone interested" users who ever viewed a URL with phones, you might not want to use up one of your precious custom dimensions to capture that user interest in a redundant field. However, if an important content attribute is on a page but not in your URL, a hit-level custom dimension can be really valuable.

For instance, the URL for an e-commerce product page about a particular phone would probably contain the word "phone" as well as the brand and product ID, but it might not identify whether the phone is available for purchase online only, offline only, or both. This factor could clearly affect user actions like adding the item to an online shopping cart or looking up offline store locations, but product availability may change often and without warning, and it can be a nightmare to match parsed page data with a promotional history database.

As long as the product availability information is contained in a consistent page element that you could access with JavaScript, custom dimensions are the way to go here. Configure your Google Analytics tracking code to set that value for the "product availability" custom dimension, then send that data along with the pageview hit to Google Analytics.
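A sketch of that approach, assuming availability is exposed in a page element you can read with JavaScript; the dimension index, element shape and '(not set)' fallback are all assumptions for illustration. The queue stub stands in for the real analytics.js snippet so the example runs on its own.

```javascript
// Stand-in for the analytics.js command queue (defined by the real GA snippet).
var gaQueue = [];
function ga() { gaQueue.push([].slice.call(arguments)); }

// Read availability from a consistent page element, e.g.
// <span id="availability" data-availability="online-only">.
function readAvailability(el) {
  return el && el.availability ? el.availability : '(not set)';
}

// Stands in for the DOM lookup on a real product page.
var pageElement = { availability: 'online-only' };

// dimension3 = "Product Availability" (assumed index), hit scope:
// set it before the pageview so it rides along on that hit.
ga('set', 'dimension3', readAvailability(pageElement));
ga('send', 'pageview');
```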

3. How will the values of each custom dimension actually appear in reports?

You can't re-process the values of a custom dimension if they've been collected in an unexpected or messy way, so before you deploy any tracking code for your custom dimensions, you should obsessively specify and sanity-check the potential values that the code will return.

  • If you're collecting hard-coded values, be sure that conventions for spelling and delimiters are what you want to see in reports. (Basic, but still overlooked all the time).
  • If you're collecting data from open-text form fields, be sure that those fields are only being sent to Google Analytics once: after validation and on form submit.
  • Always be confident that the variable won't collect personally identifiable information about a user.
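One way to enforce a single spelling and delimiter convention is to pass every hard-coded value through a small normalizer before it reaches GA; this helper is a sketch, not part of any GA library.

```javascript
// Normalize values before they ever reach GA, since collected dimension
// values cannot be re-processed later.
function normalizeDimensionValue(raw) {
  return String(raw)
    .trim()                 // strip stray whitespace
    .toLowerCase()          // one spelling convention
    .replace(/\s+/g, '-');  // one delimiter convention
}

// "  PHONES " and "phones" end up in the same report row:
normalizeDimensionValue('  PHONES ');    // "phones"
normalizeDimensionValue('Smart Phones'); // "smart-phones"
```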

4. Which other data points should be connected to the latest value of each custom dimension?

Whenever you send a new value for a particular custom dimension to Google Analytics, the new value replaces the previous value, but "scope" determines how many other data points in a visitor's history are associated with that value. Sometimes it's a straightforward decision, as in these examples:

  • If visitors submit their year of birth on a form, it's logical to continue to associate that value with all of their activity going forward, across all events, pageviews, and visits, at the "user" level.
  • The weather in a particular geographic area is likely to be consistent from click to click, but not necessarily from visit to visit. If you're tracking the weather in a visitor's city (as in this amazing implementation from Elisa DBI), it makes sense to set that dimension's scope to the "session" level.
  • A price range should only be associated with hits to the relevant product pages, so that dimension should be scoped to the "hit" level.

Sometimes, the relevant scope isn't as clear. For example, logged-in status could be worth collecting with three different custom dimensions, one for each scope:

  • User scope: at a high level, knowing whether a visitor has ever logged in (not necessarily whether they were logged in during a particular visit) indicates that visitor's depth of engagement with the site, beyond visit count.
  • Session scope: can be used to calculate how often users are logging in.
  • Hit scope: can be used to diagnose how the logged-in status affects granular actions during a visit. For example, did the users see personalized elements when they added to cart? Did they see gated content when they exited the site?
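A sketch of how that triple collection could look in analytics.js; the indices (4, 5, 6) are assumptions, and the scope of each one is configured in the Admin UI, not in code. The value set is identical for all three; scope alone decides how far each copy is carried.

```javascript
// Stand-in for the analytics.js command queue (defined by the real GA snippet).
var gaQueue = [];
function ga() { gaQueue.push([].slice.call(arguments)); }

// Assumed indices: 4 = user scope ("has ever logged in"),
// 5 = session scope ("logged in this session"), 6 = hit scope.
function trackLogin() {
  ga('set', 'dimension4', 'yes');
  ga('set', 'dimension5', 'yes');
  ga('set', 'dimension6', 'yes');
  ga('send', 'event', 'account', 'login'); // dimensions ride along on this hit
}

trackLogin();
```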

5. Will this custom dimension actually drive any changes on your site?

Don't collect data just because you can. Begin with the end in mind, and collect data that will help you take action. Google gives you so many dimensions to choose from, and so many opportunities for customizing your own, that you really have to stay focused on what makes sense to track.

Bottom line: Always ask yourself how tracking a particular custom dimension will make your analysis easier and make your site better.


Google Analytics Demographics & Interests Reports

Google Analytics Demographics & Interests Reports

Google Analytics (GA) has always been a great tool when it comes to understanding and optimizing online behavior. But with the addition of Demographics and Interests information as a first class citizen in the tool it brought a new level of insights into it.

This feature consists of a series of reports where we can see behavior information relating to visitor age, gender and interests; but even more importantly, this data can also be used to segment standard reports and create remarketing lists. [Please note that this feature is still not available to all users.]

Below I will discuss how to enable the reports, use the new dimensions to understand customer behavior, and optimize your website experience based on it. In the last section I will also go over some technicalities on how the data is collected and its accuracy.

Setting Up The Demographics & Interests Reports

In order to get the demographics and interest data into your GA account you will need to perform the following steps:

  1. Update your Analytics tracking code and Privacy Policy to support Display Advertising (instructions).
  2. Enable Demographics reports in the reporting interface: go to "Audience">"Demographics Overview" and you will find an "enable" button in there.
  3. Enable Demographics and Interests Reports in the Admin interface: click on Admin (top-right orange navigation), then on "Property Settings", then on the checkbox below the Demographics and Interests section.

Please note that if you use Google Tag Manager, you should select "Add Display Advertiser Support" in your Google Analytics tag template; and if you are using a 3rd party tag management tool Google Analytics might not be able to validate your code, but you should be able to skip validation and the reports will work.

Once you perform the steps above it might still take a few days until you can see data populating your reports.
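If you are on Universal Analytics (analytics.js), the tracking-code change in step 1 boils down to requiring the display-features plugin before the pageview is sent (classic ga.js uses a different mechanism; follow the instructions linked above). The queue stub below stands in for the real GA snippet so the sketch is self-contained, and the property ID is a placeholder.

```javascript
// Stand-in for the analytics.js command queue (defined by the real GA snippet).
var gaQueue = [];
function ga() { gaQueue.push([].slice.call(arguments)); }

ga('create', 'UA-XXXXX-Y', 'auto'); // your property ID
ga('require', 'displayfeatures');   // enables Demographics & Interests collection
ga('send', 'pageview');
```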

Age, Gender And Interests - Standard Reports

Google Analytics Demographic Report

In the image above we see one of the standard demographics reports: Age. There are 4 standard reports included in this new capability: Age, Gender, Affinity and Other Categories. As with most reports, you can choose which metrics group you want to use for your analysis (Site Usage, Goal Set 1, Goal Set 2... Ecommerce), which visualization you want to use in the tables (pies, bars, comparison, and pivot) and pick additional secondary dimensions. Here is a quick summary of each report:

  1. Age: breakdown of users by age.
  2. Gender: breakdown of users by gender.
  3. Affinity Categories: categorizes visitors taking into account their lifestyles (technophiles, music lovers, gamers, etc.). See below for information on how Google finds this information.
  4. Other Categories: categorizes visitors based on the specific content they consume, along with how recently and frequently they consume that content.

Just looking at the reports above will be mind-blowing, believe me. Suddenly you will be able to learn so much about who visits your site and how they behave there... just wait till you start doing analyses! Below is a quick analysis I recommend as a first taster: look how we can instantly learn which age groups are terribly failing! In fact, we can learn which age groups we are terribly failing to persuade!

Age Comparison on Google Analytics

Here is how you reach this report:

  1. Find your age report
  2. Choose the metric group you want to use above your chart (Site Usage, Goal Set 1, Goal Set 2... Ecommerce)
  3. On the top right corner of your table, choose the comparison icon
  4. Choose the metrics you want to use from the drop down in the top right corner of your table.

In a few words, all our analyses (both using the standard reports above and the advanced techniques below) will try to uncover the two most important segments in our website: high revenue but low visits (high potential to bring more valuable visitors) and low revenue but high visits (high potential to allocate budget away from them).

Three Optimization Techniques Based On Age, Gender And Interests

Once you get used to the standard reports and overcome the initial euphoria of getting to know your customers so much better, you should roll up your sleeves and start analyzing.

1. Use Demographics And Interests For Customer Segmentation

The first step will undoubtedly be to create one (or several!) Segments. While the standard reports allow you to understand how different ages (or gender, affinity...) are performing, using a Segment will enable you to merge this data with other interesting dimensions such as campaigns, country or content.

To start, create a segment by clicking on the down arrow in any report. This arrow is shown just below the title of reports (see orange arrow below). Once you click on the arrow, click on "+ Create New Segment" and you will see the following screen.

Demographic Segmentation

You will be able to create segments using the demographic information we saw above. It is important to perform a few analyses as the one showed in the screenshot above (comparison chart), it will provide a good guidance on which segments to build. Here are some segment ideas for you to try:

  1. Male and Female: you should create both a male-only and a female-only segment. Once you do, apply both of them and visit your campaign report. Click on the conversions tab (Ecommerce or Goals). Now you will be able to see which campaigns have a good conversion rate for males and for females. This can provide important insights into ways to optimize your landing pages for each audience. Also take a look at which countries are performing well for each gender, there might be some cultural forces in play.
  2. Age Groups: looking at the comparison chart screenshot above we clearly see that the website is doing extremely well for visitors between 25 and 44 years old. Why is that? Maybe your display campaigns are being shown on "young websites"? Or maybe your content is too "cool" for older people? Or maybe the images are just not right? By creating a segment containing 44+ visitors you can understand better their abandonment points and where they are coming from.
  3. Affinity Categories: first go to the Affinity Categories standard report and check which of them have a high conversion rate. Then create a segment containing these categories, and visit the content report (direct link to report). This will show you which website content is the most important to your most valuable customers and it will help you prioritize content creation.

2. Use Demographics And Interests Data For Smart Remarketing

Since the data mentioned above is also available on Google Analytics' remarketing feature, you will be able to use the insights discovered above when creating your lists.

So, for example, if you discover that the Music Lovers category for ages 25-34 is underperforming significantly, you could create a remarketing list for those people; using this list you can create a campaign to reach out to them on other websites in the Google Display Network with a special "Musical" offer. Here is a guide to creating Remarketing lists.

3. Use Demographics And Interests To Analyze A/B Tests

One of the most actionable features in Google Analytics is Content Experiments, as it enables marketers to actually experiment with their websites to create better experiences. And one of its strengths over other tools is that it allows us to use data already collected by Google Analytics to analyze website testing results; this is done using the segmentation tool mentioned above.

Let's imagine a simple example. You are A/B Testing a page to see which creative works better: a family image, a couple image or a baby image. The overall test result shows you that the couple image would increase overall conversion rates by 30%. However, using the Age segment described above, you might see that the results vary significantly among different age groups. And you might find out that the best creative for 18-24 is couples, 25-34 is babies and 35-44 is families. Wow!

Data Source And Accuracy

As mentioned above, in order to get access to this data, you will need to update your Analytics tracking code and Privacy Policy to support Display Advertising. The reason is that the new reports are derived from the DoubleClick third-party cookie. According to the Adwords Help Center, here is how this information is determined by Google:

When someone visits a website that has partnered with the Google Display Network, Google stores a number in their browsers (using a "cookie") to remember their visits. This number uniquely identifies a web browser on a specific computer, not a specific person. Browsers may be associated with a demographic category, such as gender or age range, based on the sites that were visited.

In addition, some sites might provide us with demographic information that people share on certain websites, such as social networking sites. We may also use demographics derived from Google profiles.

It is extremely important to know that this data is not available for every single user, so the reports will usually be based on a subset of users. In addition, some data in the reports may be removed when thresholds are applied to prevent inferring the identity of an individual user. You can learn more about how data is subject to thresholds in the reports.

Closing Thoughts

Most marketers have been in the dark when it comes to understanding their website visitors' demographics. Until now, the options for merging online behavior with demographics for a specific website have been extremely limited.

With Google Analytics Demographics & Interests reports we can now optimize website experiences based on our visitors in a much deeper way. How will you use it? Please let us know in the comments!


Powerful Custom Dimensions in Universal Analytics

Custom Dimensions in Universal Analytics

Every now and then, your business requirements demand some metric or dimension that is not available out-of-the box in your analytics solution. Luckily, this is becoming less common as measurement systems evolve. With product developments like Universal Analytics, we have seen huge improvements in areas such as user-level segmentation. New technologies enable us to track and analyze user behavior in unprecedented ways.

This has also resulted in a decreased need for custom code implementation. With the shift to server-side data handling in Universal Analytics comes the ability to perform user-level segmentation directly in the Google Analytics interface. In fact, the new segmentation UI is so flexible and powerful that, before implementing any new custom code, you should think hard about whether your business requirement can already be satisfied by creating segments based on existing data.

Example: While in the past you needed to set user-level custom dimensions (previously custom variables) to ensure metadata would persist across sessions, you can now construct multi-visit conditions using the segmentation UI, rendering many custom dimension use cases somewhat obsolete. You may leverage 'Date of First Visit' to create cohorts, and user-level 'Conditions' and 'Sequences' to create buckets of visitors based on interactions.

Capturing First Time and Repeat interactions

However, there are still some gaps to fill for those of us interested in pushing the product to its limits by implementing custom code (yay!). While 'Date of First Visit' will show us when users visited our website or launched our application for the first time, enabling us to create cohorts, there are several other "firsts" which could be equally interesting to us. Additionally, there are many use cases for tracking the total number of various interactions per user. In this post, I will focus on what we may call First time and Repeat dimensions, using a mobile app as an example.

  • First timers: dimensions showing the date a user interacted with something for the first time, e.g. the date of a first purchase. This will expand our analysis beyond "Date of First Visit" to include the first of virtually anything.
  • Repeats: dimensions showing how many times a user has interacted with something, e.g. the number of purchases per user. This allows us to create additional user-level buckets based on recurrences.

These dimensions have tons of use cases and will enable us to answer questions such as:

  • How many users purchased product Y for the first time in a specific month?
  • How are users who performed X interaction at month Z behaving over time?
  • When did user-bucket A interact with content B for the first time?

This in turn cascades into a great number of sub-questions which we will now be empowered to answer.

Before moving on to the actual code and set up, let's look at some example segments you will be able to create through this implementation (after all, we want to make sure this kind of analysis is useful to your business before digging into the necessary code and configuration).

Example Segments

By measuring first time interactions as well as the total number of interactions per user, we open up some really powerful analysis. Imagine the granularity you can reach with these types of segments available to you.

Show conversion rate for a given time period for users who made their first purchase in May

Conversion Rate segment

Show users who have made between 5-9 purchases in total, and made their first purchase during the first week of July

Repeated purchase segment

Show users who interacted socially for the first time on 4th of July

Social interaction segment

Show users who have made more than 10 downloads

Repeated conversion segment

The list goes on, and before implementing you should ask yourself: what additional segments would benefit my business analysis? Could I construct such segments with these First time and Repeat dimensions? If the answer is yes, let's get to it.

First Time and Repeat Dimensions: A Mobile App Implementation

In this post I will provide an example of how to implement this approach in an Android app using the Google Analytics Android SDK (v3), which is built on top of the Universal Analytics platform (the methodology will therefore follow that of the new wire format). From an app development perspective, this will not entail any overly complicated logic or difficult client-side storage. In contrast, it is powerful in its simplicity.

At the root of this implementation lies the use of custom dimensions. Namely, we want to set the following two user-level custom dimensions:

  1. First {Interaction} Date
  2. Number of {Interactions}

For those unfamiliar, custom dimensions are just like the default dimensions available in Google Analytics, except you create them yourself, giving you the option to collect additional data that is not provided automatically.

Let's use in-app purchases as an example. For each user, we want to capture the 'First Purchase Date' as well as 'Number of Purchases'. However, the implementation can easily be extended to any type of interaction in the app: an event being triggered, particular content being viewed, etc.

Part A: Interface Configuration

The first step in getting started with custom dimensions is to create them in the GA interface (step-by-step instructions are available here). In the Admin section, we specify the Name and Scope of our new custom dimensions. Both of these should be set to a user-level as we want them to persist across sessions and be associated with all hits for our users.

Set up custom dimensions

In Universal Analytics, Custom Variables have been replaced by Custom Dimensions. These are very similar in concept, with the main difference that Custom Dimensions are considered "first class" dimensions in reports. Without going into too much detail here, this means that you set up the Custom Dimension, in terms of Name and Scope, in the GA interface. In your code implementation, you will only need to specify the Value and the corresponding Index. This means less bytes in the requests to GA servers, and more flexibility for you to rename or change the Scope of your Custom Dimensions in the future.

Part B: Code Implementation

Let's move on to the code. Step-by-step, this is what we need to do:

  1. Get an instance of Google Analytics and Calendar
  2. When a user makes a purchase, determine whether it is the first time
  3. If yes, set custom dimension 1 and 2. If no, simply update custom dimension 2
  4. Update the user's total number of purchases client-side for future reference

Step 1: Get an instance of Google Analytics and Calendar

Before we can use any of its methods, we need to get an instance of Google Analytics and a reference to our Tracker object. The Tracker object is responsible for tracking a particular tracking ID (UA-XXXX-Y), telling Google Analytics which property to send data to. We also need to get an instance of the Calendar class, which we will use to pass date values to our custom dimensions.

For simplicity, we will do this in onCreate of a main entry point of our app. During initialization we could also set custom configurations for the Tracker, such as modifying the dispatch period, anonymize IP, etc. (alternatively, we could use EasyTracker for our configurations, or initialize Google Analytics in a subclass of Application, to create a singleton that can be used throughout our app).

Step 2: Is the User Making a Purchase For the First Time?

We need some way of determining whether a user making a purchase in our app is doing so for the first time. To that end, we can easily utilize SharedPreferences, a class which allows you to save and retrieve persistent key-value pairs of primitive data types. We want to save a number in SharedPreferences called "numPurchases". We will set this number after the first purchase, with the value of 1. This means that "numPurchases" will only be present if a user has already made a purchase. By checking whether "numPurchases" exists or not when a user makes a purchase, we can determine whether the user is making a purchase for the first time.

Step 3: Set Custom Dimensions

If the user is making an in-app purchase for the first time (i.e. "numPurchases" equals zero), we're going to set custom dimension 1, "First Purchase Date". We will use our Calendar class to pass the current date as the value of the custom dimension in the format day/month/year. Additionally, we will set our second custom dimension, "Number of Purchases", with the value of 1.

If it is not the first time a user makes an in-app purchase, we will instead increment "numPurchases" by 1 and pass that value to our second custom dimension, "Number of purchases".

In both cases, we're attaching the custom dimensions to an appview hit (to send the data for processing, custom dimensions always need to be set prior to a tracking call). For an in-app purchase like this, you might instead want to use ecommerce measurement hits. For the purpose of this example, however, we're sticking with a simple appview. The screen name in the appview hit ("purchase") could be used as a Goal screen in Google Analytics.

Step 4: Update the number of purchases

Finally, we need to update our "numPurchases" number and save it in SharedPreferences. This way, the next time a user makes a purchase, "numPurchases" will reflect an updated value corresponding to that user's total number of purchases.

We're done! You may find the complete script below.

Complete Script: First Time and Repeat Dimensions In a Mobile App

public class ExampleActivity extends Activity {

    private Tracker gaTracker;
    private Integer mYear;
    private Integer mMonth;
    private Integer mDay;
    private SharedPreferences mPrefs;
    final String numPurchasesPref = "numPurchases";

    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);

        // get instance of GoogleAnalytics and a reference to Tracker
        gaTracker = GoogleAnalytics.getInstance(this).getTracker("UA-XXXXX-Y");  // replace XXXXX-Y with your own tracker ID

        // get instance of Calendar and relevant fields
        final Calendar c = Calendar.getInstance();
        mYear = c.get(Calendar.YEAR);
        mMonth = c.get(Calendar.MONTH);
        mDay = c.get(Calendar.DAY_OF_MONTH);
    }

    public void onPurchase(View v) {
        mPrefs = PreferenceManager.getDefaultSharedPreferences(this);
        int numPurchases = mPrefs.getInt(numPurchasesPref, 0);

        if (numPurchases == 0) {
            // First purchase: set both "First Purchase Date" and "Number of Purchases"
            gaTracker.send(MapBuilder
                .createAppView()
                .set(Fields.SCREEN_NAME, "purchase")
                .set(Fields.customDimension(1), mDay + "/" + (mMonth + 1) + "/" + mYear)  // Calendar.MONTH is zero-based
                .set(Fields.customDimension(2), "1")
                .build());
        } else {
            // Repeat purchase: only update "Number of Purchases"
            gaTracker.send(MapBuilder
                .createAppView()
                .set(Fields.SCREEN_NAME, "purchase")
                .set(Fields.customDimension(2), String.valueOf(numPurchases + 1))
                .build());
        }

        // update and save numPurchases
        int numPurchasesUpdate = numPurchases + 1;
        SharedPreferences.Editor editor = mPrefs.edit();
        editor.putInt(numPurchasesPref, numPurchasesUpdate);
        editor.commit();
     }

}

What do you think? I would love to hear your comments as well as other use cases for this implementation.

Related Content

  1. Google Universal Analytics For Offline Behavior
  2. Google Analytics Custom Dimensions - 5 Questions
  3. Universal Analytics Form & The Measurement Protocol

Mobile App Analytics Insights

Mobile App Analytics

The mobile app was the first non-web platform to be built into the Google Analytics suite as a first class citizen; it was really a response to the shift towards smartphones and tablets we have seen in the last six years or so.

While in the past we had to hack together our own solutions to track mobile apps (and they would still end up looking like websites in reports), the new Google Analytics Mobile App Analytics easily allows us to track user interaction within our apps, with app-centric data surfaced in reports. Gone is the web-centric approach: mobile app developers will be thrilled to find that reports now reflect data that makes sense to them as a business.

App Analytics Insights

Data is great, but the purpose of Analytics is to get insights so that we can take action. In this post, I will cover some sample reports from the new app tracking interface and how they can help us make decisions for our business.

Launch: How Many People Are Using Our App?

So let's assume we have just released a new version of a 'Flight Search' app. We received that email notifying us that it has been approved on a marketplace (finally), and we're up and running. Or, perhaps, we have just launched some new marketing campaign targeted to specific regions to get people to install our app. It will be valuable for us to be able to monitor, in real time, our success in acquiring users, where they are coming from, and how they engage with our content.

This is where the Real Time report comes in handy. Within seconds of user interaction, data gets surfaced in the report, allowing us to take action quickly on the information available. We will be able to see the number of users by app version, the screens they are active on, as well as their geographical location (down to the city level).

App Analytics Dashboard

Measure & Iterate: How Do We Optimize User Experience?

Ok, we are acquiring users and we are engaging with them in a variety of ways. It will now be essential for us to understand how users interact with and move through the app. We want to be able to answer questions such as:

  • Do we have any bottlenecks?
  • Where are users leaving the app?
  • How do users behave differently depending on operating system, language, etc.?

The Behavior Flow report will show us how users flow through the app: where they drop off, how they reach specific screens, in what order. For example, in the report for our 'Flight Search' app below, it seems we have some significant drop-off rates (the red) on some of our main screens.

We might want to slice and dice this data a bit to see if we can narrow down the problem to particular operating systems, app versions, user types, etc. This sort of analysis empowers us to iterate and improve the user experience.

Analytics Behavior Flow

Measure & Iterate: How Do We Prioritize Issues?

Our app is live on the app marketplace, but user reviews indicate that it has some technical issues: it seems to be crashing occasionally. Realistically, bugs are an uncomfortable fact of life for developers, and we are probably always going to experience them to some extent. The question we seek to answer, then, is how do we prioritize? Where should we allocate our resources?

The Crashes and Exceptions report will give us some key data which will bring us closer to answering these questions. It will highlight information such as:

  • What is the exception?
  • What is the impact of the exception?
  • Where/when is it occurring within our app?
  • On what devices, operating systems, app versions?

Looking at the report below, it appears we fixed one bug when upgrading to 2.1, but introduced another. It will be important to segment this data to try to retrieve more information about the exceptions so that we can take appropriate action. For example, perhaps the bug only exists on Android, but not iOS. Consequently, our Android team has some work ahead of them. Our iOS team gets to go to the pub early.

App Analytics Exceptions Crashes

We distinguish between crashes and exceptions in reports. Exceptions are the superset of crashes, because not all exceptions are fatal. In our report above, however, we only have fatal exceptions.

Evolve: What OS/Features Should We Focus On?

As we look to develop our app further, new challenges arise, and we need to know how to allocate resources within our company. What operating systems should we focus on? What app features? In other words, where do we get the most bang for our buck?

Looking at the Devices and Network Overview, we are again able to extract some key information for our 'Flight Search' app. For example, most of our Android users are on 4.0 or higher. In fact, only a small percentage of our user base is on Android 2.3 or lower. This tells us that if we were to drop support for Android OS versions below 4.0, that would affect relatively few users.

On the upside, we could instead start using those 4.0+ features, and we wouldn't have to keep supporting 2.3, which could save us some headaches. On the other hand, maybe we have so few users on Android versions below 4.0 because our app is not optimized for those users, which is something we might want to look into. It is important to approach this type of data from several angles to make the best decision for our business.

App Development Prioritization

Evolve: What Users Should We Focus On?

As our user base grows along with a richer dataset in Google Analytics, we will also be empowered to identify and take action on key segments. Behavior reports such as Session Duration and Loyalty below offer some really valuable insights.

It appears our conversion rate goes up with longer session durations and more session instances. This tells us that we should focus on retention and engagement: we want our users to keep coming back to our app, and we need to address any early fall-off points (which the Behavior Flow report above can help us with). The data indicates that repeat users and users who engage with our app for longer periods of time have a higher chance of converting. This is essential information that cascades into both our marketing and design efforts.

App Analytics user segmentation

App Analytics segmentation

Reporting: How Do We Best Present Data?

So far we have discussed out-of-the-box reporting, all populated by default by the Google Analytics SDK. However, it can be extremely useful to surface business-specific dimensions and metrics, that is, data that reflect our particular business. I have previously written about some custom dimensions use cases and how they can be leveraged to enhance reporting.

Custom dimensions are just like the default dimensions available in Google Analytics, except we can create them ourselves, giving us the option to collect additional data that is not provided automatically.

Not only do custom dimensions allow us to capture additional information about our users and content, they also enable us to create reports that will make sense internally within our company. People in our organization may be unfamiliar with the terminology and presentation of data in Google Analytics.

By creating a custom dimension representing 'Customer Tier', for example, we are able to produce segments and reports like the one below, highly relevant to our business analysis and shareable throughout our company.

App Analytics custom dimension
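To make the 'Customer Tier' idea concrete, here is a minimal plain-Java sketch of how such a dimension value might be derived before being sent with a hit. The customerTier helper and its thresholds are hypothetical, not part of the Google Analytics SDK:

```java
public class CustomerTierExample {

    // Hypothetical mapping from lifetime purchase count to a tier label.
    // The thresholds are illustrative only; use whatever fits your business.
    static String customerTier(int lifetimePurchases) {
        if (lifetimePurchases >= 20) return "gold";
        if (lifetimePurchases >= 5)  return "silver";
        return "bronze";
    }

    public static void main(String[] args) {
        // The returned string would be sent as the value of a
        // 'Customer Tier' custom dimension on the next hit.
        System.out.println(customerTier(2));   // bronze
        System.out.println(customerTier(7));   // silver
        System.out.println(customerTier(25));  // gold
    }
}
```

Because the tier is computed once and sent as a string, the same helper can feed both segments and custom reports.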

Similarly, custom metrics enable us to send custom data to Google Analytics, but custom metrics take integer values instead of strings. For our 'Flight Search' app, we might want to increment custom metrics upon searches, reservations, cancellations, etc. to be able to produce a custom report like the one below. Again, this is relevant for our particular business and will make sense internally, even for users unfamiliar with Google Analytics.

App Analytics custom metrics
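As a sketch of the idea, the increments could be accumulated in plain Java before being sent with the SDK. The metric indices below (1 = searches, 2 = reservations) are illustrative assumptions, not fixed by Google Analytics:

```java
import java.util.HashMap;
import java.util.Map;

public class CustomMetricTally {

    // Accumulates integer values per custom metric index, mirroring how
    // we might batch increments before sending them with the SDK.
    private final Map<Integer, Integer> metrics = new HashMap<>();

    void increment(int metricIndex, int by) {
        metrics.merge(metricIndex, by, Integer::sum);
    }

    int value(int metricIndex) {
        return metrics.getOrDefault(metricIndex, 0);
    }

    public static void main(String[] args) {
        CustomMetricTally tally = new CustomMetricTally();
        tally.increment(1, 1); // a search
        tally.increment(1, 1); // another search
        tally.increment(2, 1); // a reservation
        System.out.println(tally.value(1)); // 2
        System.out.println(tally.value(2)); // 1
    }
}
```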

Closing Thoughts

The Google Analytics Mobile Apps SDKs make the implementation of Google Analytics in our mobile apps easier than ever and open up some really powerful analysis. By measuring and iterating on app usage we can continuously work towards a better user experience, which will ultimately benefit our businesses. Sometimes it can be as easy as identifying an obvious bottleneck to monetization; sometimes more careful analysis will be required to reach those valuable insights.

My aim with this post has been to give you an idea of how to get started with mobile app tracking analysis. I would love to hear how you are using, or plan to use, these reports yourselves!


A Guide to Google Tag Manager for Mobile Apps


Google Tag Manager (GTM) for mobile apps was first announced in August this year and has some great implications for app developers.

Perhaps most notably, the product has the potential to overcome one of the most critical challenges in the business: pushing updates to the user base without having to publish a new version on the app marketplace. (If you are looking for information on Google Tag Manager for websites, please check out this guide.)

Typically, from the moment an app is shipped it is frozen, and from that point onwards the developer can only make changes to how the app behaves if the user accepts an update. By shipping an app with GTM implemented, configurations and values may be continuously updated by publishing new container versions through the web-based GTM interface.

In this post, we will cover how to get started with GTM for mobile apps and how to implement Universal Analytics tags using the GTM SDK for Android. As a heads up, this will occasionally get pretty technical; however, I believe it is important to understand the product from its fundamentals.

Initial Set Up

Before we get started, some initial configuration steps need to be completed. More detailed instructions on these are available in the Google Developers Getting Started guide, but in a nutshell they include:

  • Downloading and adding the GTM library to our app project
  • Ensuring our app can access the internet and the network state
  • Adding a Default container to our app project

We will hold back on that last part, adding a Default container, until we have created some basic tags and are ready to publish. We will revisit the Default container later in this post.

Create an App Container

We need to start off by creating a new container in Google Tag Manager and select Mobile Apps as the type. Typically, we will have one container for each app we manage, where the container name is descriptive of the app itself (e.g. "Scrabble App"). Take note of the container ID at the top of the interface (in the format "GTM-XXXX") as we will need it later in our implementation.

App container for mobile app

Opening a Container

Assuming we have completed the basic steps of adding the Google Tag Manager library to our project, the first thing we need to do before we start using its methods is to open our container.

Similarly to how we would load the GTM JavaScript on a webpage to access a container and its tags, in an app we need to open a container at some main app entry point before any tags can be executed or configuration values retrieved from GTM. Below is the easiest way of achieving this, as outlined on the Google Developers site:

ContainerOpener.openContainer(
    mTagManager,                     // TagManager instance.
    "GTM-XXXX",                      // Tag Manager container ID.
    OpenType.PREFER_NON_DEFAULT,     // Prefer not to get the default container, but stale is OK.
    null,                            // Timeout period. Default is 2000ms.
    new ContainerOpener.Notifier() { // Called when container loads.
      @Override
      public void containerAvailable(Container container) {
        // Handle assignment in callback to avoid blocking main thread.
        mContainer = container;
      }
    }
);

Before we talk about what this code does, let's hash out the different container types to avoid some confusion:

  • Container from network: Container with the most recent tags and configurations as currently published in the GTM interface
  • Saved container: Container saved locally on the device
  • Fresh vs. Stale container: a saved container that is less vs. more than 12 hours old
  • Default container: Container file with default configuration values manually added to the app project prior to shipping

We will talk more about the Default container later on. Back to the code. In this implementation, the ContainerOpener will return the first non-default container available. This means that we prefer to use a container from the network or a saved container, whichever is loaded first, because they are more likely to hold our most updated values. Even if the returned container is Stale it will be used, but an asynchronous network request is also made for a Fresh one. The timeout period, set as the default (2 seconds) above, specifies how long to wait before we abandon a request for a non-Default container and fall back on the Default container instead.

We may change the open type from PREFER_NON_DEFAULT to PREFER_FRESH, which means Google Tag Manager will try to retrieve a Fresh container either from the network or disk. The main difference is hence that a Stale container will not be used with PREFER_FRESH unless no other container is available or the timeout period is exceeded. We may also adjust the timeout period for both PREFER_NON_DEFAULT and PREFER_FRESH, however we should think carefully about whether longer request times negatively affect the user experience before doing so.
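To illustrate the fallback behaviour described above, here is a simplified plain-Java model of which container ends up being used. This is an illustration of the logic only, not the SDK's actual ContainerOpener implementation:

```java
public class ContainerChoice {

    enum OpenType { PREFER_NON_DEFAULT, PREFER_FRESH }

    // Models which container gets used, given what is available.
    // A simplification for illustration; the real SDK resolves this
    // asynchronously against the timeout.
    static String choose(OpenType type, boolean networkAvailable,
                         boolean savedAvailable, boolean savedIsFresh) {
        if (type == OpenType.PREFER_FRESH) {
            // Only a fresh saved container or the network will do...
            if (savedAvailable && savedIsFresh) return "saved";
            if (networkAvailable) return "network";
            // ...unless nothing better arrives before the timeout.
            if (savedAvailable) return "saved-stale";
            return "default";
        }
        // PREFER_NON_DEFAULT: any non-default container, stale is OK.
        if (savedAvailable) return savedIsFresh ? "saved" : "saved-stale";
        if (networkAvailable) return "network";
        return "default";
    }

    public static void main(String[] args) {
        System.out.println(choose(OpenType.PREFER_NON_DEFAULT, false, true, false)); // saved-stale
        System.out.println(choose(OpenType.PREFER_FRESH, true, true, false));        // network
        System.out.println(choose(OpenType.PREFER_NON_DEFAULT, false, false, false)); // default
    }
}
```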

Tag Example: Universal Analytics Tags

We have completed the initial set up and know how to access our Google Tag Manager container. Let's go through a simple example of how to track App Views (screens) within our app using Universal Analytics tags.

Step 1: Push Values to the DataLayer Map

The DataLayer map is used to communicate runtime information from the app to GTM, in which we can set up rules based on key-value pairs pushed into the DataLayer. Users of GTM for websites will recognize the terminology. In our example, we want to push an event whenever a screen becomes visible to a user (In Android, the onStart method is suitable for this). Let’s give this event the value 'screenVisible'. If we want to push several key-value pairs, we may utilize the mapOf() helper method as demonstrated below. In this case, since we will be tracking various screens, it makes sense to also push a value for the screen name.

public class ExampleActivity extends Activity {

  private static final String SCREEN_NAME = "example screen";
  private DataLayer mDataLayer;

  @Override
  public void onStart() {
    super.onStart();
    mDataLayer = TagManager.getInstance(this).getDataLayer();
    mDataLayer.push(DataLayer.mapOf("event", "screenVisible",
                                    "screenName", SCREEN_NAME));
  }
  // ..the rest of our activity code
}

We may then simply paste this code into every activity we want to track as a screen, replacing the SCREEN_NAME string value with the relevant name for each activity ("second screen", "third screen", etc.).

Note: the container must be open by the time we push values into the DataLayer or GTM will not be able to evaluate them.
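Rather than pasting the push into every activity, one common alternative is to move it into a shared base class that each subclass feeds with its screen name. The sketch below is plain Java with a stand-in map for the DataLayer; a real implementation would extend Activity and use the GTM SDK's DataLayer:

```java
import java.util.HashMap;
import java.util.Map;

public class ScreenTracking {

    // Stand-in for the GTM DataLayer: records the last pushed map.
    static Map<String, Object> lastPush = new HashMap<>();

    // Base class: subclasses only supply their screen name.
    static abstract class TrackedScreen {
        abstract String screenName();

        void onStart() {
            Map<String, Object> push = new HashMap<>();
            push.put("event", "screenVisible");
            push.put("screenName", screenName());
            lastPush = push; // a real app would call mDataLayer.push(...)
        }
    }

    static class SecondScreen extends TrackedScreen {
        String screenName() { return "second screen"; }
    }

    public static void main(String[] args) {
        new SecondScreen().onStart();
        System.out.println(lastPush.get("screenName")); // second screen
        System.out.println(lastPush.get("event"));      // screenVisible
    }
}
```

This keeps the event key and structure in one place, so a later change to the push only touches the base class.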

Step 2: Set Up Macros In Google Tag Manager

Simply put, macros are the building blocks that tell GTM where to find certain types of information. Some macros come pre-defined in GTM, such as device language or screen resolution, but we may also create our own. First of all we want to create a Data Layer Variable macro called screenName: this matches the key of the screen name value we push along with the event as demonstrated above.

GTM will then be able to evaluate the screenName macro, which can consequently be used in our tags. If we have not done so already, we may also create a Constant String representing our Analytics property ID at this point. These macros are now at our disposal in all container tags.

Macros for Mobile Apps

Step 3: Configure an App View Tag

Let's set up our Universal Analytics App View tag. Our configurations are visible in the screenshot below (note the use of our newly created macros). The screen name field value of the App View will be automatically populated and corresponds to what we push to the DataLayer as the value of the screenName macro. The gaProperty macro value specifies which Google Analytics property data should be sent to (by reusing it throughout our container, for every Universal Analytics tag, we can both save time and prevent some critical typos).

Tag Manager app view tag

Step 4: Configure a Firing Rule For Our Tag

Finally, we need to set up the conditions under which the tag should execute. Since we are pushing an event with the value "screenVisible" every time an activity becomes visible, this should be the condition under which our tag should fire, as demonstrated below.

Tag Manager firing rule

Step 5: Save and Publish

We can continue to create other tags at this point. It may be beneficial, for example, to create some Google Analytics Event tags to fire on certain interactions within our app. We should apply the same logic in these instances: We need to push various event values to the DataLayer as interactions occur, and then repeat the steps above to create the appropriate Universal Analytics tags. When we're happy, all that's left to do is to create a new version of the container and Publish.

Tag Manager version

As we ship our app with Google Tag Manager implemented, requests will be made to the GTM system to retrieve our tags and configuration values as we discussed earlier.

Hold on, there was one more thing: the Default container!

Default Containers

When we are finished with our initial Google Tag Manager implementation and feel happy with the tags we have created, we are almost ready to ship our app. One question should remain with us at this point: what do we do if our users are not connected to the internet and hence unable to retrieve our tags and configurations from the network? Enter the Default container.

Let’s back up a little bit. In the GTM world, tag creation, configuration, settings, etc. are primarily handled in the web-based GTM interface. The power of this is obvious: we no longer need to rely on our development teams to push code for every change we want to make. Instead, we make changes in the GTM interface, publish them, and our tags and values are updated accordingly for our user base. This of course relies on the ability of our websites or applications to reach the GTM servers so that the updates can take effect. Things get trickier here for mobile apps, which partly live offline, than for websites.

To ensure that at least some container version is always available to our app, we may add a container file holding our configuration values to the project. This can be a .json file or a binary file, the latter being the required type to evaluate macros at runtime through GTM rules. We may access the binary file of our container through the GTM user interface by going to the Versions section. Here, we should download the binary file for our latest published container version and add it to our project.

create tag manager version

The binary file must be placed in a /assets/tagmanager folder and its filename must exactly match our container ID (in the format "GTM-XXXX"). At this point, we should have both the JAR file and the binary file added to our project as shown below.

Mobile app tag manager files

Once this is done, we are ready to ship the app with our Google Tag Manager implementation. As described earlier, Fresh containers will be requested continuously by the library. This ensures that, as we create new versions of our container and publish them in the web-based GTM interface, our user base will be updated accordingly. As a back-up, without any access to a container from either the network or disk, we still have the Default container stored in a binary file to fall back on.

Summary

Let’s summarize what we have done:

  1. After completing some initial configuration steps, we created a new app container in the web-based GTM interface
  2. We figured out how to open our container as users launch our app, choosing the most suitable opening type and timeout value (taking into consideration user experience and performance)
  3. We then implemented code to push an event to the Data Layer as various screens become visible to our users, setting up a Universal Analytics App View tag in GTM to fire every time this happens
  4. We downloaded the binary file of our container and added it to our app project to be used as a Default container
  5. Lastly, we created a new version of our container and published it in GTM

We are now ready to ship our application with GTM implemented!

Closing Thoughts

Google Tag Manager for mobile apps can be an incredibly powerful tool. This basic example shows how to implement Universal Analytics using this system but barely scratches the surface of what is possible with highly configurable apps that are no longer frozen. Simply put, getting started with GTM for mobile apps today sets businesses up for success in the future; I recommend trying it out as soon as possible.

I would love to hear your thoughts around Google Tag Manager for mobile apps. What are your plans for (or how are you currently) using it?


Google Tag Manager: Coding & Naming Conventions


Earlier this year eConsultancy reported that Tag Management System (TMS) adoption is set to reach 50% by 2017. If TMS adoption keeps growing at its current pace, hindsight may reveal this estimate to be somewhat pessimistic.

Riding this trend, Google Tag Manager has become a fundamental part of the ConversionWorks service offering. 2014 is set to become the year of the TMS and, we hope, the year of Google Tag Manager (GTM). In this post I reference GTM, but the content is equally applicable to any TMS.

But why is Google Tag Manager adoption rampant? There are many reasons, but the main one is that it's easy. This is both good and bad: good when done well and for the right reasons; bad when done carelessly. Based on my experience with GTM, this article will describe a couple of key points (non-exhaustive) that will help you get your GTM (or TMS of choice) right... or at least better!

This article is divided into two main sections: naming and coding advice to apply in your Google Tag Manager settings and configuration.

Write To Be Read: Naming Conventions for Tag Manager

If you've used GTM for real, then you may have found that the number of tags, rules and macros can grow very quickly into a hard-to-manage mass of tracking assets.

Therefore, you should build your tags, rules and macros to aid maintenance, making your GTM implementation far easier to manage. Easier for you (whatever your role), and easier for developers, marketers and testers.

Easier means less risk, less time, and less money wasted on tagging. Yes, wasted. Tagging does not have a direct impact on your bottom line in the same way as optimisation does, so tag less (write less code) and move on quickly to something more important!

In the meantime, consider how tags, rules and macros are used on your pages:

Most General => Most Specific
Macros => Rules => Tags

Whilst tag re-use is recommended as much as possible, you will find that your macros are used most generally of all, with rules consuming macro output to fire tags in the most specific use cases.

For this reason, you should name your tags, rules and macros to help with organisation. Here are four helpful heuristics to guide your naming strategy.

1. Use existing conventions

If you have existing naming conventions within your team, use them. The familiar naming structure will help you find the tags, rules and macros you need. This extends to case and white-space usage. The naming semantics are addressed next.

2. Relate macros, rules and tags

As mentioned above, tags are the most specific GTM assets. You can use this property to group tags such that the rules and macros related to these tags are easily found. In the examples below, see how the use of the naming convention aids finding (and therefore managing) related tags, rules and macros.

a) Name tags by site function, tag type and tag function

The convention used below is Functional area - Tag type - Tag function

Search by site function (homepage tags)

Search by tag function (click tags)

Search by tag type (GTM tags)

Here are additional sample tags using this convention:

  • All pages - GA - page tracking
  • Basket - GA - vpv
  • Util - GTM - Click Listener
  • Product Quantity - GA - Click Event
  • Basket - Remarketing - Abandon Basket
  • Checkout - AdWords - Conversion
  • Checkout - GA - Transaction

b) Name rules by site function and rule function

The convention used below is Functional area - Rule function

Search by site function (rules specific to homepage tags)

Search by rule function (rules using the URL in conditions)

c) Name macros by macro function

The default set of macros hints at sensible naming:

Google Tag Manager Macros

Continuing the theme, the macros created when extending auto-event tracking clearly show the convention in use:

  • element title
  • element type
  • element alt

This is by no means intended to be an exhaustive list of tag, rule and macro naming scenarios - just an illustration. Feel free to mold, tweak and grow these conventions but make sure they are used rigorously. Employing GTM police, the effective assignment of roles and responsibilities, and manic adherence to process may be the subject of a future post...

3. Peer reviews

Consider this little adage when building your GTM implementation:

Tell me, and I will forget.
Show me, and I may remember.
Involve me, and I will understand.
(attributed to Confucius)

Use your peers to review your work (and vice versa). Learn from each other. Make each other aware of changes, ideas, plans or issues and involve the team to build a Borg-Collective-like understanding of your GTM system.

4. Useful comments - not essays

If you find yourself in this situation you need to have a careful think about your commenting practices:

Useful code comments

Apply comments to container versions - use them to your advantage by relating container version numbers and names to sprints.

The Best Code is No Code: Coding Conventions for Tag Manager

It's an old adage that the best software has little or no code. Adding code adds complexity, risk, time and expense.

In GTM-speak, this means avoiding custom HTML tags and custom JavaScript macros as much as possible. Whilst writing this post I took a question from a client:

"Should we calculate the value of a transaction including a discount and member bonus in a macro? It seems cool, quick and easy to do this in GTM..."

Yes, it does seem cool, quick and easy, but it's very wrong. Don't embed business logic in the measurement layer when it rightly belongs in the business logic layer of your app. Treat GTM as if it were (and it is!) an architectural layer in your software system. Ask the measurement layer to do measurement things - no data operations, no presentation operations and no business logic - measurement only.

I've seen custom HTML tags used to fix tracking issues caused by poor markup; this is also wrong. If you have a set of links on a page that require metadata for use in tracking pixels when clicked, decorate the links using helper methods in the presentation layer. Don't use a custom HTML script in GTM to scan your page DOM to do the link decoration; this is slow and fraught with risk. Every time you change the site you'll chase the change in GTM and more than double the required testing. It's expensive to throw good money after bad in this manner.

Don't patch your site using tracking tags!

Related Content

  1. Google Tag Manager: A Step-By-Step Guide
  2. A Guide to Google Tag Manager for Mobile Apps
  3. Google Tag Manager (GTM): What Do I Need To Know?

Visualizing Google Analytics Data With Fusion Tables


Have you ever secretly wished to do crazy visualizations with your Google Analytics data? I am sure you have! Well, there are several ways to do that, the most powerful being the Google Analytics API in conjunction with dedicated visualization tools.

However, sometimes visualization tools may require technical knowledge or are just too expensive. That's why I thought about using Google Fusion Tables to provide a few complementary visualizations to Google Analytics - it is a great tool, very user friendly, and free.

In this article I provide a quick step-by-step guide to using Fusion Tables to visualize Google Analytics data: how to bring the data in, prepare it, and visualize it using great charts. By the end of the article you will be able to create a visualization just like the one below, which includes data from Google Analytics and from a public dataset. Click on a dot on the map to see how cool it is! (I included icons to show which data came from Google Analytics and which came from Wikipedia.)

Bringing Data From Google Analytics To Fusion Tables

As of the writing of this article, there is no automatic way to export data from Google Analytics directly into Fusion Tables, so I had to use a rather manual process. Luckily, Google Analytics exports to Google Spreadsheets, and Fusion Tables can easily import from Spreadsheets, which makes the process easier.

Also keep in mind that as of the writing of this article, you can download data from the Google Analytics report which you are viewing, with the metrics that you are viewing. This means that if you are looking at the GEO Location standard report, you will download the countries along with the standard metrics: Visits, % New Visits, Bounce Rate... If you want, for example, to export E-Commerce data in addition to the standard metrics, you will need to export two reports and then merge them using Fusion Tables. And that's exactly the example I will use below!

1. Data Creation: Exporting Data from Google Analytics to Drive

As mentioned, I am going to show how to visualize GEO data; first because I think GEO is cool and I love maps, and second because I believe it is a rich example that showcases interesting parts of the Fusion Table product.

If you would like to follow this tutorial using your own data, start by visiting the GEO Location report (direct link) to download your data to Google Spreadsheets. Here is how to do it:

  1. Click the drop-down at the bottom of your table that says Show rows and choose 500; it should be enough to include all available countries (or states, if you choose to analyze USA states, for example).
  2. Above the map you will find a link that says Export; click on it and choose Google Spreadsheets.
  3. This will export the standard metrics. If you want to export Goal, Ecommerce or AdSense metrics, click on the link just above the map and repeat steps 1 and 2 above.

Below are pointers to steps 2 and 3 from the list above.

Exporting Google Analytics data to Spreadsheets

Another way to export Google Analytics data into Google Spreadsheets is to use Nick Mihailovski's Magic Script. The Script allows you to build a table in Google Spreadsheets using the API, which will be very handy if you want to run this visualization more than once. I warmly recommend trying it. See below Nick's instructions on how to install it.

2. Data Manipulation: Creating A Master Google Fusion Table

Now we have two Google Spreadsheets and we want to merge them into one Fusion Table. However, as far as I could tell, you can't just upload two Spreadsheets into one Fusion Table; you have to create two Fusion Tables and then merge them. So that's what we are going to do now.

First, open your Spreadsheets and delete the first few rows, the ones that contain the summary of the data and some empty rows. This will save time and keep the data organized.

In addition, delete the last column of the Spreadsheet; it is a summary of all the data and can cause some confusion when merging the Spreadsheets.

Last, create a Fusion Table (direct link) and choose your Spreadsheet as the source. Click Next. You will see the first few rows of your data, make sure they look fine. Click Next. You will be requested to give your table a name and other technicalities. Click Finish.

You have a table!

Click on the Map of Country tab above the table. You should see a popup letting you know that the Google Maps Geocoding service is placing addresses on the map.

Google Maps Geocode

Now repeat the process above with your second Spreadsheet (if you have one).

Only then should you merge the two tables, by clicking on File > Merge in either of them. A popup will prompt you to choose the second table you want to merge (which should be in your Drive account, since you created the table there). Once you choose the table, you will be asked to confirm the source of match which, in our case, is Country / Territory.

Merging Tables on Google Fusion Tables

Choose the metrics that you would like to merge and you are done!
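Conceptually, what Fusion Tables does here is a key join between the two exports. As a rough illustration, here is the same operation in plain Python; the rows and column values below are hypothetical stand-ins for the two exported tables:

```python
# Two row sets standing in for the standard and e-commerce exports.
# All values below are made up for illustration.
standard = [
    {"Country / Territory": "Brazil", "Visits": 1200, "Bounce Rate": 0.41},
    {"Country / Territory": "Spain", "Visits": 800, "Bounce Rate": 0.55},
]
ecommerce = [
    {"Country / Territory": "Brazil", "Revenue": 3400.0},
    {"Country / Territory": "Spain", "Revenue": 950.0},
]

def merge_on_key(left, right, key):
    """Inner-join two row lists on a shared key column."""
    index = {row[key]: row for row in right}
    merged = []
    for row in left:
        match = index.get(row[key])
        if match is not None:
            combined = dict(row)
            combined.update(match)
            merged.append(combined)
    return merged

merged = merge_on_key(standard, ecommerce, "Country / Territory")
```

Fusion Tables performs this matching for you; the sketch only shows why both tables need a clean, identical key column (here, Country / Territory).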

3. Data Enhancement: Merging Publicly Available Data to Google Analytics

Now that you have your Google Analytics data ready, you should consider merging it with public datasets offered through Fusion Tables; if you are using Country data tables you will find a few. To do so, click on File > Find a table to merge with... and you will see the following:

Merging offline data into Google Analytics

Note that just below the dataset name you will see a note: 98% of rows have a match - this will tell you if the dataset matches your data or not. It is up to you to decide which percentage is good enough.
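That match percentage is simply the share of your rows whose key also appears in the public dataset. A tiny sketch with made-up country lists:

```python
# Hypothetical keys from your table and from a public dataset.
my_countries = ["Brazil", "Spain", "Japan", "Narnia"]
public_countries = {"Brazil", "Spain", "Japan", "France"}

# Share of your rows that find a matching key in the public dataset.
matches = sum(1 for c in my_countries if c in public_countries)
match_pct = 100.0 * matches / len(my_countries)  # "Narnia" has no match
```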

Building Interactive Maps with Google Analytics

Now the fun begins! Here is a dataset I created using the data described above. I integrated some Google Analytics data with a dataset called World Data - Internet Users per Country, where I got country data for the number of Internet users, percent of population with access to Internet, the country flag and a few more columns.

The first visualization I decided to create is a map.

There are numerous ways to customize the map, starting from choosing between a Feature Map and a Heatmap. I chose a Feature Map as it enables us to add interesting information into the map; I believe the Heatmap feature is less valuable than what Google Analytics offers through its interface, but I won't go into explanations...

Here are a few of my decisions when creating the visualization above, separated by the two main formatting categories: feature styles and info window.

Map Visualization

1. Feature Map Styles

In these options (labeled 1 in the screenshot above), you can design the marker icon, colors, lines and legend. I believe the most interesting setting in this section is the marker icon. As you probably saw in the map legend, I used the color of the marker icon to indicate a country's traffic quality, using Bounce Rate as a proxy: the marker is red if the bounce rate is above 66%, yellow if it is between 33% and 66%, and green if it is below 33%. Below is a screenshot showing how to do it:

  1. Choose "Change feature styles..." just besides the map (#1 on screenshot above).
  2. Choose "Marker icon" on the left sidebar (#2 on screenshot below)
  3. Choose the tab called "Buckets"
  4. Choose the number of buckets you want to divide your data into (notice that the more buckets you use the harder it will be to differentiate between them in the map.)
  5. Choose the metric you want to use to differentiate the markers, e.g. visits, bounce rate, revenue... Also note that just below the drop down you are asked which range should be used; if it is a "rate" you should use "0 - 1".
  6. Choose the colors of your buckets and how they should be divided (if you want to separate them unequally.)
  7. Click Save

Formatting Google Maps marker icon
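In code terms, the bucketing above is just a mapping from a metric value onto color ranges. A sketch of the three-bucket example, with the 33% and 66% thresholds used in the map legend:

```python
def marker_color(bounce_rate):
    """Map a bounce rate (on a 0-1 scale) to the three example buckets."""
    if bounce_rate < 0.33:
        return "green"   # low bounce rate: engaged traffic
    if bounce_rate < 0.66:
        return "yellow"  # middle bucket
    return "red"         # high bounce rate: poor traffic quality
```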

2. Feature Map Info Window

The info windows are the popups that appear when you click one of the markers, and you can customize them in almost any way you want: anything that can be done with HTML (no JavaScript, though). When you click on the Change info window... option (two screenshots above), you are given two options: Automatic or Custom. I personally think you can do better with Custom windows. Here are some of the things you can do (this is my custom code):

<div class='googft-info-window'  style='width: 240px; height: 240px; overflow: auto;'>
<img src="{Flag Image Link}" style="width: 160px; vertical-align: top;" />
<h2>{Country / Territory}</h2>
<img src="http://farm3.staticflickr.com/2860/12322054635_f91332e658_s.jpg" height="30" width="30" style="float:left; margin:0 5px 0 0" />Last quarter, we had a total of <b>{Visits} Visits</b> from {Country / Territory}, of which {% New Visits} were first time visits. <b>The percentage of visitors that made at least one purchase was {Newsletter Signup (Goal 2 Conversion Rate)}</b>. Visitors coming from {Country / Territory} stayed an average of {Avg. Visit Duration} on the website and {Bounce Rate} of them left the website without interacting with it (Bounce Rate).<br>
<br>
<img src="http://farm3.staticflickr.com/2861/12322214133_cf728ba9eb_o.png" height="30" width="30" style="float:left; margin:0 5px 0 0" />As a side note, {Country / Territory} has {Internet users} Internet Users and {percent of population} of the population has access to the Internet.<br>
</div>

As you will see, I am doing a series of customizations: adding images (both from the table using {placeholders} and from external sources), height, width, style and others. Again, the custom windows will accept anything that can be written using HTML.
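The {placeholders} work like a simple template language: each {Column Name} is replaced with that row's value. A minimal sketch of the substitution, using a shortened template and a hypothetical data row:

```python
import re

# A shortened template and a made-up data row for illustration.
template = "Last quarter, we had {Visits} Visits from {Country / Territory}."
row = {"Visits": "1,200", "Country / Territory": "Brazil"}

def render(tpl, row):
    """Replace every {Column Name} placeholder with the row's value;
    unknown placeholders are left untouched."""
    return re.sub(r"\{([^}]+)\}",
                  lambda m: str(row.get(m.group(1), m.group(0))),
                  tpl)

rendered = render(template, row)
```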

Visualizing Data Using Custom Cards

Another interesting visualization feature in Fusion Tables is Custom Cards. They are less visually appealing, but they allow filtering and display all the data at once, which may be a better way to look at more information for deeper analysis. Here they are:

The customization is very similar to the map info window described above: it will accept anything that can be done with HTML. Here is my example code if you would like a starting point.

<div class='googft-card-view' style='font-family: helvetica; height: 240px; width: 210px; padding: 20px; overflow: auto; background-color:WhiteSmoke; margin:2px;'>
<img src='{Flag Image Link}' height='60' style='vertical-align: top'/>
<h2>{Country / Territory}</h2>
<b>Visits</b>: {Visits}<br>
<b>Conversion Rate</b>: {Newsletter Signup (Goal 2 Conversion Rate)}<br>
<b>% New Visits</b>: {% New Visits}<br>
<b>Bounce Rate</b>: {Bounce Rate}<br>
<b>Pages / Visit</b>: {Pages / Visit}<br>
<b>Internet Users</b>: {Internet users}<br>
</div>

Publishing Your Visualizations Online

Embedding Google Analytics

Publishing your data is as straightforward as the manipulations above: two clicks and BAM! However, if you want your visualization to be limited to only one website (e.g. your Intranet) you will need a Google Maps API for Business Client ID (and here are the details if you have one). But if you just want to showcase your visualization for everyone to see, here is how you do it.

First of all, your data sharing settings must be either Anyone with the link or Public on the web, otherwise the embedded map will return an error. To change your sharing settings, click on Share in the top-right corner of your table and change the Who has access setting to one of the options above. Then click on the arrow shown in the screenshot to the left and choose between sending a link or embedding it in a website.

You are done!

Closing Thoughts

In this article I showed how to export Google Analytics data to Google Fusion Tables, merge it with other datasets and then visualize it. I provided a common example where a map is a great solution, but there are many others out there... If you think this is a valuable tool and would like to contribute an example to Online Behavior, let us know!

BigQuery Export for Google Analytics Premium

In June 2013, Google announced a new feature that enables the export of unsampled data from Views (Profiles) directly into BigQuery. Unlimited in terms of rows, and with additional fields such as visit identifiers and timestamps, it's the rawest form of Google Analytics data that has ever been available to digital managers and developers.

This opens up an opportunity for granular interactive analyses that can really take a business' Analytics capabilities to the next level. For a quick overview and a business case of BigQuery Export, check out the launch video from Google I/O.

By connecting two independently successful Google products, developers and analysts are now able to leverage the awesome interactive query system that is BigQuery (originating from Dremel, Google’s internal querying system) to analyse their unsampled Google Analytics data.

In this post, I will walk you through how this feature is put together, what it looks like, and some important things to be aware of when considering BigQuery Export as part of your Analytics solution.

Eager to see a real life example right away? Check out the video below where I combine Google Analytics data with internal CRM data.

How Does It Work?

Let's start by hashing out how this feature actually works. As data is collected from your website or application, in the shape of HTTP requests, it flows through the Google Analytics servers and undergoes various stages of processing. Sessionization (the grouping of hits into separate visits) and the application of the Filters you have set up are examples of configurations applied during these stages. Once the data is processed, it is made available in tables which are queried through, for example, the Google Analytics user interface. These are the basics of how the Analytics reports you are used to are populated with data.

Essentially, the BigQuery Export feature activates a mirror processing job during which this same data is also put into a designated BigQuery project. Once daily, currently at approximately 7am Pacific Time, this export job will commence for the previous day's data (and generally completes within 2-4 hours). This means that you should expect Monday's data to appear in your BigQuery project on Tuesday before noon (if you live on the US west coast).

BigQuery export model overview
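In other words, data collected on day D lands in BigQuery on day D+1. A trivial sketch of that schedule:

```python
from datetime import date, timedelta

def export_day(data_day):
    """Data collected on data_day is exported the following day,
    starting around 7am Pacific and generally finishing within 2-4 hours."""
    return data_day + timedelta(days=1)

# Monday's data appears on Tuesday.
monday = date(2014, 3, 10)
```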

Once in BigQuery, you can run SQL-like queries against multi-terabyte datasets in a matter of seconds, combine Google Analytics tables with tables from your data warehouse, and crunch hit- and session-level data in unprecedented detail.

What Does It Look Like?

Once a Google Analytics View is enabled for the export, a new dataset will be automatically created in your BigQuery project. The name of the dataset will be the View ID (the same as visible in the GA user interface under "View Settings"). A new table will be created for each exported day.

The schema, i.e. the structure in terms of fields available to include in your queries, is easily accessible directly in the BigQuery interface by clicking on a table. The schema is also documented in the Google Analytics help center.

The screenshot below shows the query field. The language is pretty intuitive even if you are unfamiliar with SQL: the query shown selects the visit ID, hit type, and hit number from all sessions in one particular table (representing one day's data).

Tip #1: use the Validator to ensure that your query is valid before running it. This will save you a lot of time.

BigQuery export query

BigQuery returns the results below the query field. In the next screenshot, you can see the type of each hit, the order in which that hit happened in the recorded session, and the ID for that session. Three sessions are visible; rows 3-5 all belong to the same session, since these hits share the same visit ID (remember, a visit is the same thing as a session; the words are synonymous).

BigQuery export results
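Reading such results programmatically amounts to grouping hits by visit ID. A sketch with made-up (visitId, hit type, hit number) rows shaped like the screenshot:

```python
# Hypothetical result rows: (visit_id, hit_type, hit_number).
rows = [
    ("1001", "PAGE", 1),
    ("1001", "EVENT", 2),
    ("1002", "PAGE", 1),
    ("1002", "PAGE", 2),
    ("1002", "TRANSACTION", 3),
    ("1003", "PAGE", 1),
]

# Hits sharing a visit_id belong to the same session.
sessions = {}
for visit_id, hit_type, hit_number in rows:
    sessions.setdefault(visit_id, []).append((hit_type, hit_number))
```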

Tip #2: Save your queries (using the "Save Query" button) to avoid the hassle of rebuilding them each time. Saved queries are available to you in the "Query History" section in the BigQuery UI.

Food For Thought

Finally, I want to leave you with some useful frequently asked questions regarding this feature.

1. Is BigQuery Export for Google Analytics Premium only?

The BigQuery Export feature is only available for Google Analytics Premium accounts (these are the only accounts that have access to unsampled data in the first place).

2. What are the row limits?

Good news: there are no row limits! Every single hit will be exported to these tables. Along with some extra fields, like visit IDs and timestamps, this feature consequently gives the most granular access to unsampled Google Analytics data available.

3. How much does it cost?

Note that there are costs associated with BigQuery usage and storage (details here). To give you some perspective, a billion Google Analytics hits equals about a terabyte of data. A terabyte of data costs 80 dollars to store per month in BigQuery. For Google Analytics Premium pricing, contact Google.
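Using the two figures above (roughly one terabyte per billion hits, 80 dollars per terabyte per month), a back-of-the-envelope estimate looks like this; treat the numbers as the article's rough rules of thumb, not official pricing:

```python
def monthly_storage_cost(hits):
    """Rough monthly BigQuery storage cost in USD: ~1 TB per billion hits,
    at ~80 USD per TB per month (figures from the text above)."""
    terabytes = hits / 1e9
    return terabytes * 80.0

cost = monthly_storage_cost(2.5e9)  # e.g. 2.5 billion hits
```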

4. Do View Filters apply when using BigQuery export?

Yes. Since the export job mirrors the processing job for the tables you normally access through the GA user interface, the same filter configurations also apply to the data being exported to BigQuery.

5. Can I export data from multiple Views per property?

No. Only one View per property may be exported. You should consequently pick the View that gives you the most value in terms of data. Unless you are concerned about the size of the export, arguably the best View to export is therefore one of your master Views or a raw, completely unfiltered View.

6. BigQuery Export is a Firehose

It's important to understand that once you enable this feature, the export job will run until disabled. This means that it will continuously fill your project with data, and there is no automatic deletion of old data. It is therefore a good idea to work out an internal process for how much data you want to store in BigQuery and regularly remove data that you no longer need to avoid unnecessary costs.
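Such a retention process boils down to flagging daily tables older than the window you decide to keep. A sketch of the selection logic, with hypothetical table dates (the actual deletion would go through the BigQuery API or UI):

```python
from datetime import date, timedelta

def tables_to_delete(table_dates, today, keep_days):
    """Return the daily-table dates that fall outside the retention window."""
    cutoff = today - timedelta(days=keep_days)
    return [d for d in table_dates if d < cutoff]

old = tables_to_delete(
    [date(2014, 1, 1), date(2014, 2, 1), date(2014, 3, 1)],
    today=date(2014, 3, 5),
    keep_days=30,
)
```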

7. Classic vs. Universal Analytics

The feature can be enabled for both Classic and Universal Analytics Views (including App Views). However, fields which are only relevant for one of the platforms will not populate for the other. Custom Variables are one such example: these were replaced by Custom Dimensions in Universal Analytics. Hence, if you export a Universal Analytics View, the Custom Variable field will be empty.

8. Should I use the Google Analytics API or BigQuery Export?

Without saying that one solution is necessarily better than the other (it depends on your own system and preferences), the BigQuery Export has several benefits. First of all, when using the API you are limited in terms of how many dimensions you can include in your query as you construct your reports. With the data in BigQuery there is no such limit; you can include whatever fields you want when you build a query (in fact, SELECT * returns every single dimension and metric, although this would be an expensive query). Secondly, there are some additional fields that you cannot access through the API, such as visitId or visitorId, which means the BigQuery Export data is even more granular. Lastly, even if you can match the automatic delivery of all session data into your own system, BigQuery's speed at processing multi-terabyte datasets is hard for any BI system to match.

Closing Thoughts

BigQuery Export is fantastic news for anyone looking to crunch their unsampled Google Analytics data. From basic queries to advanced interactive analysis, the feature is extremely useful for anyone who wants to tap into granular, hit-level data. Although this is an Analytics Premium only feature, you may access a sample dataset by following the instructions here to try it out.

We are eager to hear your thoughts about BigQuery Export. If you got any particular use case you want to share, or just comment on the feature, please do so below!

Integrating Google Analytics & Fusion Tables [Tutorial]

In my last post I wrote about visualizing Google Analytics data using Fusion Tables. However, the method I used to export data from GA into Fusion Tables was very manual. A few days after publishing, Sreeram Balakrishnan, a colleague at Google, wrote me a note suggesting that I use an Apps Script to automate the process.

Sreeram went on to write the Script to populate Google Analytics data into Fusion Tables. Apart from making the process easier and cleaner, it enables some interesting extras, such as updating the data daily using a trigger. In this article I provide a step-by-step guide on how to use the Script to get the data.

But if you prefer watching videos, here is a quick video guide.

Creating and Tweaking the Apps Script

First of all, create your own Apps Script, either by making a copy of this file or by copy/pasting the script at the end of this article into a new project on https://script.google.com. Both ways will have the same result.

Once you create your Script, you will need to make only one change: search for var profileId = xxxxxxxx; and replace xxxxxxxx with the View ID of the View (profile) that you would like to use. In order to find your View ID, log in to Google Analytics and click on Admin at the top of the page; then click on View Settings of the View you want to export data from. Here is a screenshot showing how to get there:

Google Analytics view id

Tip for Pros

If you are into playing with the Google Analytics API, keep reading this section; if not, just skip to the next section (Advanced Google Services & Google Developers Console).

As you will notice if you take a quick look at the Script, you can tweak it to export the Dimensions and Metrics you are most interested in. Here is the snippet that you can play with to change the data that goes into your Fusion Table. In the original Script I used Visits, Average Time on Site and E-Commerce Transactions aggregated by country.

var metric = 'ga:visits,ga:avgTimeOnSite,ga:transactions';
var options = {
  dimensions: 'ga:country',
  sort: '-ga:visits',
  maxResults: 500
};

Advanced Google Services & Google Developers Console

Once you make the change above, you should make sure that the Script has access to the services needed. On your Script page, click on the top menu named Resources and then on Advanced Google Services. You will see the page below. Make sure to turn on the Google Analytics API and Fusion Tables API and then click on Google Developers Console.

Advanced Google services

When you land in the Google Developers Console (after clicking on step #3 above) you need to turn on the Analytics API and Fusion Tables API.

Now everything is ready to run the report. Here is how to run it:

Creating Fusion Table with Google Analytics

Your table will be created and added to your Google Drive. Head over to your Google Drive recent files, click on the arrow in the top right corner of the list and change the sorting to Last Modified.

You will find your new table there, enjoy!

Refetching the Data or Automating its Refreshing Rate

Above I explained how to run the Script for the first time, where the Script creates a new table for the data. However, you might want to update the data from time to time. To do that, after you run it for the first time, go to your newly created table, click on File > About this table and copy the table Id.

Once you have the table Id, go back to your Script and search for var ft_tableId = ''; then add the Id between the apostrophes. Now, every time you run getData it will re-populate your table with the updated data.

If you really want to build an automated solution, you can add a trigger to update your data every day without having to do it manually. Here is more information on how to set up such a trigger.

Apps Script to Export Google Analytics Data to a Fusion Table

// Run Analytics report to get num views broken down by country
// If ft_tableId is supplied, replace contents with output of report
// otherwise create a new table with contents of the report
function buildReport(profileId,ft_tableId) {
  var today = new Date();
  var oneWeekAgo = new Date(today.getTime() - 7 * 24 * 60 * 60 * 1000);

  var startDate = Utilities.formatDate(oneWeekAgo, Session.getTimeZone(),
      'yyyy-MM-dd');
  var endDate = Utilities.formatDate(today, Session.getTimeZone(),
      'yyyy-MM-dd');

  var tableId  = 'ga:' + profileId;
  var metric = 'ga:visits,ga:percentNewVisits,ga:avgTimeOnSite';
  var options = {
    dimensions: 'ga:country',
    sort: '-ga:visits',
    maxResults: 500
  };
 
  var report = Analytics.Data.Ga.get(tableId, startDate, endDate, metric,
      options);

  if (report.rows) {
    // Append the headers.
    var headers = report.columnHeaders.map(function(columnHeader) {
      return columnHeader.name;
    });
    Logger.log('Headers: %s', headers);
    if (typeof ft_tableId === 'string' && ft_tableId.length == 40) {
      // Fusion Table supplied. Replace contents
      var result = FusionTables.Query.sql('delete from ' + ft_tableId);
      Logger.log("deleted %s rows from table %s",result,ft_tableId);
      Utilities.sleep( 1000);
      var rowsAsCSV = Utilities.newBlob( rowsToCSV( report.rows),'application/octet-stream');
      return FusionTables.Table.importRows(ft_tableId, rowsAsCSV);
    } else {
      // No table specified create a new table
      report.rows.unshift(headers);
      var rowsAsCSV = Utilities.newBlob( rowsToCSV( report.rows),'application/octet-stream');
      return FusionTables.Table.importTable('Report for '+tableId, rowsAsCSV);
    }
  }
}

function getData () {
  var profileId = xxxxxxxx;
  var ft_tableId = '';
  var result = buildReport( profileId, ft_tableId);
  Logger.log("result of report is %s",result);
}

// Apply CSV escaping to single cell value
function escapeToCSV( value) {
  var svalue = typeof(value) === 'string' ? value : value.toString();
  // Quote the value if it contains a comma, a quote or a newline
  if (svalue.indexOf(',') != -1 || svalue.indexOf('"') != -1 || svalue.indexOf("\n") != -1) {
    return '"'+svalue.replace(/"/g,'""')+'"';
  }
  else {
    return svalue;
  }
}

// Convert single row to CSV
function rowToCSV( row) {
  return row.map(escapeToCSV).join(",");
}

// Convert array of rows to csv
function rowsToCSV( rows) {
  return rows.map(rowToCSV).join("\n");
}
Visualizing Google Analytics Data With R

In the last few weeks I have been quite immersed in data visualization, trying to understand how it can be used to turn data into insights. As part of my immersion, I have played with Fusion Tables and Google Analytics, along with other ideas that will come to light in the future... As I wrote in the Fusion Tables article, I think everyone secretly wishes to do crazy visualizations with Google Analytics data sometimes, both because it can be very insightful and because it is just incredibly fun :-)

And here I am again, with another custom visualization! But this time I decided to use the R programming language, which is considered to be one of the best options when it comes to statistical data visualization.

As I looked deeper into R, I tried to understand what kind of visualizations would complement Google Analytics (GA), i.e. what can we get out of R that we can't currently get out of GA. My first idea was to try and create a visualization that would allow me to look at my top 5 US states by number of visits (or countries if you wish) and see how they are performing side by side. In addition, I wanted to see how Christmas and a TV campaign affected the behavior across US States. While this is possible to understand using Google Analytics, I believe it would not be possible to visualize it in such a way.

Once I found this interesting use case, I decided to take my artistic capabilities out of the rusty box and sketch the output I was looking for... and here is what I got.

Data Visualization sketch

With this objective in mind, I rolled up my sleeves and started working... Below is a step-by-step guide on how to build a very similar visualization using your own Google Analytics data. If you know your way through R, you can simply download this commented txt file.

Important: please note that while I try to describe the process in as much detail as possible, an introduction to R is highly recommended. If you have some time to invest, try the Computing for Data Analysis Coursera course, or just watch the YouTube playlist Intro to R. I am also providing a list of helpful books at the end of the article.

Installing R, the Google Analytics package and others

If you are completely new to R, you will first need to download R and follow the instructions to install it. After you do that, I recommend you also install RStudio, a great tool for writing and visualizing R code.

Now download the Google Analytics package into your R workspace (below I am using version 1.4). If you don't know where your workspace is, just type the line below into your console.

getwd()

Enter the following lines into R to install and load the respective packages, they are necessary for this visualization.

install.packages(c("RCurl", "rjson", "ggplot2", "plyr", "gridExtra", "reshape"))
require("RCurl")
require("rjson")
require("ggplot2")
require("plyr")
require("gridExtra")
require("reshape")
require("RGoogleAnalytics")

Getting the data and preparing it for visualization

Step 1. Authorize your account and paste the access token - you will be asked to paste it into the console after you run the second line below.

query <- QueryBuilder()
access_token <- query$authorize()

Step 2. Initialize the configuration object - execute one line at a time.

conf <- Configuration()

ga.account <- conf$GetAccounts()
ga.account

# If you have many accounts, you might want to add "ga.account$id[index]" (without the ") inside the ( ) below to list only the web properties inside a specific account.

ga.webProperty <- conf$GetWebProperty(ga.account$id[9])
ga.webProperty

Step 3. Check the ga.account and ga.webProperty lists above and populate the numbers inside [ ] (i.e., substitute 9 and 287) below with the account and web property index you want (the index is the first number in each line of the R console). Then, get the webProfile index from the list below and use it to populate the first line of step 5.

ga.webProfile <- conf$GetWebProfile(ga.account$id[9],ga.webProperty$id[287])
ga.webProfile

Step 4. Create a new Google Analytics API object.

ga <- RGoogleAnalytics()

Step 5. Set up the input parameters - here you should think carefully about your analysis time range, the dimensions (note that in order to draw a line chart for a time series you must add the "ga:date" dimension), metrics, filters, segments, how the data is sorted and the number of results.

profile <- ga.webProfile$id[1]
startdate <- "2013-12-08"
enddate <- "2014-02-15"
dimension <- "ga:date,ga:region"
metric <- "ga:visits, ga:avgTimeOnSite, ga:transactions"
filter <- "ga:country==United States"
sort <- "ga:date"
maxresults <- 10000

Step 6. Build the query string, use the profile by setting its index value.

query$Init(start.date = "2013-12-08",
           end.date = "2014-02-15",
           dimensions = "ga:date, ga:region",
           metrics = "ga:visits, ga:avgTimeOnSite, ga:transactions",
           sort = "ga:date, -ga:visits",
           filters="ga:country==United States",
           max.results = 10000,
           table.id = paste("ga:",ga.webProfile$id[1],sep="",collapse=","),
           access_token=access_token)

Step 7. Make a request to get the data from the API.

ga.data <- ga$GetReportData(query)

Step 8. Check your data - head() will return the first few lines of the table.

head(ga.data)

Step 9. Clean the data - removing all (not set) rows.

ga.clean <- ga.data[!ga.data$region == "(not set)", ]

Step 10. Choose your data - get the data for the specific states (or countries) that you want to analyze. Notice that I am using only the top 5 states, as I think more than that would be a bit too much to visualize, but it is up to you.

sum <- ddply(ga.clean,.(region),summarize,sum=sum(visits))
top5 <- sum[order(sum$sum,decreasing=TRUE),][1:5,]
top5

Step 11. Build the final table containing only the countries you want.

d <- ga.clean[ga.clean$region %in% c("California", "Texas", "New York", "Florida", "Illinois"),]
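Steps 10 and 11 boil down to summing visits per region, keeping the top five regions, and filtering the rows. For readers more comfortable outside R, the same logic in a plain-Python sketch with made-up rows:

```python
# Hypothetical (region, visits) rows standing in for the cleaned data.
rows = [
    ("California", 500), ("Texas", 300), ("California", 400),
    ("New York", 350), ("Florida", 200), ("Illinois", 150),
    ("Ohio", 100), ("Texas", 250),
]

# Sum visits per region, then keep only the rows from the top 5 regions.
totals = {}
for region, visits in rows:
    totals[region] = totals.get(region, 0) + visits

top5 = sorted(totals, key=totals.get, reverse=True)[:5]
filtered = [r for r in rows if r[0] in top5]
```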

Building the visualization: legends and line charts

Step 12. Build the special campaign bars and legend (in this case Christmas and Campaign)

g_legend<-function(a.gplot){
  tmp <- ggplot_gtable(ggplot_build(a.gplot))
  leg <- which(sapply(tmp$grobs, function(x) x$name) == "guide-box")
  legend <- tmp$grobs[[leg]]
  return(legend)}

rect_campaign <- data.frame (
  xmin=strptime('2014-01-25',"%Y-%m-%d"),
  xmax=strptime('2014-01-30', "%Y-%m-%d"),
  ymin=-Inf, ymax=Inf)

rect_xmas <- data.frame (
  xmin=strptime('2013-12-25',"%Y-%m-%d"),
  xmax=strptime('2013-12-26', "%Y-%m-%d"),
  ymin=-Inf, ymax=Inf)

fill_cols <- c("Christmas"="red",
               "Campaign"="gray20")

line_cols <- c("avgTimeOnSite" = "#781002",
               "visits" = "#023378",
               "transactions" = "#02780A")

Step 13. Build the chart legend and axis.

get_legend <- function(data) {
  d_m <- melt(data,id=c("region", "date_f"))
  p <- ggplot() +
    geom_smooth(data = d_m, aes(x=date_f, y=value,group=variable,color=variable),se=F) +
    geom_rect(data = rect_campaign,
              aes(xmin=xmin,
                  xmax=xmax,
                  ymin=ymin,
                  ymax=ymax,
                  fill="Campaign"), alpha=0.5) +
    geom_rect(data = rect_xmas,
              aes(xmin=xmin,
                  xmax=xmax,
                  ymin=ymin,
                  ymax=ymax,
                  fill="Christmas"), alpha=0.5) +
    theme_bw() +
    theme(axis.title.y = element_blank(),
          axis.title.x = element_blank(),
          legend.key = element_blank(),
          legend.key.height = unit(1, "lines"),
          legend.key.width = unit(2, "lines"),
          panel.margin = unit(0.5, "lines")) +
    scale_fill_manual(name = "", values=fill_cols)  +
    scale_color_manual(name = "",
                       values=line_cols,
                       labels=c("Number of visits", "Average time on site","Transactions"))
  legend <- g_legend(p)
  return(legend)
}

Step 14. Build the charts!

years <- substr(d$date, 1, 4)
months <- substr(d$date, 5, 6)
days <- substr(d$date, 7, 8)
d$date_f <- strptime(paste(years, months, days, sep="-"), "%Y-%m-%d")
d$date <- NULL
d$X <- NULL

l <- get_legend(d)

p1 <- ggplot(d, aes(x=date_f, y=visits)) +
  geom_line(colour="#023378") +
  ggtitle("Number of visits") +
  geom_rect(data = rect_campaign,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="grey20",
            alpha=0.5,
            inherit.aes = FALSE) +
  geom_rect(data = rect_xmas,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="red",
            alpha=0.5,
            inherit.aes = FALSE) +
  facet_grid (region ~ .) +
  theme_bw() +
  theme(axis.title.y = element_blank(),
        axis.title.x = element_blank(),
        panel.margin = unit(0.5, "lines"))


p2 <- ggplot(d, aes(x=date_f, y=avgTimeOnSite)) +
  geom_line(colour="#781002") +
  geom_rect(data = rect_campaign,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="grey20",
            alpha=0.5,
            inherit.aes = FALSE) +
  geom_rect(data = rect_xmas,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="red",
            alpha=0.5,
            inherit.aes = FALSE) +
  facet_grid (region ~ .) +
  ggtitle("Average time on site") +
  coord_cartesian(ylim = c(0, 250)) +
  theme_bw() +
  theme(axis.title.y = element_blank(),
        axis.title.x = element_blank(),
        panel.margin = unit(0.5, "lines"))

p3 <- ggplot(d, aes(x=date_f, y=transactions)) +
  geom_line(colour="#02780A") +
  facet_grid (region ~ .) +
  geom_rect(data = rect_campaign,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="grey20",
            alpha=0.5,
            inherit.aes = FALSE) +
  geom_rect(data = rect_xmas,
            aes(xmin=xmin,
                xmax=xmax,
                ymin=ymin,
                ymax=ymax),
            fill="red",
            alpha=0.5,
            inherit.aes = FALSE) +
  ggtitle("Number of transactions") +
  theme_bw() +
  theme(axis.title.y = element_blank(),
        axis.title.x = element_blank(),
        panel.margin = unit(0.5, "lines"))

grid.arrange(arrangeGrob(p1, p2, p3, ncol=3,
                         main=textGrob("US States: Website Interaction & Commerce",
                                       vjust=1.5)),
             l,
             ncol=2,
             widths=c(9, 2))

Phew! Here is the chart you should get!

The chart is not exactly what I initially envisioned: having all metrics in one chart was problematic because the scales were so different that we would barely see the transactions line. But I like it this way :-)

Google Analytics R Visualization

If you already use R to analyze and visualize Google Analytics data, send us an email, we would love to publish other examples.

Books to learn R

  1. Learning R
  2. R Graphics Cookbook
  3. ggplot2: Elegant Graphics for Data Analysis
  4. Discovering Statistics Using R

Customer Insights With Google Analytics Demographics

Demographics insights

Demographics reporting is another recent improvement to Google Analytics that levels the playing field for small and medium-sized firms.

At Cardinal Path we are often breaking new ground with our clients using the latest digital measurement technologies from Google, and we are surfacing the new demographic data in almost all client engagements. We are also "eating our own dog food" by using this demographic data in our advertising tactics, and more broadly as intelligence in shaping our messaging, and our product offerings. This article describes Google Analytics demographics, and highlights what we have learned.

Background

Google Analytics (GA) is rapidly evolving to be a marketer's best friend. Even the main menu of Google Analytics has changed from Web Analytics jargon to reflect marketing language: Audience, Acquisition, Behaviour, & Conversions. Now Google Analytics is taking it one step further by including demographic and psychographic information inside Google Analytics.

Yes, that's right, now your site data is being "enriched" so that you can see how men and women of different age groups use your site and online services - and react to your marketing efforts, differently.

What Google is doing here is creating value for Marketers by connecting the marketer to valuable information about audiences and customers. On the back-end Google is connecting data between Google Analytics and its advertising system. This means that when Google has classified an internet user (gender, age group), it can share this information with Google Analytics, and surface the data for your marketing department. This is a manifestation of the connection economy described by Seth Godin: Connecting databases to create value in the cloud, and connecting marketers to this useful information about their audiences and customers.

What is the benefit of demographic data in Google Analytics?

Brandon Lewin, Marketing Manager for the Cardinal Path Training Academy says, "Demographic data helps us provide timely and relevant information to our target audience. With access to insightful data on our customers and prospects, not only can we be more efficient with our marketing programs, but we can also make improvements to our courseware."

This is a brand new, powerful feature from Google, and your competitors may not be leveraging this for marketing insights - yet. You will get to know more about your customers and prospects, and this will help you tailor your products, services, and marketing efforts.

To get maximum value from this new capability, marketers will need to connect this demographic and psychographic data to:

  • Marketing strategy and advertising targeting tactics (fortunately, any targeting done in Google AdWords or DoubleClick will match the demographic information in Google Analytics 100% - yes, this is a big deal)
  • Other audience information in Google Analytics (fortunately, Google Analytics is a great analysis tool that allows marketers to slice and dice the data in almost any way imaginable)

Historically, Google Analytics could answer the following questions:

  1. WHAT visitors did on our site (purchases, downloads, reading etc.)?
  2. WHAT content visitors liked?
  3. WHEN they visited the site (timestamps)?
  4. WHERE our Website visitors were located geographically?
  5. WHY they visited our site (and what intent they had (keywords & landing page analysis))?
  6. HOW they found our Website?

The new opportunity is the WHO, and being able to mix this traditional analytics data with the demographics data to gain new insights into audiences and marketing initiatives.

How to leverage demographic and psychographic data in Google Analytics?

Example 1 – Overall site usage

Based on the information provided in the gender report below, we can see:

  1. There are about three times more men than women using this Website.
  2. Site engagement is about equal on average for men and women.
  3. Ecommerce conversion rate for women is about double that of men.

Demographic site usage

*Note that of the 105k site visits, we have gender and age information for more than half (54k).

**Note also that statistical significance requires a large enough sample of data to form conclusions – usually more than several hundred transactions for ecommerce.
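To make that caveat concrete, a two-proportion z-test shows how to check whether a conversion-rate gap between two demographic groups is statistically meaningful. This is a generic statistics sketch, not a Google Analytics feature, and the visit/transaction counts below are made up for illustration:

```python
from math import erf, sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test comparing the conversion rates of two visitor groups."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p = (conv_a + conv_b) / (n_a + n_b)           # pooled conversion rate
    se = sqrt(p * (1 - p) * (1 / n_a + 1 / n_b))  # standard error of the difference
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical: 60 transactions from 1,000 female visits vs 30 from 1,000 male visits
z, p_value = two_proportion_z(60, 1000, 30, 1000)
```

With several hundred conversions per group the p-value becomes small enough to trust the difference; with only a handful of transactions it will not.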

Example 2 – Ecommerce Demographics

Based on the information provided in the ecommerce report below, we can see:

  1. The distribution of our ecommerce revenue across genders and age groups.
  2. The ecommerce conversion rate and average value of each visitor in that age gender group (Per visit value).
  3. The average order value for visitors in each group.

Ecommerce Demographics

Example 3 – Psychographic Conversion Trends

The Google Analytics demographics reports also include an "affinity" category that gives insight into the lifestyles and hobbies of our audiences. This is where we may find some out-of-the-box marketing opportunities. Comparing the average value of visitors with different interests, we can identify Cooking Enthusiasts, Sports Fans, and Travel Buffs as tending to have a higher average value in terms of ecommerce revenue.

Psychographic Conversion Trends

Example 4 – Demographics in User Segments

Dissecting your traffic to isolate a niche is really "low hanging fruit" for audience/customer insights. Now demographic data can be combined with all the other visitor data to produce new marketing capability. Let's say the marketing department is interested in middle-of-career males who play sports; we can now isolate this segment in our traffic with a few clicks...

Demographics User Segments

Note also that we could specify the language setting of the browser as well as the continent/country/region of the target market. Or we could be more specific and target only football or hockey enthusiasts. Both the affinity categories and the 'Other' categories can be used to be more precise.

Once a user segment is turned on (as shown on the image above), you can browse all your other Google Analytics reports to compare the behaviour and performance of this segment to other traffic segments (including your traffic sources, goal conversions, and ecommerce).

You can achieve more control over the demographic segment by choosing "Conditions" under user segments, and specifying the data fields and the Boolean logic to govern the definition of the visitor group. The image below shows how we use the OR statement to ensure we include all visits with sports under "Affinity", or the "Other" categories (larger group). Be cautious that your segment does not filter in only those visits with "sport" under both categories (smaller group).
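The OR-versus-AND distinction can be sketched in a few lines. This is toy Python, not GA's segment syntax, and the category strings are invented examples:

```python
# A visit qualifies for the OR segment (the larger group) if "sports"
# appears in EITHER category field; the AND segment (the smaller group)
# requires it in BOTH, which is usually not what you want.
def in_segment_or(affinity, other):
    return "sports" in affinity.lower() or "sports" in other.lower()

def in_segment_and(affinity, other):
    return "sports" in affinity.lower() and "sports" in other.lower()

visits = [("Sports Fans", "Cooking"),
          ("Movies", "Team Sports"),
          ("Sports Fans", "Sports News")]

or_count = sum(in_segment_or(a, o) for a, o in visits)    # 3 visits
and_count = sum(in_segment_and(a, o) for a, o in visits)  # 1 visit
```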

Demographic advanced segment

Actionable Intelligence! Take Action!

Each of the above examples helps us to identify valuable visitors - people who tend to spend money with us online. Women aged 35 to 44 appear to be a lucrative target market, as are cooking enthusiasts, and if we surface the gender of cooking enthusiasts, we see that the female portion has a much stronger "buy" signal...

Actionable Intelligence

Now we can take these insights back to the boardroom and brainstorm a strategy to promote our products and services to these audience segments. This demographics data can inform many aspects of our marketing programs.

  • Goal conversions - segmenting audiences to understand which groups convert against which business goals online, for example, do women share our content in social media and sign up for newsletters more frequently?
  • Content preferences - which gender and age groups respond best to different types of content, or authors?
  • Product preferences - which product pages get the most traffic from each demographic, and what are the conversion rates for each product / product line?
  • Campaign / messaging effectiveness - which ad messages have the highest click-through rate and conversion rate with various market demographics?
  • A/B testing - did the optimization testing prove to be more effective with women or men?

These are all questions we can now answer for a very large sample of the audience (in the range of 50% of the audience), not just those who converted.

Advertising tactic #1: Adwords

Also remember that Google's advertising system is the source of this audience categorization, and this means if we use this same system we can be assured that we are reaching the target... Huzzah! There is a "one-to-one" relationship between the demographic data in Google Analytics and the ad targeting system of Google AdWords.

So, as you build out your cost-per-click campaign and individual ad groups you can explicitly include or exclude any of these demographic or psychographic segments. This is what I call "evidence-based hyper-targeting", or "smart advertising" for short.

Bethany Bey is in charge of CPC advertising for Cardinal Path training, and says that using demographics targeting holds huge benefits for organizations looking to take their Adwords campaigns a step further. "Using demographics will let us better target our ads, but more importantly, give our audience ad content that will stand out because it's meaningful on a new level", she says.

In the screenshot below you can see how to choose Interest Categories; you can read more about targeting these Google Analytics demographic segments in AdWords here:

AdWords remarketing list

Advertising tactic #2: Remarketing

The introduction of demographics to Google Analytics also allows advertisers to streamline ad spending by targeting advertisements to the right audience with greater accuracy. For example, if male sports fans are the target audience for an ad campaign you can choose to remarket only to that group. To do this simply go into the "admin" section of Google Analytics, and under each property ID you will see the menu option for remarketing lists.

Analytics remarketing

When you click on "Lists" you will be able to create a "New Remarketing List"

Next, select the radio button "Create my own remarketing type using Segments". Finally, choose "Import" to select the advanced segment you have built around a demographic group.

Remarketing type

Alternatively, you can specify the demographic details for your remarketing list at this point in the process, just below the import button.

Demographic remarketing list

So how do I get demographics in my Google Analytics data?

If you are already using remarketing with Google Analytics, then no change is required – both demographics and remarketing are enabled with one small change to your tracking code. Otherwise you will need to (a) change one line of code in your Google Analytics page tag, and (b) update your privacy policy (to be transparent). The instructions for doing so are in this Help Center article.

If you are using the newest version of Google Analytics, called Universal Analytics, then expect the release of this new feature soon!

Closing Thoughts

In this article we looked at the new demographic and psychographic segments in Google Analytics, and we learned how we can use this information to help inform our marketing and advertising initiatives. Our examples focused largely on ecommerce, but we can also take what we learn about our customers into many other areas of our business. Using user segments in conjunction with audience/customer demographics opens up a whole new realm for web analysts and marketers.

Happy segmenting!


Analyzing Campaigns with Cost Data Upload

Cost Data in Google Analytics

Google Analytics allows us to easily integrate our Google AdWords campaign data for reporting and analysis. This makes sense, as Google wants us to be able to understand engagement and performance metrics after somebody lands on our website, or downloads our app, from a paid ad. They're both Google products, so integration helps!

But in order to truly understand the performance of our AdWords campaigns we need to be able to evaluate success within the context of all our marketing initiatives. For example, if we are running paid campaigns through AdWords, Facebook, LinkedIn and Twitter we really need a way to understand how the conversion rate from our AdWords campaigns compares to our other advertising. We can then start to get an idea of the kind of Return On Investment (ROI) that different marketing initiatives are generating, and use that information to make smart decisions about future budget allocations.

We have always had the ability to export AdWords data or pull out data from Google Analytics in order to compare campaign performance. You might have been doing this in Excel or Google Drive, and you'd know it can be a bit of a tedious process! It also meant that your analysis was limited to the metrics that you had exported originally.

This is where Google Analytics Cost Data Upload comes in. It allows you to import the extra pieces of data belonging to your non-AdWords paid campaigns. This means you can quickly evaluate the performances of all your campaigns, compare advertising costs and ROI right within the Google Analytics interface.

The reports that are created from your uploaded data won't provide the same granular detail available within the AdWords reports (think ad groups and individual keywords), but they will enable you to start performing top-level analysis within Google Analytics. Another thing to remember is that the reports also rely on people actually clicking through to your website, so if you have impression data without a click then you won't get additional insights from the reports.

Steps to upload Cost Data into Google Analytics


Before we start, it's important to understand that custom data is uploaded to a property inside Google Analytics and is then applied to one or more views. If you have a large-scale implementation, then it means you will have to repeat the process if you want data available within views that are contained in different properties. For most cases this won't be needed, but it is good to remember that upload is at the property-level and not the account-level.

Cost Data upload explained

Another critical thing to check is that you are using campaign tags on your inbound links, so that your cost data can be matched to the data already available within your Google Analytics reports. The campaign tags you use become the key for combining the two data sets inside your reports.

If you are just getting started, then you will need to learn about Google Analytics campaign tags and begin using them before you can begin uploading cost data.

Step 1: Setup Custom Data Source

The first step is to setup the custom data source. After you have signed in to your Analytics account, navigate to the Admin section and select the appropriate property.

Within the property, select 'Custom Definitions', then 'Custom Data Sources', and then the 'New Custom Data Source' button. Now you will need to name the data source, provide a description and select the view(s) where you want this data to be available.

For example, if you are going to upload cost data from ads on LinkedIn you would create a new data source called 'linkedin' and this will then store all of your LinkedIn data. In most cases you will need to set up a separate custom data source for each of your advertising channels. This means if you advertise on LinkedIn, Facebook and Twitter you would need to create three separate data sources.

You can create up to 25 custom data sources within Google Analytics for uploading campaign cost data.

Each custom data source you create will have a unique ID or key which is used to ensure that data that is uploaded ends up in the right data source. Data can be uploaded daily to the custom data source and this can be viewed by selecting 'History' to view all historical data uploads.

Step 2: Download Data

Now you will need to download your cost data for each custom data source you have created. For example if you are going to upload data for your LinkedIn ads you will first need to download this from LinkedIn Campaign Manager. You can find this under the 'Reporting' tab.

Uploading LinkedIn data into Google Analytics

Ideally you will download a CSV containing the data. If this is not available then you can go with another format, but it might mean a bit more work formatting the data.

Step 3: Configure Data Set & Format Data

Once we have the data, we need to head back into Google Analytics and configure the data set so that when we import our cost data, Google Analytics reads the correct data into our reports and also assigns our cost data to the correct ads.

Heading back to the admin section, find the property where we set up the data source in our first step and select 'Data Import'. From here we click the name of the custom data source we originally created. For our example we will click 'LinkedIn'.

Then we need to select the columns of data that we have available from the data that we downloaded in step two. At a minimum we need 'Medium', 'Source' and either 'Impressions', 'Clicks' or 'Cost', but the more columns we can create (based on available data), the more insights we will have when we head to our reports.

If we open the report downloaded for our LinkedIn example, we can see we have columns for 'Impressions', 'Total Clicks' and 'Total Spend' which we can use for 'Impressions', 'Clicks' and 'Cost'. Since this data is from LinkedIn we can also assume that 'Medium' will be 'cpc' as we are paying for the clicks, and 'Source' is linkedin.com because that is where our ads have been displayed. For LinkedIn we can also establish 'Ad Content' for each individual ad variation and 'Destination URL', which is the landing page we were sending traffic to from the ads.

It's important to know that if you select 'Medium', 'Source', 'Campaign', 'Ad Content' or 'Keyword' for columns in your data set, each of these needs to directly correspond to UTM campaign tags that are already available within your reports (before you upload data).

For example, if you didn't set any UTM campaign tags for your LinkedIn campaigns, then even if you define values for these parameters within your CSV for upload, the data won't correspond to anything within your reports.

When you are happy with the columns, click the 'Get Schema' button. This will show you the first line that is needed within your CSV file. You can also click the 'Download Schema Template' button to get this in Excel format.

Once we have the header for our CSV file we need to get all of the data into that format. For example, if your schema header looks like:

ga:medium,ga:source,ga:adClicks,ga:adCost,ga:impressions

Then this means the first column will contain the medium, the second the source and so on. If you are working in a spreadsheet (before exporting to CSV format), then you will have something along the following lines:

Cost Data Upload CSV

Then you will need to get the data you have downloaded into the correct format. For example:

Cost Data Upload table

If you have been working in Excel or Google Drive, you will then need to export this as a CSV ready for upload. This will give you a file that contains something like:

ga:medium,ga:source,ga:adClicks,ga:adCost,ga:impressions
cpc,linkedin.com,135,302.4,10732
cpc,twitter.com,93,90.21,3090
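If you prefer to script this formatting step rather than wrangle it in a spreadsheet, here is a minimal sketch. The `clicks`/`spend`/`impressions` keys are hypothetical names for whatever columns your LinkedIn export actually contains, so map them to your real headers:

```python
import csv

# Schema header from the 'Get Schema' button in Data Import (Step 3).
SCHEMA = ["ga:medium", "ga:source", "ga:adClicks", "ga:adCost", "ga:impressions"]

def write_cost_data_csv(rows, out_path, medium="cpc", source="linkedin.com"):
    """Write already-parsed ad rows into the ga: schema required by the upload."""
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(SCHEMA)  # the schema header must be the first line
        for row in rows:
            writer.writerow([medium, source,
                             row["clicks"], row["spend"], row["impressions"]])

write_cost_data_csv(
    [{"clicks": 135, "spend": 302.4, "impressions": 10732}],
    "linkedin_cost.csv")
```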

We are now ready to upload the data into Google Analytics.

Step 4: Upload Data

To get our data into Google Analytics we need to use the Management API to upload the data. Currently there is no option to upload data directly within the Google Analytics interface. Hopefully this will become available in the future for people just looking to upload small amounts on an ad-hoc basis, but for now we need to use the API.

If you are in a marketing or non-technical role, then you probably don't want to go through the process of creating your own method for uploading data using the API. There are existing tools that allow you to upload data without having to get into any code.

For those starting out, our friends at LunaMetrics have created a cost data upload tool that uses a simple interface to allow you to upload your file using the API after you have logged into your Google Analytics account and authorised the tool.

There are also a number of paid tools that allow you to upload cost data.

Analyzing Marketing Campaigns

From here we can begin analysing our campaign performance within Google Analytics. The 'Cost Analysis' report is available within the Acquisition section. The report will automatically calculate CTR (Click Through Rate) and average CPC (Cost Per Click) from your click and impression data.

It's important to ensure you have goals configured (and ecommerce if you are selling online) as this will give you the data that you need in order to analyze performance, including RPC (Return Per Click), ROI (Return On Investment) and Margin. When using these metrics remember that they are calculated against your advertising cost, so they will not include any costs or investment other than the actual amount you have paid for the clicks. You might want to consider exporting this data into Google Drive and performing your own calculations, so you can factor in elements like human resources (the amount of time to create and manage your campaigns), profit margin (if you are focused on ecommerce transactions) and even CLV (Customer Lifetime Value).

Whatever you decide, you want to identify campaigns that have a higher likelihood to convert, along with higher value (where you look at ROI and CLV). Once you identify your top performing campaigns you can begin to perform more in depth analysis and start incrementally testing campaign changes to see if you can improve performance.


10 Ways To Improve Google Analytics Data Accuracy

Google Analytics Data Accuracy

We've all been there. Everything on the surface looks like it's running smoothly. Data is coming in. The 30,000-foot view of your account looks like business as usual. You start upping your analytics game. Maybe you took some training and you're getting your hands dirty asking the tough questions of your data. But how do you know if you can trust your data in the first place?

Before I dive into the various reasons your data can be messed up, let me define a specific element of "messed up". As an example, within each of the various reports that Google provides there is what's called the "Explorer" view, the most commonly used one. At the bottom of this report is the data table, broken down into columns and rows. Each one of these rows is unique, and therein lies the problem. Conceptually, what you understand as a single campaign, page, source, or medium may be broken up into multiple different rows. Generally the most active row will rise to the top, and that row may hold only a portion of the true data.

Read on to discover ten ways your data may be secretly compromised.

10. No view filters

View filters do a wonderful job of segmenting your data into nice little buckets for analysis. Newer or rookie users don't leverage these as much as they should. While on the surface people think "segmentation" when it comes to employing filters, the vast majority of filters I apply to my views are on clean-up detail. Search and replace filters do a great job of making those page request URI's humanly readable. Uppercase and lowercase filters excel at ensuring report rows don't split since those are case sensitive. Use filters to clean up your data, not just segment it. View filters are the catch-all solution to cleaning your data.
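To make the idea concrete, here is a toy sketch of what the two clean-up filters mentioned above do to a request URI. This is plain Python, not GA's filter engine, and the example URI and pattern are invented:

```python
import re

def search_and_replace(page_path, pattern, replacement):
    """Search-and-replace filter: turn raw URIs into humanly readable ones."""
    return re.sub(pattern, replacement, page_path)

def lowercase_filter(page_path):
    """Lowercase filter: stop case variants from splitting report rows."""
    return page_path.lower()

# A raw CMS URL becomes a clean, consistently-cased report row.
uri = search_and_replace("/Catalog/item.php?id=1234",
                         r"(?i)/catalog/item\.php\?id=(\d+)", r"/product/\1")
uri = lowercase_filter(uri)
# uri -> "/product/1234"
```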

Learn more about Google Analytics View Filters.

9. Filters

On the flip side, the most messed up accounts I've encountered do employ filters, but poorly. Always keep in mind that filters are destructive. They irrevocably alter your data as it is coming into the GA reporting database. The solution to this is having a deployment process to your filters and always having a single view that is unfiltered and untouched by any destructive options in configuration. Create a quick and dirty testing view, throw your filters in there first, and vet the data. The lack of a deployment process is very harmful to your historical data.

Always have a "raw data" view on your account.

8. Self referring

If you go to your referrals report and see your own domain as a referral, you've got a self-referral issue. Why is this a problem? Because you have absolutely no idea where those people actually came from. All that traffic is now unattributable to your marketing efforts. It gets even worse when you've spent hard cash on a marketing effort and have no idea what percentage of that traffic is hiding among the self-referrals.

There's a ton of information out there regarding how to fix this, but I can tell you the most common reason it happens is that one or more pages of your website lack the Google Analytics Tracking Code (GATC). If someone over the course of their visit hits a page on your site that has no tag, the previous page will count as an exit, and the next page will count as an entrance with your domain as the referral. The visit from that point on will start over again. The most common page lacking tags, in my experience, is the 404 error page.

Learn more about common causes for self-referrals in Google Analytics.

7. No stripped query parameters

Query parameters are simply values added to the URL in the format of ?key=value&key2=value2. They can be stripped out view-by-view in the view settings page of the admin menu. These values can mean a wide array of things, and it is very common for content management systems (CMS) and external tracking applications to use them. Query parameters don't always dictate what the content of the page is; sometimes it's just session or user data the CMS needs to shuffle around from page to page. The important thing to note: those parameters hold no value to you as an analyst.

Keeping in mind reporting rows, each reporting row in your content reports is unique. Every query parameter and combination of them will be its own row which is going to split the data for the page into multiple rows. Odds are you're only looking at the row for a particular page that is most utilized and surfaced by column sorting. The metrics in that row may not be accurate since there could be any number of trailing rows for that page being split by query parameters. To see if you have this problem, simply use your inline filter in your content report to search for "?". That will surface all the entries with a query parameter.
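Conceptually, excluding a query parameter collapses those trailing rows back into one. Here is a sketch of the effect; in GA you simply list the parameter names under "Exclude URL Query Parameters" in the view settings rather than running code, and the `sessionid`/`sid` names below are just example CMS parameters:

```python
from urllib.parse import urlsplit

def strip_params(page_path, exclude=("sessionid", "sid")):
    """Drop the named query parameters so split report rows collapse into one."""
    parts = urlsplit(page_path)
    kept = [p for p in parts.query.split("&")
            if p and p.split("=")[0] not in exclude]
    return parts.path + ("?" + "&".join(kept) if kept else "")

strip_params("/products?sessionid=abc123&page=2")
# -> "/products?page=2": the session noise is gone, the meaningful parameter stays
```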

To exclude URL query parameters, click on "Admin" on the top navigation and choose "View Settings" for the View you want to change.

6. No default view page

Another content clean up trick which is incredibly easy is setting your default page in your view settings. What this does is aggregate the root domain "/" with whatever page you dictate as your default page. For example if you tell it "index.php" is your default page, the "/index.php" in your content reports will now be lumped in with "/" into one clean row. Any additions, such as query parameters or subfolders, will still break out into their own row. That's a good thing!

To add a default page, click on "Admin" on the top navigation and choose "View Settings" for the View you want to change.

5. Goofy architecture

Aside from bad data, you might even be looking in the wrong place. As a developer I'm a stickler for good naming conventions. They should be consistent, easy to understand, and human readable. If you open your account and can't explain in 10 seconds what each view, web property, and account is, then you have a real problem. Account setups can range from the most basic to the uber-complex. Take a big step back and figure out how everything ties together, and whether it's the best setup for you, before you fully trust the data.

Here are some naming conventions for GTM, but they can be equally applied to GA.

4. No annotations

The chief benefit of annotations is compensating for changes to your account when viewing historical data. You probably remember the last 3 months of changes to your GA account; you could explain away spikes in traffic or modifications here and there. But what about a year from now? You can't remember all the filters or development changes that affected the data, which is why annotations are important. When you're building out those historical reports you'll have a nice neat log of what the data means, so you can get true insights.

Check this feature walk-through.

3. Case sensitive campaign fumbles

This is a big one that view filters can fix. Campaign tracking in GA is a great feature; however, it does require a level of project management to ensure the incoming data stays accurate. Campaign tags are case sensitive, meaning that if you launch three campaigns and list the medium as EMAIL, Email, and email, they will show up as three completely different mediums. It's highly recommended that your marketing team keep a communal document to access and collaborate on when crafting campaign tags. As an added measure, you can always add a lowercase view filter on campaign, medium, and the other campaign fields and solve the problem completely.

Check out the URL Builder, a useful tool for building campaign tags.
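If you generate tagged links programmatically, you can enforce lowercase at the source. The sketch below is a hypothetical helper (the function name and its behavior are assumptions for illustration, not part of GA or the URL Builder) that lowercases the campaign values before appending them:

```javascript
// Hypothetical helper: builds a campaign-tagged URL, lowercasing the
// utm values so "EMAIL", "Email" and "email" all collapse into one
// medium in your reports.
function buildCampaignUrl(baseUrl, source, medium, campaign) {
  var params = {
    utm_source: source.toLowerCase(),
    utm_medium: medium.toLowerCase(),
    utm_campaign: campaign.toLowerCase()
  };
  var query = Object.keys(params).map(function (key) {
    return key + '=' + encodeURIComponent(params[key]);
  }).join('&');
  // Append with "?" or "&" depending on whether a query string exists
  return baseUrl + (baseUrl.indexOf('?') === -1 ? '?' : '&') + query;
}
```

A lowercase view filter remains the safety net for links tagged by hand, but normalizing at creation time keeps the raw hits clean too.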

2. Code in the wrong places

The best practice for placing the GATC on your website, much like the technology itself, changes over time. I would love to tell you right here and now the right place to put it, but for the sake of keeping this article timeless, I'll tell you how to always get it right: when you create a web property, and in your GA admin panel where the tracking code is located, there is a brief description of exactly where it needs to go. When it comes to troubleshooting bad data this is one of the very first things I look at, and it explains away too many problems to list: pageview inaccuracies, self-referrals, wrong time on site, page timing that's off, erroneous bounces... the list goes on and on.

Learn more about the recommended code setup.

1. Outdated GATC

This is the most common reason for messed-up data. The JavaScript tracking code you drop on your page calls the Google mothership and references what can best be described as a tiny web program. That program, like every other computer program, is constantly being updated and improved. On occasion you will need to update your GATC to take advantage of all the fancy new features of GA. Some of those features could be pivotal to your overall measurement strategy; others are just a drop in the bucket. Either way, make sure your code is up to date.

Check the Universal Analytics Upgrade Center to make sure you have the most up-to-date version.
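For reference, at the time of writing the current code is the Universal Analytics (analytics.js) snippet below, placed before the closing `</head>` tag. Replace `UA-XXXXX-Y` with your own property ID, and always copy the snippet from your own admin panel rather than from an article, since it may have changed:

```javascript
// Universal Analytics (analytics.js) snippet - browser-only code.
(function(i,s,o,g,r,a,m){i['GoogleAnalyticsObject']=r;i[r]=i[r]||function(){
(i[r].q=i[r].q||[]).push(arguments)},i[r].l=1*new Date();a=s.createElement(o),
m=s.getElementsByTagName(o)[0];a.async=1;a.src=g;m.parentNode.insertBefore(a,m)
})(window,document,'script','//www.google-analytics.com/analytics.js','ga');

ga('create', 'UA-XXXXX-Y', 'auto'); // your property ID goes here
ga('send', 'pageview');
```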
