How to Create Calculated Metrics in Google Data Studio with Blended Data Source

published by on 5th November 2018 under Google Analytics, Industry News

Or How to Create Bespoke Conversion Rates

Last week, GDS released a new feature called chart-specific calculated fields.
I decided to write a quick post to describe this as the documentation is pretty scarce…

The main interest of this feature is that it works with blended data sources. Even though you can’t create a calculated field at the blended data source level, this is pretty useful for creating, for instance, conversion rate metrics for a user- or session-based segment XYZ versus all users or sessions.
Thus you can get conversion rates for anything, without having to create a goal in Google Analytics. For instance, you can see
- what’s the % of total sessions with event XYZ
- what’s the % of total sessions with view of page XYZ
- etc.

So here is how to create this from a Google Analytics data source:

1. Create your segment in GA

For instance, sessions that include a specific event (below, a click on Flight Information – taken from an airline site)

GA segment

2. Create the blended data source

Below, I’ve blended a GA view with the same view, but with the newly created segment applied.

I’ve added the Sessions metric for both of them, but renamed the sessions metric from the segmented source to avoid any confusion later on.

I’ve also added the Country dimension as a join key, but this is optional.

So you should end up with at least two metrics in your blended data source: All Sessions, and the renamed sessions from the segment including event XYZ:

Note that the data source without the segment must be the first one (on the left), as it will take precedence over the data source using the segment (on the right). This is critical when/if using join key(s).

blended data source

3. Create a new component (chart)

create chart

4. Create the calculated metric

Click to select a metric, then CREATE FIELD:

click create field

Type the name of your new calculated metric and enter your formula, then select the Type (Percent below, as we are looking at a conversion rate):

create calculated metric
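As a sketch, assuming the metric names from step 2 (yours may differ), the conversion rate formula would look something like this:

SUM(Sessions including event XYZ) / SUM(All Sessions)

Setting the Type to Percent then displays this ratio as a conversion rate.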

And voila!

chart 1

5. Copy to re-use

The calculated field is specific to the chart but you can just copy-paste the initial chart to avoid having to create the calculated metric again.

chart 2

Note: When using a table, don’t enable the “summary row” as it will sum up the conversion rates… (I believe this is one of the limitations of having the calculated metric defined at the component level and not at the blended data source level.)

Hope this is helpful – let us know how you use these new chart-specific calculated fields :-)

Related posts

Finding Method in Research Madness

published by on 2nd October 2018 under Conversion (CRO), Digital Strategy


It’s never a bad time to conduct user research. Whether you work on an ecommerce, SaaS or lead generation site, user research is key to understanding your users, increasing conversions and creating an optimal user experience.

 

As a UX and Conversion Specialist at FIRST Digital, I frequently run user research programs for my clients and I’ve compiled my top tips to create a structured research program.

 

1. Start with questions

One of the greatest pitfalls when running user research is diving in head first without asking key questions. If you lead with questions, then there is always a purpose for your research.

 

Start with a site run through and list out the key business, industry and customer questions at each stage of the user journey. Breaking it down into these areas gives a clear structure to your research program.

 

In the following example, some key questions for a product pricing page could include:

 

  • Business question: Is there the ability to change our price offering?

  • Industry question: How are our competitors displaying their prices?

  • User question: Do users find our price offering appealing?

 

Some of your questions might easily be answered with an analytics review or previous research insights. Others will need more in-depth analysis such as surveys, competitor analysis or user testing. The end result of this exercise will be a list of questions grouped under these 3 main areas and prioritized based on business needs.

 

2. Choose your research method

There are a range of user research methods and tools available at your fingertips – again, it will all depend on your budget, available resources and business needs. If you don’t have the budget to invest in the latest shiny new tool, there are plenty of other options.

 

Back to the product pricing page example:

 

  • Business question: Is there scope to change our price offering? This could be answered by conducting a key stakeholder focus group

  • Industry question: How are our competitors displaying their prices? This could be answered with a competitor analysis and price comparison

  • User question: Do users find our price offering appealing? This could be answered by a range of research methods such as on-site surveys, first click tests or user testing, and most tool providers have a range of budget-friendly options

 

3. Plan your research

“Fail to plan. Plan to fail” – let’s face it, we’ve all been there. This is equally important when it comes to user research.

 

Having a simple research plan which outlines the goal of the research piece and key details will not only act as a guide but also as a point of reference which can easily be shared between stakeholders.

 

Remember it doesn’t have to be fancy – just a document which outlines the following:

 

  • The research question(s)

  • Goal of the research piece

  • Research method and/or tool

  • Timescales i.e. how long the research piece will take or how long to run it for

  • Key details, tasks or demographics that are relevant for the research piece

 

4. Run it and iterate

So you’ve narrowed down your questions, chosen your first research piece and made a plan. Now time to run it and build momentum.

‘But how long should I run it for?’ I hear you say. The honest answer is – it depends, especially when it comes to qualitative research. For example, as a rule of thumb the magic number of user testers is 5 and for surveys a minimum sample of 200 is advised. Timescales will differ for each site according to levels of traffic – but using these as a reference point will help.

 

Remember the importance of iteration – if a survey is getting a low response rate, switch it to another site area or change the question. If your heatmap or analytics tool isn’t giving you the insight you had hoped for – trial a different tool or method to answer your research question. As long as you’re learning and iterating – then any insight is better than no insight!

 

In part two of this blog piece we will explain what to do with your new insights and how to create stories that every stakeholder will love.

 

Measuring page load speed for single page apps

published by on 24th September 2018 under Google Analytics

Single page apps are always a battle for a robust Google Analytics implementation. From correct page titles and rogue referral problems to deciding when to fire pageviews, nothing is inherently simple. Adding to the list of complications is the fact that Google Analytics will not provide page timings for SPAs once the initial page has loaded (i.e. for navigation between internal pages), even if you increase the site speed sample rate to 100%. This is because Google Analytics calculates page timings using the Navigation Timing API.

For example, the DOM-loaded time of the initial page would be:

$(document).ready(function () {
  // Seconds from navigation start to DOM ready (initial page load only)
  console.log((Date.now() - performance.timing.navigationStart) / 1000);
});

To overcome this problem, you will need to use custom metrics. The solution has three steps.

1) Set up a custom metric in GA.

Go to Admin > Property > Custom Definitions > Custom Metric.

Create a new Custom Metric with the scope of Hit and the formatting type of Time. Note: the value is specified in seconds, but it appears as hh:mm:ss in your reports.

2) Set up a timer.

You will need to capture the time when you want to start the measurement of page load time.

An example solution might be to attach a click listener to all of your internal links. For example, in Google Tag Manager we could set up a Custom HTML tag:

<script>
// Start a timer whenever an internal link is clicked.
// time1 is created as a global so it can be read later by the
// Custom JavaScript variable that computes the load time.
$('a[href*="firstdigital.co.nz"]').click(function () {
  window.time1 = Date.now();
});
</script>

3) Send the elapsed time (in seconds) to Google Analytics on the virtual pageview event.

When the virtual pageview event occurs (i.e. the event which triggers your virtual pageviews), retrieve the difference between the current time (Date.now()) and the time at which the timer was started (time1).

Using Google Tag Manager, a Custom JavaScript variable (e.g. SPA load time) can be created as below:

function () {
  // Seconds elapsed since the last internal link was clicked
  return (Date.now() - time1) / 1000;
}

This value then needs to be sent with the pageview, against the custom metric index set up in step 1.

SPA pageview example
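For reference, here is a minimal sketch of the equivalent hit in plain analytics.js – the metric index (1) and the virtual page path are assumptions, and in GTM you would instead set the custom metric on the pageview tag itself:

// Hypothetical example: send a virtual pageview with the SPA load time as metric1
var spaLoadTime = (Date.now() - time1) / 1000;
ga('send', 'pageview', '/virtual/flight-info', {
  'metric1': Math.round(spaLoadTime)
});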

Using the custom metric along with calculated metrics (e.g. {{virtualPageTimings}}/{{pageViews}}), you will be able to calculate your average virtual page timings.

Bonus:

To make the measurement more accurate, set up a secondary custom metric to count the number of virtual pageviews. This will make sure that landing pageviews are not taken into consideration.

To do this, create a custom metric with the scope of Hit and the formatting type of Integer.

Then, with every virtual pageview, send the value 1 against that custom metric index. E.g.:

SPA page load speed 2
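In the analytics.js sketch above, that would simply mean adding a second field (again, the metric index 2 is an assumption):

// Hypothetical example: also count this hit as one virtual pageview via metric2
ga('send', 'pageview', '/virtual/flight-info', {
  'metric1': Math.round(spaLoadTime),
  'metric2': 1
});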

This allows for a calculated metric that divides the virtual page timings by the number of virtual pageviews (along the lines of {{virtualPageTimings}}/{{virtualPageviews}}, using whatever names you gave the two custom metrics).

 

Using this calculated metric will then give you a good idea of how long an SPA page took to load, from the time a user clicked on a link through to that page.

This is just one of the things that needs to be taken into consideration when working with single page apps. Feel free to reach out if you require any assistance tracking your SPA.

Ramp Up Results with Research Based Conversion Optimisation

published by on 19th June 2018 under Conversion (CRO), General

The true benefit of Conversion Rate Optimisation (CRO) is all about gaining knowledge and optimising every single aspect of your product and customer journey. Having an ongoing structured CRO process that is part of the DNA of your organisation can greatly magnify the benefits of all other marketing activities.

The best way to truly optimize your funnel and see real growth via CRO is by ensuring it has sufficient scope in your marketing strategy and treating it as a means to optimize every part of your business.

 

CRO-results

A successful CRO program allows you to develop a deep understanding of what challenges your customers face, how to talk to them and what drives their decision-making process. This means not only more subscriptions, leads or sales but it can optimize your ad campaigns, your acquisition funnels, retention emails, billing processes and even how your products are shipped.

 

Three Things You Are Doing Wrong in CRO

 

There are many common fallacies that we encounter when clients engage with FIRST Digital. Often, clients have previously run CRO with mixed results and are at a loss to explain why. Usually, they are making one or more of the following errors:

 

No. 1: You are missing the point of a structured CRO program.

You simply treat conversion optimization as a bunch of random tactics to get more signups, downloads or sales.

 

No. 2: You don’t have a CRO strategy.

You follow what is referred to as ‘best practices’ from blog posts, listen to random advice and make guesses as to what should be tested. Best practice is what people do until they discover something better – then that becomes best practice. Be the one to discover a better way.

 

No. 3: You are running meaningless tests.

You use guesswork and run variations on single elements on a page, expecting them to yield the desired results. Big results require more than single-element changes.

If you are thinking “Guilty as charged, your honour…”, we’ve all been there – but you can make a positive change, and the sooner the better: potential revenue is leaking out of your funnel right now.

 

CRO conversion optimization

 

Solution – Create a structured CRO program

 

There are a number of steps that should be performed well before launching tests, and that should be executed by all organisations no matter what size they are, or how much traffic their site generates.

 

Step 1: In-depth Qualitative & Quantitative Research

Focus on discovering why leaks exist – what’s preventing your potential customers from taking the next step – by analyzing competitors, building customer profiles, segmenting your audience, running customer surveys and much more. This is the only way to consistently deliver better results. The insights you gain from this research are priceless and will start shifting things within the company well before you launch any experiments.

Analyze the data around current behavior to pinpoint problem areas. It’s important to look at heat maps and recordings to see where customers are frustrated and where your current funnel needs optimizing. This step is crucial for any CRO process and requires a great deal of curiosity and in-depth analysis.

 

Step 2: Hypothesize

Based on the previous step, at this point you come up with hypotheses about the issues you have found and possible solutions for them. In this stage, you define how you will optimize and fix these issues. Much like a legal trial, you use statistics to prove that the variations you are proposing are ‘guilty as charged’ of causing the relative uplift you observe.

 

Step 3: Prioritize

Once you’ve found areas of opportunity, you then need to determine how much of an impact changing them would have on your bottom line and how many resources you will need to build the test. This will help you decide if it’s worth your time and effort.

 

Step 4: Launch Test Program

Only once you’ve completed all the previous steps do you then launch tests to validate your hypotheses, and ongoing research forms a virtuous cycle of test ideas and more potential big wins.

Your goal is to gain as much insight, knowledge and data about your customers as possible, so you can truly optimize your site around their journey.

 

 That Sounds Great – How do I get Started?

 

To learn more about getting a CRO program underway, research, CRO training & the best tools to use, or anything else discussed in this article – get in touch

 

 

 

Related posts

GDPR: Scope, Consent & Impact on (Analytics) Tags

published by on 9th May 2018 under Digital Strategy, Digital Trends, General, etc

It’s coming and, let’s face it, whilst we’ve been warned since 2016, very few started thinking about it more than 6 months ago, if not later… There are a lot of resources out there trying to translate what the GDPR means and how it will impact businesses’ organisation and data management, but very few manage to give clear and practical guidance. This post is an attempt to provide concrete answers about the user consent mechanism and its impact on web analytics. Of course, the usual caveat applies: “I’m not a lawyer and you should seek legal advice to assess your own situation and blablabla”… But even if you seek legal advice, you may end up knowing more about the GDPR than your legal adviser, especially here in NZ!

 

 Step 1: Do You Need To Comply With GDPR?

There are two conditions to meet to fall under the GDPR:

1- Do you actively target EU-residents?
2- Do you collect personal data, including pseudonymous data?

If you answer “no” to either question, then you’re in luck. Otherwise, sorry, but you’ll need to go under the hood of your data collection and management processes and adjust them…

 

1. Do you actively target EU-residents?

There has been a misconception that if you don’t process or collect data inside the EU, you are not affected by the GDPR. This is partially true (or wrong, depending on your mood…). The GDPR applies to any site that collects data about data subjects who are physically in the Union, regardless of the location of the data controller (i.e. the site) and/or processors. Given that you can’t really control who is visiting your site, virtually any site is potentially impacted by the GDPR. However, the Regulation looks at how actively the site is targeting EU residents to determine whether it falls under the GDPR.

Recital 23: “In order to determine whether such a controller or processor is offering goods or services to data subjects who are in the Union, it should be ascertained whether it is apparent that the controller or processor envisages offering services to data subjects in one or more Member States in the Union. [...] factors such as the use of a language or a currency generally used in one or more Member States with the possibility of ordering goods and services in that other language, or the mentioning of customers or users who are in the Union, may make it apparent that the controller envisages offering goods or services to data subjects in the Union.”

So for instance, if you offer payments in Euros and/or specific shipping rates to European countries (including the UK), you explicitly target your offer to EU residents and therefore must abide by the GDPR. The same goes for advertising campaigns specifically targeting a Member State of the Union.

The same applies if the site doesn’t sell anything but still monitors users’ behaviour (see recital 24).

 

2. Do you collect personal data, including pseudonymised data?

You may think that you are safe because you don’t collect personal data but the GDPR has a rather broad definition of what constitutes personal data:

Recital 26: “Personal data which have undergone pseudonymisation, which could be attributed to a natural person by the use of additional information should be considered to be information on an identifiable natural person. [...] To ascertain whether means are reasonably likely to be used to identify the natural person, account should be taken of all objective factors, such as the costs of and the amount of time required for identification, taking into consideration the available technology at the time of the processing and technological developments. The principles of data protection should therefore not apply to anonymous information, namely information which does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable.”

Recital 30: “Natural persons may be associated with online identifiers provided by their devices, applications, tools and protocols, such as internet protocol addresses, cookie identifiers or other identifiers such as radio frequency identification tags. This may leave traces which, in particular when combined with unique identifiers and other information received by the servers, may be used to create profiles of the natural persons and identify them.”

This means that IP addresses and Client IDs (created by the _ga Google Analytics cookie) are considered personal data by extension, even if you don’t have direct access to these data as they do not surface in GA reports.

One way to address this is to mask IP addresses using the IP Anonymization feature. This would make the identification of a specific subject much harder. Note that you will need to either remove or adjust your current IP-based filters in GA accordingly.
Then, for users who do not consent to having their data collected for Analytics purposes, you can still track their session by setting the _ga cookie to expire at the end of the session (or even setting no cookie at all), which makes their identification impossible (the Client ID becomes truly anonymised), unless they provide pseudonymised personal data, like a User ID – but in that case you will need to ask for their consent for the Personalisation purpose anyway. And remember that GA doesn’t allow you to collect any actual personal data (e.g. name, email, phone number, address, etc.).
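As a minimal sketch with analytics.js (the tracking ID is a placeholder and the same fields can be set via GTM), the tracker for users who decline Analytics consent could be configured along these lines:

// Hypothetical setup for non-consenting users: session-only cookie and anonymised IPs
ga('create', 'UA-XXXXXXX-1', 'auto', { 'cookieExpires': 0 }); // 0 = session cookie
ga('set', 'anonymizeIp', true); // mask the last octet of the IP address before storage
ga('send', 'pageview');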

By now, you should have a better idea on whether or not you need to comply with the GDPR.
If so, you will have to get consent from users.

 

Step 2: Create a Consent Mechanism

During my research, I haven’t been able to find a consent modal design that would fully meet the GDPR requirements, i.e.:

  • Be granular by data collection purpose (I’ve seen an example where all data processors were listed, which would end up as a very long list for many sites)
  • Offer free choice, without the opt-in/opt-out button being preselected
  • Explain in plain English what the data processing is for, by purpose
  • And most importantly, address the inter-dependency issue between some data processing purposes

Piwik Pro and Tealium both offer built-in consent management solutions but they fail to fully address these points.

Tealium Consent Manager:

Consent Preferences Manager from Tealium

Tealium proposes a highly customisable consent manager, but I’m not sure whether you can customise each tag’s settings according to the consent level. It looks like all analytics tags would be disabled if consent for Analytics is not given, which would be unnecessary if you could configure them to only collect anonymised data. And this doesn’t really address the inter-dependency issue (not that there is any between the purposes shown in the example above).
As a side note, I would recommend asking for consent for the Social Media purpose only when users first click to share on social media, and providing an option to save their selection (i.e. creating a specific cookie for social sharing consent).

 Piwik Pro Consent Manager:

Consent form from Piwik Pro

I like the “Agree to all” button, but I’m not sure how it addresses dependencies between data processing purposes. And I imagine the saving option should not be available if at least the Analytics purpose is not enabled, since by definition you are not allowed to store any data if no consent is given.

Adobe Dynamic Tag Management (soon to be revamped to “Launch”) doesn’t provide a ready-to-use consent management solution but you can add extensions such as TrustArc’s or Evidon’s (not free though).

As for Google Tag Manager, there is no built-in consent manager either, but you could easily inject a custom HTML tag as described in this post from AnalyticsMania.

Google has also updated its information portal for publishers and advertisers, suggesting different solutions to add a consent function to your site.

Still, the best solution would be to create your own customised consent modal, as no two situations are the same…

Below is my attempt to create a consent modal for a first-time user on a landing page:

FIRST consent modal
The “More details” link would redirect to the privacy policy page, which describes each purpose in more detail and lists all data processors. To address the inter-dependencies between purposes, the Analytics and Personalisation purposes would be enabled automatically if the Personalisation and Advertising purposes are enabled, respectively.

 

Step 3: Adjust Your Data Collection According to User’s Consent Level

The best article I’ve read so far about this is the excellent post by Jente De Ridder from Humix, who proposes a practical solution to make GA GDPR-compliant via GTM. It’s logical and well written, and probably answers most web analysts’ questions.

The diagram I propose below is largely inspired by Jente’s solution, with a few additional notes on the granular consent mechanism.
For instance, one requirement of the GDPR is to “be able to demonstrate that the data subject has given consent to the processing operation” (recital 42). Therefore, you can send an event to GA to pass the consent level. This event will then be very useful to distinguish “new users” at consent level 0 (by definition, all users with consent level 0 are new users) from other “usual” new users.
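As a sketch of what this could look like with GTM (the event and variable names below are made up for illustration; a trigger on this event would then fire a GA event tag carrying the consent level):

// Hypothetical dataLayer push recording the consent level chosen by the user
window.dataLayer = window.dataLayer || [];
window.dataLayer.push({
  'event': 'consentGiven',
  'consentLevel': 2 // e.g. 0 = none, 1 = analytics only, 2 = + personalisation, ...
});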

Additionally, in the GA UI you will need to adjust your Data Retention period to a defined period, as required by the GDPR:

Recital 39: “The personal data should be adequate, relevant and limited to what is necessary for the purposes for which they are processed. This requires, in particular, ensuring that the period for which the personal data are stored is limited to a strict minimum.”

I hope this at least helps you visualise the challenge, and one possible solution, more easily.
 

GDPR consent mechanism

 

Closing Thoughts

I believe we will certainly see a decrease in the audience size of (remarketing) advertising campaigns, but with good wording we can end up with a win-win situation: the user understands why we collect data and how it may benefit them, while the brand shows that it respects data privacy and may improve its advertising campaigns with a better-qualified audience and smarter incentives.

Ultimately, even if you are not directly impacted by the GDPR (e.g. you don’t actively target EU residents in the Union), I would still encourage you to proactively think about how you collect, store and share users’ data (especially if EU citizens constitute a significant share of your audience – think about immigration-related sites, job listings, real estate sites, etc.), as New Zealand will most likely cooperate with the EU and update its privacy regulation to get closer to the GDPR. Showing that you have a strong privacy policy can also be perceived as a competitive advantage by users, who are more and more vigilant about how their data is being collected, used and shared…

Some Resources: