
What you need

To test preezie’s impact you will need to be able to track two things:

...

Because preezie won’t be used by all users, you need to be able to compare those who use it with those who saw it but did not use it. This difference is your measure of preezie’s impact.

Depending on what goals you have, you can either use a split traffic tool (Google Optimize, VWO, Optimizely etc.) or track at a user level over date periods.

...

Why we need to segment by usage

A simple comparison is an AB test of the impact of showing Afterpay/Klarna in your checkout. Although this feature is available to all checkout users, we expect it will only impact a percentage of them.

...

This is the same for preezie: depending on where it is shown, a percentage of your users won’t be impacted at all, so you need to analyse the behaviour only of those who do engage with it.

...

How we define usage

A preezie user is one who clicks at least once on a question. You can also track those who ‘complete’ a preezie journey, i.e. saw their product recommendations.

To do this you will need to ensure your analytics/testing tool can track preezie views and click events. If you want to track preezie clicks in your Google Analytics (or other event-based tool), use this guide:

Data Layer for Google Analytics events
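As a rough sketch of what the data layer guide sets up, the snippet below pushes preezie view/click/complete events onto a GA-style `dataLayer` array and classifies a user as a "preezie user" (clicked at least once, per the definition above). The event and parameter names (`preezie_view`, `preezie_click`, `preezie_complete`, `journey`) are illustrative assumptions, not preezie’s actual schema — use the names from the guide.

```typescript
// Illustrative GA-style dataLayer events for preezie engagement.
// Event/parameter names here are assumptions; follow the official guide.
type DataLayerEvent = { event: string; journey?: string; question?: string };

// Stand-in for window.dataLayer (provided by the GA/GTM snippet on a real page).
const dataLayer: DataLayerEvent[] = [];

// User saw a preezie journey (impression).
function trackPreezieView(journey: string): void {
  dataLayer.push({ event: "preezie_view", journey });
}

// User clicked an answer to a question.
function trackPreezieClick(journey: string, question: string): void {
  dataLayer.push({ event: "preezie_click", journey, question });
}

// User reached their product recommendations (completed the journey).
function trackPreezieComplete(journey: string): void {
  dataLayer.push({ event: "preezie_complete", journey });
}

// "Preezie user" = clicked at least once on a question.
const isPreezieUser = (events: DataLayerEvent[]): boolean =>
  events.some((e) => e.event === "preezie_click");
```

With events shaped like this, your testing tool can build the three segments used below (saw, saw-but-did-not-click, clicked) by filtering on the event names.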

...

Using a split traffic AB test tool

...

Here we’ll track those who saw it and those who clicked, and compare each against its inverse. The main goal of an exit intent is to keep users on the website, so our goals are now:

| Split traffic | Page trigger | Segment | Goal A: Exit rate | Goal B: New user pages/session | Goal C: User conversion rate (sales per user) |
|---|---|---|---|---|---|
| 50% Test | any (shown based on exit intent behaviour) | 25% did not see preezie | 30% | 2.1 | 4.2% |
| 50% Test | any (shown based on exit intent behaviour) | 10% saw preezie and did not click it | 20% | 2.3 | 4.5% |
| 50% Test | any (shown based on exit intent behaviour) | 15% clicked on preezie | 6% | 4.3 | 6.8% |
| 50% Control | any (never shown) | 50% no preezie loaded | 30% | 2.2 | 4.1% |

Here you can compare your control bucket with those who did not see it to ensure the behaviour is the same across buckets (see article on A/A testing).

However, because you have a bucket of users who do not see it, you do not need an AB traffic-split tool to get this result. Just run the test until you are comfortable that the traffic volumes and the differences in goal metrics are significant enough to compare: no preezie vs preezie clicked (25% vs 15%) and no preezie click vs preezie click (10% vs 15%).
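To judge whether a difference in conversion rate between two segments is "significant enough", a standard two-proportion z-test is one option. The sketch below uses illustrative sample sizes (the conversion rates echo the 4.1% control vs 6.8% clicked figures from the table, but the user counts are assumptions):

```typescript
// Two-proportion z-test: is the conversion rate of segment 2 significantly
// different from segment 1? conv = number of converters, n = segment size.
function twoProportionZ(
  conv1: number,
  n1: number,
  conv2: number,
  n2: number
): number {
  const p1 = conv1 / n1;
  const p2 = conv2 / n2;
  const pooled = (conv1 + conv2) / (n1 + n2); // pooled conversion rate
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2));
  return (p2 - p1) / se;
}

// Illustrative numbers: 410/10000 (4.1%, no preezie) vs 204/3000 (6.8%, clicked).
const z = twoProportionZ(410, 10_000, 204, 3_000);

// |z| > 1.96 corresponds to p < 0.05 (two-tailed).
const significant = Math.abs(z) > 1.96;
```

Note this comparison is correlational (clickers self-select), which is why checking that the control bucket and the did-not-see bucket behave the same, as described above, matters before trusting the uplift.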

Tip

Make sure your primary goals reflect the behaviour preezie will drive. For example, you cannot expect same-session conversion rate to be driven immediately by an exit intent pop-up; instead, you can expect to keep more users engaged on your website and hopefully convert them at a later date (e.g. by ad retargeting, email incentives etc.).

Just like a retail store visit, if they have a positive first experience then they’ll come back!