A/B tests allow you to deliver experiences to a random subset of users and compare the results. A/B tests can be an effective strategy for determining if your in-app experiences are driving product adoption and user engagement. They unlock the ability to compare the behavior and progress of users who received your experiences to users who didn't.
Run an A/B test
A/B tests can be run for any tour or announcement that is published on your site.
- From within the Save & Publish section of your experience, enable the Run an A/B Test option.
- Save and publish your experience.
Once published, your experience will begin randomly assigning users to Group A, who will see the experience, and Group B, who won't. You can leave the test running for as long as you'd like, but we typically recommend 2 weeks in order to collect enough data to draw strong conclusions. When you're ready to end the test, simply toggle the switch off and save your experience.
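Random assignment like this is commonly implemented by hashing a stable user ID into a bucket, so each user lands in the same group on every visit. Here's a minimal sketch of that general technique, not the product's actual implementation; the `user_id` and `experience_id` values are hypothetical:

```python
import hashlib

def assign_group(user_id: str, experience_id: str) -> str:
    """Deterministically bucket a user into Group A or Group B.

    Hashing (experience_id + user_id) keeps the assignment stable
    across sessions while still splitting users roughly 50/50.
    """
    digest = hashlib.sha256(f"{experience_id}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100
    return "A" if bucket < 50 else "B"

print(assign_group("user-123", "feature-tour"))
```

Because the bucket is derived from the ID rather than drawn fresh each session, a user who refreshes the page or returns later stays in the same group for the life of the test.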
Our recommendations for running a successful A/B test
- Select an experience that is meant to enhance the user journey, but not one that is vital to the usability of your platform.
Ex. Announcing an underused feature that users seem to be missing
- Set a goal for the experience that you're A/B testing. Our code-free goal options make setting a goal as simple as selecting a button that you want users to click or a page for them to reach after viewing the experience.
- Keep the test running for roughly 2 weeks. It can be difficult to accurately interpret the results of A/B tests without enough data.
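The "enough data" point can be made concrete with a standard two-proportion sample-size estimate: the smaller the lift you hope to detect, the more users each group needs, which is why short tests are hard to interpret. The baseline and lift figures below are illustrative assumptions, not product defaults:

```python
import math

def sample_size_per_group(p_base: float, p_target: float) -> int:
    """Approximate users needed per group to detect a lift from
    p_base to p_target with a two-sided two-proportion z-test
    at 95% confidence and 80% power."""
    z_alpha = 1.96   # two-sided 95% confidence
    z_beta = 0.84    # 80% power
    p_bar = (p_base + p_target) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p_base * (1 - p_base)
                                      + p_target * (1 - p_target))) ** 2
    return math.ceil(numerator / (p_target - p_base) ** 2)

# e.g., detecting a lift from a 10% to a 13% goal-completion rate
print(sample_size_per_group(0.10, 0.13))
```

With these example numbers, each group needs well over a thousand users, so a test needs to run long enough for that much traffic to accumulate.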
Analyze the results of an A/B test
Once the A/B test is running, you'll begin to see its data populate on the experience detail page of the Builder dashboard. Select the experience that's running the test to open its analytics dashboard.
On this analytics dashboard you'll be able to see a clear separation between the behavior of users who viewed your experience (Group A) and users who didn't (Group B). The available metrics include:
- Total Views: The total number of times the experience was viewed by users in each group.
- Total Completions | Took Same First Action: The number of users in Group A who completed the experience from start to finish versus the number of users in Group B who took the same first action that the experience would have directed them toward.
- Time Spent Viewing: The average time that users in Group A took to complete or exit the experience versus the average time that users in Group B took before performing a first action.
If a goal is set for your experience, you gain access to additional metrics. These metrics can be very helpful in determining whether your experience is effective at driving users toward a key activity. They include:
- Total Users: The total number of users assigned to each group.
- Total Goal Completions: The total number of users in each group who completed the goal associated with the experience.
- Time To Complete Goal: The average time that users in each group took to complete the goal.
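When comparing Total Goal Completions between the groups, a two-proportion z-test is a common way to check whether the gap is statistically meaningful rather than noise. This is a general-purpose sketch with made-up counts, not figures from the dashboard:

```python
import math

def two_proportion_z(success_a: int, total_a: int,
                     success_b: int, total_b: int) -> float:
    """Return the z statistic for the difference in goal-completion
    rates between Group A and Group B, using a pooled standard error."""
    p_a = success_a / total_a
    p_b = success_b / total_b
    p_pool = (success_a + success_b) / (total_a + total_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / total_a + 1 / total_b))
    return (p_a - p_b) / se

# Hypothetical: 130 of 1,000 Group A users completed the goal
# versus 100 of 1,000 in Group B.
z = two_proportion_z(130, 1000, 100, 1000)
print(round(z, 2))  # |z| > 1.96 suggests significance at the 95% level
```

If the statistic clears the 1.96 threshold, the higher completion rate in Group A is unlikely to be chance, which supports the conclusion that the experience is driving the goal.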