Navattic Internal Experiment: A/B Testing Overview vs Segmented Demos

If your product serves multiple personas or use cases, deciding which interactive demo to feature on your homepage can be tricky.

At Navattic, we faced the same challenge. Last year, I ran an internal test on our main homepage demo to see what worked best.

We compared our current overview demo with a more personalized version that lets visitors choose a sales, marketing, or product persona-based demo to explore.

Compared to the overview demo, the persona-specific demos saw:

  • +45% lift in visitors who submitted our book-a-demo form
  • 6.3x improvement in the number of MQLs
  • Roughly 2x lift in demo completion

This year, we wanted to see if persona-specific demos still outperformed overview demos, so we reran the experiment using our new A/B testing functionality.

Below, I break down exactly how we designed each experiment, the results we got, and how to replicate A/B tests like these yourself.

A/B test setup: Overview vs. persona segmented demos

For this experiment, we conducted two different A/B tests.

We wanted to test multiple formats for both segmented demos (by persona) and overview demos to see if different variations of each would impact click-through rate (CTR).

A/B Test #1: Comparing persona segmented demos

This test presented visitors with the option to choose their persona at the beginning of the demo. We tested two formats:

  • List of roles (sales, marketing, product)
  • Buttons to choose your role

A/B Test #2: Comparing overview demos

This test used an overview demo highlighting the main value props and use cases for Navattic. We tested two formats:

  • A shorter 13-step overview demo
  • A longer demo with a checklist (that showed off multiple Navattic features)

We ran these experiments for roughly 3 weeks total:

  • Each A/B test ran for 1.5 weeks
  • Each demo in the test received between 850 and 1,130 visitors

A/B test results

We analyzed two things:

  • Which version in each test performed better
  • How segmented demos, on average, compare to overview demos

We measured success by demo CTR, which for us is a click on our CTA to either book a demo or sign up for our free plan.
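To make that concrete, the arithmetic is just CTA clicks divided by unique demo visitors. Here's a minimal sketch; the counts below are made up for illustration, not our actual numbers:

```python
# Demo CTR = CTA clicks / unique demo visitors.
# These counts are hypothetical, for illustration only.
cta_clicks = 250
unique_visitors = 1_000

ctr = cta_clicks / unique_visitors
print(f"Demo CTR: {ctr:.1%}")  # -> Demo CTR: 25.0%
```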

Segmented demos outperformed both types of overview demos:

  • 25% average CTR for segmented demos
  • 18% CTR for the short overview demo
  • 15% CTR for the longer checklist demo

Overall, we found segmented demos had a:

  • 33% better click-through rate than the shorter overview demo
  • 50% better click-through rate than the longer checklist demo
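If you want to sanity-check whether gaps like these are statistically meaningful at this traffic level, a two-proportion z-test is a reasonable back-of-the-envelope tool. The sketch below assumes roughly 1,000 visitors per demo (our arms actually ranged from 850 to 1,130) and back-solves click counts from the rounded CTRs above, so treat the output as illustrative rather than our exact analysis:

```python
from math import erf, sqrt

def two_proportion_z_test(clicks_a: int, n_a: int, clicks_b: int, n_b: int):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    pooled = (clicks_a + clicks_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Segmented (25% CTR) vs. short overview (18% CTR), ~1,000 visitors each.
z, p = two_proportion_z_test(clicks_a=250, n_a=1_000, clicks_b=180, n_b=1_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # -> z = 3.81, p = 0.0001
```

At those assumed counts, the gap clears conventional significance thresholds comfortably, but rerun the math with your own raw numbers before drawing conclusions.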

Between the two variations of segmented demos

We found the results between the two styling variations inconclusive. Both groups had similar engagement and high click-through rates, hovering around 25%.

According to our State of the Interactive Product Demo 2025, a 25% CTR would get us close to the top 10% of interactive demos built on Navattic.

[Image: Top 10% of interactive demo metrics, from the State of the Interactive Product Demo]

Between the short overview and the long checklist demo

We found that shorter demos had a slightly higher click-through rate:

  • Shorter overview demo had a ~18% CTR
  • Longer checklist demo had a ~15% CTR

Steps to replicate an interactive demo A/B test

Step #1: Create two variations of demos to test

Before you can start testing, you need to decide what you want to test (and build the corresponding interactive demos).

Based on the results of our experiment, we recommend pitting a short overview demo against one segmented by:

  • Persona
  • Use case or jobs to be done
  • Product line
  • Industry

Not quite sure how to segment your demos?

Check out: Implementing Interactive Demos for Multiple Product Lines.

Step #2: Set up an A/B test in Navattic

Once you’ve created the two demos you want to test:

1. Head to Settings → Demos (under Overview) → A/B Tests.

2. Set one demo as “A” and the other as “B.” Then, choose the traffic distribution you want for each demo (see the sketch after these steps for how a split like this typically works under the hood).

Below is an example of what a test would look like:

[Image: Navattic interactive demo A/B test split]

3. Let your experiment run, and track demo engagement and CTA clicks to see which demo brings in better leads.

If you need help, follow our step-by-step interactive walkthrough or reach out to success@navattic.com for assistance.
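Navattic handles the traffic split for you, so there's nothing to build. But if you're curious how a traffic distribution like the one in step 2 typically works, the common pattern is to hash a stable visitor ID into a bucket so the same visitor always sees the same variant. The sketch below is a generic illustration of that pattern, not Navattic's actual implementation; the function name and IDs are made up:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str, percent_a: int = 50) -> str:
    """Deterministically assign a visitor to variant A or B.

    Hashing the visitor ID together with the experiment name means a
    visitor keeps the same variant for a given test across visits.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # map the hash to a 0-99 bucket
    return "A" if bucket < percent_a else "B"

print(assign_variant("visitor-123", "homepage-demo-test"))  # stable across calls
```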

Step #3: Use Navattic analytics to measure CTR

Navattic Analytics automatically compares the two demos against each other in terms of performance, including click-through rate.

If you are trying to spot any other trends, you can use Analytics to measure the difference between the two demos in terms of:

  • Unique visitors (total visitors to step 1 of your demo)
  • Engaged visitors (unique visitors who progress past at least one step in your demo)
  • CTA clicks (total number of clicks on a CTA in your demo)
  • Steps viewed (total number of demo steps viewed across your workspace)
  • Time spent (how long the average visitor spends with your demo open in their browser)
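If you ever want to cross-check these rollups outside the dashboard (say, against a raw event export), the math is straightforward. The sketch below assumes a hypothetical list of event records with visitor_id, type, step, and seconds fields; the schema is made up for illustration and isn't Navattic's export format:

```python
# Hypothetical demo events; field names are illustrative, not Navattic's schema.
events = [
    {"visitor_id": "v1", "type": "step_view", "step": 1, "seconds": 12},
    {"visitor_id": "v1", "type": "step_view", "step": 2, "seconds": 30},
    {"visitor_id": "v1", "type": "cta_click", "step": 2, "seconds": 0},
    {"visitor_id": "v2", "type": "step_view", "step": 1, "seconds": 8},
]

unique = {e["visitor_id"] for e in events if e["type"] == "step_view" and e["step"] == 1}
engaged = {e["visitor_id"] for e in events if e["type"] == "step_view" and e["step"] > 1}
cta_clicks = sum(1 for e in events if e["type"] == "cta_click")
steps_viewed = sum(1 for e in events if e["type"] == "step_view")
avg_seconds = sum(e["seconds"] for e in events) / len(unique)

print(len(unique), len(engaged), cta_clicks, steps_viewed, f"{avg_seconds:.0f}s")
# -> 2 1 1 3 25s
```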

Note: While we saw that segmented persona demos had higher CTRs, we recommend trying this A/B test for your own audience.

Other A/B tests for interactive demos

Gated vs. ungated demos. When you gate your demos, you know exactly who your leads are and their level of intent. But not everyone is willing to give up their email, and ungated demos tend to get more engagement.

Read more about the pros and cons of each approach here.

Different modal intro copy. The words you use in your demo can make a big difference, hooking people in and keeping them engaged.

Here are some great tips for interactive demo copywriting.

Alternate CTAs. You might give visitors options to learn more — either in another demo or in your demo library. Or, you might want to push them toward a free trial.

Here’s our advice on how to end an interactive demo.

Curious about other experiments we’ve run at Navattic?

Read about how we retargeted leads on LinkedIn and nearly 7X’d our total pipeline ROI.
