The difference between an average email programme and a great one is not a single breakthrough — it is the accumulation of hundreds of small improvements made through systematic testing. A subject line that improves open rates by 3%. A CTA button colour that lifts click rates by 2%. A send time that increases conversions by 5%. Individually, these are marginal. Together, they compound into significantly higher revenue.

A/B testing in Klaviyo is straightforward to set up, but surprisingly few brands do it consistently. Most send their emails, check the results, and move on without ever testing what would have happened if they had done something differently. This is guesswork masquerading as strategy.

We run A/B tests on every campaign and flow we manage across our Klaviyo accounts, and the data consistently shows that tested, optimised emails outperform untested ones by 15-30% on key metrics. This guide covers how to set up, run, and interpret A/B tests in Klaviyo for both campaigns and flows.

Why A/B testing matters

A/B testing replaces assumptions with evidence. Instead of debating whether a short subject line works better than a long one, you test it. Instead of guessing the best send time, you measure it. The data tells you what actually works for your specific audience.

The compounding effect is what makes testing so powerful. If you run one test per week and each test improves performance by just 2-3%, after a year you have made 50+ improvements that collectively transform your email programme. This is why brands that test consistently outperform those that optimise sporadically.
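To see why, treat each improvement as a multiplier. This is an illustrative simplification with made-up numbers; real lifts will not compound perfectly, but the arithmetic shows the scale of the effect:

```python
# Illustrative only: if each weekly test produces a modest lift and the
# lifts compound multiplicatively, the year-end effect dwarfs any single test.
def compounded_lift(per_test_lift: float, tests_per_year: int = 50) -> float:
    """Return the cumulative multiplier after compounding small lifts."""
    return (1 + per_test_lift) ** tests_per_year

# A 2% lift per test, 50 tests a year:
print(f"{compounded_lift(0.02):.2f}x")  # prints "2.69x"
```

Even at a conservative 2% per test, fifty compounding wins roughly 2.7x the baseline, which is why consistency matters more than any single result.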

What you test | What improves | Revenue impact
Subject lines | Open rates | More people see your content
Email content and layout | Click rates | More people visit your store
CTA buttons | Click-through rates | More people take action
Send times | Engagement rates | Better inbox placement and response
Discount amounts | Conversion rates | Higher revenue per email
Flow timing | Flow conversion | More automated revenue
[Image: Small improvements from weekly A/B testing compound into significant revenue gains over time.]

What to A/B test in Klaviyo

Klaviyo supports A/B testing across campaigns and flows. Here is what you can test in each:

Campaign A/B tests

  • Subject line — the single highest-impact variable for open rates
  • Preview text — the snippet that appears after the subject in the inbox
  • From name — brand name versus founder name versus team member name
  • Email content — different layouts, copy, images, or product selections
  • Send time — morning versus afternoon, weekday versus weekend

Flow A/B tests

  • Email content — different messaging approaches, layouts, or offers
  • Timing — different delay periods between flow steps
  • Number of emails — 2-email sequence versus 3-email sequence
  • Offer type — percentage discount versus free shipping versus no offer
  • Channel — email only versus email plus SMS

Step 1: Set up a campaign A/B test

  1. In Klaviyo, go to Campaigns > Create Campaign
  2. Select your recipient list or segment
  3. Click Create A/B Test
  4. Choose what to test: subject line, from name, or content
  5. Create your two variations (A and B)
  6. Set the test sample size (typically 20-30% of the total audience)
  7. Choose the winning metric (open rate for subject lines, click rate for content)
  8. Set the test duration (2-4 hours for subject lines, 4-8 hours for content)
  9. Schedule or send the campaign

Klaviyo sends variation A and variation B to equal portions of your test sample. After the test duration, it automatically identifies the winner and sends it to the remaining audience.
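The split mechanics above can be sketched in code. This is an illustrative simulation of the sample-then-winner logic, not Klaviyo's actual implementation; the 25% sample fraction is the example from the table below:

```python
import random

def run_campaign_test(audience: list, sample_frac: float = 0.25):
    """Sketch of campaign A/B mechanics: send A and B to equal halves
    of a random sample, then the winner goes to the remainder."""
    random.shuffle(audience)           # random assignment avoids bias
    sample_size = int(len(audience) * sample_frac)
    half = sample_size // 2
    group_a = audience[:half]
    group_b = audience[half:sample_size]
    remainder = audience[sample_size:]  # receives the winning variation later
    return group_a, group_b, remainder

a, b, rest = run_campaign_test(list(range(10_000)))
print(len(a), len(b), len(rest))  # prints "1250 1250 7500"
```

With a 10,000-person list and a 25% sample, each variation reaches 1,250 people and the winner is then sent to the remaining 7,500.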

Subject line test example

Element | Variation A | Variation B
Subject line | "New arrivals just dropped" | "[First Name], your new favourites are here"
Hypothesis | Simple and direct | Personalised and curiosity-driven

Winning metric: Open rate
Test duration: 4 hours
Sample size: 25% (split equally)
[Image: Klaviyo's campaign A/B test sends variations to a sample, then automatically sends the winner to the rest.]

Step 2: Set up a flow A/B test

Flow A/B tests work differently from campaign tests. Instead of splitting a single send, they split the flow path so that each recipient is randomly assigned to one variation.

  1. Open the flow you want to test in Klaviyo's flow builder
  2. Drag an A/B Test Split component into the flow
  3. Configure the split percentage (typically 50/50)
  4. Create different paths for each variation
  5. Set the flow to Live
  6. Allow 2-4 weeks for sufficient data to accumulate

Flow test example: cart abandonment timing

  • Path A: First email 1 hour after cart abandonment
  • Path B: First email 4 hours after cart abandonment
  • Metric: Revenue per recipient over 7 days
  • Duration: 3-4 weeks for sufficient data

For more on optimising flows, see our essential Klaviyo flows guide and our flow optimisation guide.

Step 3: Choose the right metric

Choosing the wrong success metric is one of the most common testing mistakes. Here is which metric to use for each test type:

Testing | Primary metric | Why
Subject lines | Open rate | Subject lines primarily influence whether people open
Preview text | Open rate | Preview text affects the open decision
Email content | Click rate | Content quality determines whether people click
CTA buttons | Click rate | Button design and copy affect clicking behaviour
Discount offers | Revenue per recipient | A higher click rate with a larger discount may yield less profit
Send time | Open rate or click rate | Timing affects both opens and subsequent engagement
Flow timing | Revenue per recipient | The ultimate measure of flow effectiveness

Always consider revenue per recipient as the ultimate success metric, even when your primary test metric is open rate or click rate. A subject line that gets more opens but fewer purchases is not truly winning.
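As a quick worked example with entirely hypothetical numbers, here is how a variation can win on opens while losing on revenue per recipient:

```python
def revenue_per_recipient(recipients: int, revenue: float) -> float:
    """Revenue attributed to the send, divided by recipients."""
    return revenue / recipients

# Hypothetical results: B wins on opens but loses on revenue.
a = {"recipients": 5000, "opens": 1400, "revenue": 3100.0}
b = {"recipients": 5000, "opens": 1650, "revenue": 2750.0}

for name, v in (("A", a), ("B", b)):
    rpr = revenue_per_recipient(v["recipients"], v["revenue"])
    print(f"{name}: open rate {v['opens'] / v['recipients']:.1%}, "
          f"revenue/recipient £{rpr:.2f}")
# A: open rate 28.0%, revenue/recipient £0.62
# B: open rate 33.0%, revenue/recipient £0.55
```

Here B's subject line drives five points more opens, yet A earns 7p more per recipient, so A is the variation worth keeping.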

Step 4: Interpret your results

Statistical significance

Klaviyo indicates when a result is statistically significant — meaning the difference between variations is unlikely to be due to random chance. Do not declare a winner based on small differences that are not statistically significant. A 0.5% difference in open rates on a sample of 500 people is noise, not signal.
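You can sanity-check a result yourself with a standard two-proportion z-test (general statistics, not Klaviyo's internal method). With 500 recipients per variation, a sub-point open-rate gap falls far short of the roughly 1.96 z-score needed for 95% confidence:

```python
import math

def two_proportion_z(opens_a: int, n_a: int, opens_b: int, n_b: int) -> float:
    """Z-score for the difference between two open rates (pooled variance)."""
    p_a, p_b = opens_a / n_a, opens_b / n_b
    p_pool = (opens_a + opens_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# 25.0% vs 25.6% open rate on 500 recipients each:
z = two_proportion_z(125, 500, 128, 500)
print(round(z, 2))  # prints "0.22" -- nowhere near significant
```

A z-score of 0.22 means a gap this size arises from random chance most of the time; you would need either a far larger gap or far more recipients to call it a winner.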

Practical significance

Even statistically significant results may not be practically significant. A subject line that improves open rates by 0.3% is statistically real but operationally meaningless. Focus your testing efforts on variables that can produce meaningful improvements — typically 2%+ on open rates or 1%+ on click rates.

Recording your results

Keep a testing log that records every test: what you tested, the hypothesis, the results, whether it was significant, and what you learned. Over time, this log becomes a knowledge base of what works for your specific audience. For more on CRO approaches, see our CRO guide.
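A testing log can be as simple as an append-only CSV. This sketch shows one way to structure it; the field names are a suggestion, not a standard:

```python
import csv
import os

# Suggested columns for a testing log (adapt to your own workflow).
FIELDS = ["date", "test_type", "hypothesis", "variation_a", "variation_b",
          "winner", "lift", "significant", "learning"]

def log_test(path: str, row: dict) -> None:
    """Append one test result to a CSV log, writing the header on first use."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)
```

A spreadsheet works just as well; the point is that every test gets recorded in the same place with the same fields, so patterns emerge over time.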

[Image: Wait for statistical significance before declaring a winner — small differences may be due to random variation.]

Step 5: Apply and iterate

Testing is only valuable if you act on the results. Here is the process:

  1. Record the result — what won and by how much
  2. Apply the learning — update your templates, flows, and playbook based on the winner
  3. Generate new hypotheses — what else could you test based on this learning?
  4. Run the next test — maintain a continuous testing cadence

Building a testing roadmap

Prioritise tests by potential impact and ease of implementation. Subject line tests are easy to run and have immediate impact. Content layout tests require more effort but can drive larger improvements. Discount tests are high-impact but require careful margin analysis.

High-impact test ideas

Subject line tests

  • Personalised (with first name) versus non-personalised
  • Question format versus statement format
  • Emoji versus no emoji
  • Short (under 30 characters) versus long (40-60 characters)
  • Urgency-driven versus curiosity-driven
  • Product-specific versus category-level

Content tests

  • Single product focus versus multi-product grid
  • Text-heavy versus image-heavy layout
  • Social proof (reviews, testimonials) versus no social proof
  • Top CTA placement versus bottom CTA placement
  • GIF hero image versus static hero image

Offer tests

  • Percentage discount versus fixed amount discount
  • Free shipping versus percentage discount
  • Discount versus free gift with purchase
  • Immediate discount versus delayed discount (next purchase)

Flow tests

  • Cart abandonment: 1-hour delay versus 4-hour delay
  • Welcome series: 3 emails versus 5 emails
  • Win-back: discount in Email 2 versus Email 3
  • Post-purchase: cross-sell immediately versus after 7 days
[Image: Prioritise tests by potential impact — subject lines and offer types typically have the highest ROI.]

Advanced testing strategies

Multivariate testing

Once you have exhausted single-variable tests, consider testing multiple variables simultaneously. For example, test four combinations: short subject + image-heavy content, short subject + text-heavy content, long subject + image-heavy content, long subject + text-heavy content. This requires a larger audience but reveals interaction effects between variables.
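The combinations for a full-factorial test like this can be enumerated programmatically; with two two-level variables you get four variants, and the count doubles with each additional binary variable:

```python
from itertools import product

# Full-factorial combinations for the multivariate example above.
subject_lengths = ["short subject", "long subject"]
layouts = ["image-heavy", "text-heavy"]

variants = [f"{s} + {l}" for s, l in product(subject_lengths, layouts)]
for v in variants:
    print(v)
# short subject + image-heavy
# short subject + text-heavy
# long subject + image-heavy
# long subject + text-heavy
```

Because the variant count grows multiplicatively, each added variable demands a proportionally larger audience to keep per-variant sample sizes meaningful.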

Segment-specific testing

What works for one segment may not work for another. Run the same test across different segments (VIP versus new customers, male versus female, high-AOV versus low-AOV) to discover segment-specific preferences. This allows you to tailor your approach to each audience.

Seasonal testing

Customer behaviour changes throughout the year. Subject lines that work in January may not work during Black Friday. Run tests during different seasons to build a nuanced understanding of what drives engagement at different times of year.

Testing automation

Create a systematic testing calendar that ensures you are always running at least one test. Assign specific test types to specific weeks: Week 1 — subject line test, Week 2 — content test, Week 3 — send time test, Week 4 — offer test. This prevents testing from falling off your radar during busy periods.

Common mistakes to avoid

1. Testing too many variables at once

If you change the subject line, the hero image, the CTA text, and the discount all at once, you have no idea which change drove the result. Test one variable at a time. If variation B wins, you know exactly why.

2. Declaring winners too early

Checking results after 30 minutes and declaring a winner is a recipe for false positives. Wait for the full test duration and check for statistical significance before drawing any conclusions.

3. Not testing at all

The most common mistake is simply not running A/B tests. Every email you send without testing is a missed opportunity to learn something about your audience. Even if a test is inconclusive, the data is valuable.

4. Ignoring losing tests

A test where variation B loses is just as informative as one where it wins. Record what did not work and why you think it failed. These negative results prevent you from repeating the same mistakes.

5. Sample sizes too small

Testing with 200 people per variation produces unreliable results. Ensure each variation reaches at least 1,000 recipients for campaign tests. If your list is too small for meaningful A/B tests, focus on other optimisation strategies until your list grows.
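If you want a rough feel for the numbers, the standard sample-size formula for detecting an absolute lift in a rate (again, general statistics rather than anything Klaviyo-specific) can be sketched as:

```python
import math

def sample_size_per_variation(p_base: float, lift: float,
                              z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Approximate recipients needed per variation to detect an absolute
    lift in a rate at ~95% confidence and ~80% power."""
    p_var = p_base + lift
    variance = p_base * (1 - p_base) + p_var * (1 - p_var)
    return math.ceil((z_alpha + z_beta) ** 2 * variance / lift ** 2)

# Detecting a 2-point lift on a 25% open rate:
print(sample_size_per_variation(0.25, 0.02))  # roughly 7,500 per variation
```

This is why small lists struggle with A/B testing: reliably detecting a two-point open-rate lift takes thousands of recipients per variation, and detecting smaller lifts takes far more.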

[Image: Test one variable at a time, wait for statistical significance, and record every result in your testing log.]

A/B testing is the discipline that separates email marketers who guess from those who know. Every test you run adds another data point to your understanding of what your specific audience responds to — and that knowledge compounds over time into a significant competitive advantage.

Start with subject line tests — they are the easiest to set up and have the most immediate impact on open rates. Then expand to content tests, offer tests, and flow tests. Within a few months, you will have a library of insights that informs every email you send.

The brands that test consistently do not just outperform — they outperform by an increasing margin, because every test builds on the last. Start testing today.

Need help setting up a systematic testing programme in Klaviyo? See our Klaviyo services or get in touch for a free account audit.