Klaviyo & Email · March 26, 2026

How to A/B Test in Klaviyo Without Wasting 3 Months

Most Klaviyo A/B tests are useless. Here's what to test first, how to get valid results fast, and the mistakes that waste months of your time.

Mark Cijo

Founder, GOSH Digital

Here's a confession: most A/B tests in Klaviyo are a complete waste of time. Not because testing doesn't work — it does. But because brands test the wrong things, with too-small sample sizes, for too little time, and then make decisions based on noise instead of signal.

We run 30-50 A/B tests per month across 150+ Klaviyo accounts at GOSH Digital. We've learned — often the hard way — what's worth testing, what's a waste of time, and how to get statistically valid results without spending three months staring at a dashboard.

Here's the playbook.

The A/B Testing Hierarchy: What to Test First

Not all tests are equal. Some move revenue. Most don't. Here's the order we test in, ranked by impact:

Tier 1: High Impact (Test These Immediately)

1. Subject Lines

Subject lines determine whether your email gets opened or ignored. A 10% lift in open rate cascades into more clicks, more conversions, more revenue. It's the single highest-leverage test you can run.

What to test:

  • Personalization: "[First name], your cart is waiting" vs. "Your cart is waiting"
  • Length: Short (4-6 words) vs. long (8-12 words)
  • Curiosity vs. Direct: "You won't believe these results" vs. "New arrivals: 20% off today"
  • Emoji vs. no emoji: "📦 Your order is ready" vs. "Your order is ready"
  • Question vs. statement: "Ready for your next order?" vs. "Your next order is inside"

Our findings across 150+ accounts:

  • Personalized subject lines (with first name) lift open rates 5-15% in flows, less in campaigns
  • Short subject lines (under 40 characters) outperform long ones 65% of the time
  • Emojis increase open rates in about 55% of tests — it's brand-dependent
  • Questions outperform statements for re-engagement emails, but not for transactional ones

2. Offer/Discount in Welcome Flow

Testing the popup offer (and corresponding welcome email) has massive downstream effects because it impacts every person who enters your funnel.

What to test:

  • 10% off vs. 15% off vs. free shipping
  • Percentage discount vs. dollar amount ("$10 off" vs. "15% off" — depends on AOV)
  • Discount vs. free gift ("Free sample with your first order")
  • Coupon expiration: no expiration vs. 7-day expiration on a single-use code

Our findings:

  • For AOV under $75: Free shipping often beats percentage discounts
  • For AOV $75-$150: 10-15% off tends to win
  • For AOV over $150: Dollar-off ($20 off) often outperforms percentages (people perceive $20 off a $200 item as better than 10% off, even though 10% = $20)
  • Coupon expiration increases urgency. 7-day expiration beats no expiration by 15-25% in conversion rate.
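Before testing offers, it's worth sanity-checking what each one actually costs you per redeemed order. A minimal sketch in Python; the AOV, shipping cost, and offer values below are illustrative assumptions, not benchmarks from our accounts:

```python
def offer_cost(aov, offer):
    """Direct cost to you of one redeemed offer (illustrative only)."""
    kind, value = offer
    if kind == "percent":
        return aov * value / 100   # e.g. 10% of a $120 order
    if kind == "dollars":
        return value               # flat dollar discount
    if kind == "free_shipping":
        return value               # your average shipping cost per order
    raise ValueError(kind)

aov = 120.0  # assumed average order value
offers = {
    "10% off":       ("percent", 10),
    "15% off":       ("percent", 15),
    "$10 off":       ("dollars", 10),
    "free shipping": ("free_shipping", 7.50),  # assumed carrier cost
}

for name, offer in offers.items():
    print(f"{name:>13}: ${offer_cost(aov, offer):.2f} per redeemed order")
```

At this assumed AOV, free shipping is the cheapest offer to honor, which is why a free-shipping win (as in the beauty-brand example later) can improve margin as well as conversion.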

3. Send Time

When you send matters more than most people think. A 2-hour difference in send time can swing open rates by 10-20%.

What to test:

  • Morning (8-10 AM) vs. afternoon (1-3 PM) vs. evening (7-9 PM)
  • Weekday vs. weekend for campaigns
  • Time zone optimization (send at 10 AM in each recipient's time zone vs. single blast)

Our findings:

  • Tuesday-Thursday at 10 AM local time is the baseline winner for most brands
  • But the real answer depends on your audience. We've had supplement brands where 7 AM wins (people check email with their morning routine) and fashion brands where 8 PM wins (evening browsing).
  • Klaviyo's Smart Send Time feature uses machine learning to optimize per-recipient. We've seen it outperform fixed times by 5-12%. Turn it on: Campaign > Sending Options > Smart Send Time.

Tier 2: Medium Impact (Test After Tier 1)

4. Email Design/Layout

Once you've optimized subject lines and offers, test the email itself.

What to test:

  • Single-column vs. multi-column layout
  • Number of products shown (2 vs. 4 vs. 6)
  • Image-heavy vs. text-heavy
  • Single CTA button vs. multiple CTAs
  • Button color and text ("Shop Now" vs. "Get Yours" vs. "Claim Your Discount")
  • Plain text vs. branded HTML (especially for founder messages and win-back emails)

Our findings:

  • Single CTA consistently outperforms multiple CTAs in flow emails (people in an automated journey need clear direction)
  • For campaigns, multiple CTAs work because you're showing variety
  • "Shop Now" beats "Learn More" in click rate by 15-25% for product emails (because it matches intent)
  • Plain text from "Mark at [Brand]" outperforms branded HTML in win-back and personal emails by 20-40% in open rate (it feels personal, not marketing)

5. Flow Email Count and Timing

This is specific to flows. How many emails should be in the flow, and how far apart should they be?

What to test:

  • 3-email vs. 5-email welcome flow
  • 1-hour vs. 4-hour abandoned cart first email
  • 24-hour vs. 48-hour delays between flow emails
  • Adding or removing an SMS step

Our findings:

  • Welcome: 5 emails outperforms 3 (we covered this in the welcome flow guide)
  • Abandoned cart: 1-hour first email beats 4-hour (see the abandoned cart guide)
  • Flow delays: 24-hour gaps work for most flows. 48-hour gaps work for post-purchase where you need delivery time.

Tier 3: Low Impact (Don't Test Until Everything Else Is Optimized)

6. From Name

"[Brand Name]" vs. "Mark from [Brand]" vs. "[Brand] Team." Yes, this matters. But the difference is usually 2-5% in open rate. Test it, but after the big stuff.

7. Preview Text

The text that appears after the subject line in the inbox. It's worth testing, but the impact is marginal — usually 1-3% open rate difference.

8. Footer Content

Social icons, unsubscribe text, address — nobody's testing footer content. And that's fine. It doesn't move revenue.

How to Set Up A/B Tests in Klaviyo

For Campaigns

  1. Go to Campaigns > Create Campaign
  2. Set up your campaign as normal (audience, content)
  3. Click "Create A/B Test" at the top
  4. Choose what to test: Subject Line, From Name/Address, Content, or Send Time
  5. Set the split: On lists over 10,000, you can test on a slice (e.g., 25/25) and let the winner go to the remaining 50%. For smaller lists, a full 50/50 split is the only option that gives you meaningful data.
  6. Set the winning metric: Open Rate for subject line tests, Click Rate for content tests, Revenue for offer tests
  7. Set the test duration: We use 4 hours minimum (24 hours for revenue-based tests)
  8. Send

Klaviyo's "auto-winner" feature: Klaviyo can automatically send the winning variation to the remaining list after the test period. We recommend this for subject line tests (winner based on open rate) but NOT for revenue-based tests (revenue takes longer to attribute accurately).

For Flows

  1. Go to the flow you want to test
  2. On the canvas, click "Add Action" and choose "Conditional Split"
  3. Select "A/B Split" instead of a condition-based split
  4. Set the split (50/50 is standard)
  5. Build the two variations (different emails, different timing, etc.)
  6. Let it run until you reach statistical significance

Flow A/B tests take longer because flow entries happen gradually (one person at a time, as they trigger the flow). For a welcome flow that gets 200 new subscribers per day, you'll need 2-3 weeks to get 1,000 entries per variation. For an abandoned cart flow with 50 triggers per day, you'll need 6+ weeks.
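The waiting-time arithmetic above is simple enough to script before you commit to a flow test. A rough sketch in Python, assuming entries split evenly across variations (your daily entry counts will vary):

```python
import math

def weeks_to_sample(daily_entries, per_variation=1000, variations=2):
    """Weeks until a flow A/B test reaches the target entries per
    variation, assuming entries split evenly across variations."""
    days = per_variation * variations / daily_entries
    return math.ceil(days / 7)

print(weeks_to_sample(200))  # welcome flow, ~200 subscribers/day -> 2
print(weeks_to_sample(50))   # abandoned cart, ~50 triggers/day -> 6
```

If the answer comes back as months, test a bigger swing (larger expected lift needs fewer entries) or accept that the flow test will run in the background while you test campaigns.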

Statistical Significance: The Part Everyone Gets Wrong

Here's where A/B testing goes off the rails for most brands. They run a test for 3 days, see that Variation A has a 25% open rate and Variation B has a 27% open rate, and declare B the winner. That's not how statistics work. That's how you make random decisions and think you're being data-driven.

What Statistical Significance Actually Means

Statistical significance tells you how likely it is that the difference between two variations is real rather than random noise. The standard threshold is 95% confidence: roughly speaking, a difference this large would show up by pure chance less than 5% of the time.

How Many People You Need

Here's a rough guide:

| Expected Lift | Sample Size Per Variation |
|---|---|
| 1-2% (small) | 10,000+ |
| 3-5% (medium) | 3,000-5,000 |
| 5-10% (large) | 1,000-2,000 |
| 10%+ (huge) | 500-1,000 |

If your test has 200 people per variation and you see a 2% difference, that tells you nothing. You need thousands of recipients per variation to detect small differences.
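If you want a number rather than a rough table, the standard two-proportion sample-size formula (95% confidence, 80% power) is easy to compute yourself. A sketch in Python using only the standard library; this is the generic statistical formula, not anything Klaviyo-specific:

```python
import math

def sample_size_per_variation(p_base, p_variant):
    """Recipients needed per variation to detect a move from p_base
    to p_variant at 95% confidence with 80% power (two-sided)."""
    z_a = 1.959964  # two-sided z for alpha = 0.05
    z_b = 0.841621  # z for 80% power
    p_bar = (p_base + p_variant) / 2
    num = (z_a * math.sqrt(2 * p_bar * (1 - p_bar))
           + z_b * math.sqrt(p_base * (1 - p_base)
                             + p_variant * (1 - p_variant))) ** 2
    return math.ceil(num / (p_base - p_variant) ** 2)

# Detecting a lift from a 25% to a 30% open rate (5 points):
print(sample_size_per_variation(0.25, 0.30))  # -> 1251 per variation
```

Note how fast the requirement grows as the expected lift shrinks: halving the detectable difference roughly quadruples the sample you need.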

The Practical Rule

For most eCommerce brands sending to lists of 10,000-50,000:

  • Subject line A/B tests: Run for 4-24 hours with at least 5,000 recipients per variation before declaring a winner
  • Content/offer A/B tests: Run for 24-72 hours with at least 2,000 recipients per variation
  • Flow A/B tests: Run for 2-4 weeks minimum with at least 1,000 entries per variation

Klaviyo doesn't show statistical significance by default. You need to look at the numbers and apply judgment. A good rule of thumb: if the difference between variations is less than 10% relative (e.g., 25% vs. 27.5% open rate), you probably need more data. If the difference is 20%+ relative (e.g., 25% vs. 30% open rate), it's likely real — but still verify with adequate sample size.

Free Significance Calculator

Use this: https://www.evanmiller.org/ab-testing/chi-squared.html

Plug in your numbers:

  • Control visitors = recipients of Variation A
  • Control conversions = opens/clicks/purchases of Variation A
  • Treatment visitors = recipients of Variation B
  • Treatment conversions = opens/clicks/purchases of Variation B

It'll tell you the confidence level. Don't make decisions below 90% confidence. Ideally, wait for 95%.
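If you'd rather script it than use the web calculator, the same check is a few lines of Python. This is a standard two-sided two-proportion z-test (equivalent to the 2x2 chi-squared test the calculator runs), stdlib only:

```python
import math

def significance(n_a, conv_a, n_b, conv_b):
    """Confidence that variations A and B truly differ, via a
    two-sided two-proportion z-test on pooled conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return 1 - p_value

# 5,000 recipients per variation, 25% vs. 27.5% open rate:
conf = significance(5000, 1250, 5000, 1375)
print(f"{conf:.1%} confident the difference is real")
```

With those inputs the confidence clears 99%, which is why the practical rule above asks for 5,000 recipients per variation before calling subject line winners: at that size, a 10% relative difference is detectable rather than noise.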

The 5 Most Common A/B Testing Mistakes

Mistake 1: Testing Too Many Things at Once

If you're testing subject line AND send time AND email design simultaneously, you have no idea which variable caused the difference. Test one variable at a time. Always.

Mistake 2: Calling Winners Too Early

We see this constantly. A test has 300 recipients per variation and a 2% open rate difference, and the brand implements the "winner." That's random noise. Be patient. Wait for statistical significance.

Mistake 3: Testing Things That Don't Matter

Button color is not going to 2x your revenue. Neither is footer text or the exact wording of your pre-header. Focus on the Tier 1 tests first. You can test button colors after you've optimized your subject lines, offers, and send times.

Mistake 4: Not Testing Flows

Most brands only A/B test campaigns. But flows run 24/7 and touch every customer at critical moments. A 10% improvement in your abandoned cart flow's conversion rate generates revenue forever. Test your flows.

Mistake 5: Running One Test and Stopping

A/B testing isn't a project. It's a process. You should always have at least one test running across your campaigns and flows. The brands that grow fastest are the ones that test continuously.

Our A/B Testing Roadmap for New Clients

When we onboard a new Klaviyo client, here's the testing roadmap we follow. It takes about 8-12 weeks to get through the high-impact tests:

Weeks 1-2: Subject Line Tests (Campaigns)

  • Test 1: Short vs. long subject line
  • Test 2: Personalized (first name) vs. not personalized

Weeks 3-4: Welcome Flow Offer Test

  • Test: 10% off vs. 15% off vs. free shipping in welcome popup/email

Weeks 3-6: Abandoned Cart Timing Test (Flow)

  • Test: 1-hour vs. 4-hour first email delay

Weeks 5-6: Send Time Test (Campaigns)

  • Test: 10 AM vs. 1 PM vs. 7 PM

Weeks 7-8: Email Design Test (Campaigns)

  • Test: Image-heavy vs. text-heavy for product campaigns

Weeks 9-10: Flow Email Count Test

  • Test: 4-email vs. 5-email abandoned cart (add vs. remove the last email)

Weeks 11-12: SMS Addition Test (Flows)

  • Test: Email-only abandoned cart vs. Email + SMS

After this roadmap, we move into ongoing optimization: testing new subject line strategies, seasonal offer changes, and segment-specific content.

What Winning A/B Tests Look Like (Real Examples)

Here are real test results from our client accounts:

Test: Subject Line Personalization (Fashion Brand, 35K List)

  • A: "New arrivals just dropped" — 22.4% open rate
  • B: "[First name], new arrivals just dropped" — 28.1% open rate
  • Winner: B (+25.4% lift). Implemented across all campaigns.

Test: Abandoned Cart First Email Timing (Supplement Brand)

  • A: 4-hour delay — 3.2% conversion rate, $2.80 revenue per recipient (RPR)
  • B: 1-hour delay — 4.8% conversion rate, $4.10 RPR
  • Winner: B (+50% conversion rate lift). Extra $3,200/month in recovered revenue.

Test: Welcome Offer (Beauty Brand, 45K List)

  • A: 10% off — 6.2% conversion rate from welcome flow
  • B: Free shipping — 7.8% conversion rate from welcome flow
  • Winner: B (+25.8% conversion lift). And since free shipping costs less than 10% off at this AOV, the margin was better too.

Test: Plain Text vs. HTML (Home Goods Brand, Win-Back Email)

  • A: Branded HTML template — 14.2% open rate, 1.1% click rate
  • B: Plain text from "Sarah from [Brand]" — 31.8% open rate, 3.6% click rate
  • Winner: B (+124% open rate, +227% click rate). Plain text dominated for this use case.

Test: Number of Products (Fashion Brand, Campaign)

  • A: 2 products shown — 2.8% click rate
  • B: 6 products shown — 4.2% click rate
  • Winner: B (+50% click rate lift). More options = more chances to catch interest.

Build Your Testing Culture

The difference between brands that grow their email revenue 5% per year and brands that grow it 30% per year? Testing. Consistent, methodical, well-structured testing.

You don't need fancy tools. You need discipline:

  1. Always have one A/B test running
  2. Test the high-impact variables first
  3. Wait for statistical significance
  4. Document your findings (we use a simple Google Sheet: test name, hypothesis, result, action taken)
  5. Implement winners and move to the next test

If you want help setting up a systematic A/B testing program in your Klaviyo account — or if you want us to review the tests you've already run and tell you what to test next — that's what our free audit covers.

Book your free Klaviyo optimization audit.


Mark Cijo is the founder of GOSH Digital, a Klaviyo Gold Partner agency that's driven $23M+ in revenue for 150+ eCommerce brands. His team runs 30-50 A/B tests per month because guessing is expensive and data is cheap.


Want results like these for your brand? Book a free call. We'll look at your data and show you what's possible. 15 minutes. No pitch deck. Just your data and our honest take.