Klaviyo & Email · November 25, 2025

A/B Testing Signup Forms in Klaviyo

How to A/B test Klaviyo signup forms to increase your opt-in rate. What to test, how to set it up, and the changes that actually move the needle.

Mark Cijo

Founder, GOSH Digital

Your signup form is the entry point to your entire email program. Every subscriber, every dollar of email revenue, every automated flow — it all starts with someone filling out that form.

And most stores set it up once and never touch it again.

That's like building a sales funnel and never optimizing the landing page. Your signup form has a conversion rate, and that conversion rate can be improved. Sometimes dramatically. We've taken forms from 2% to 6% submission rates through systematic testing. That's 3x the list growth from the same traffic. Same visitors. Same products. Just a better form.

Let me show you how to test your way to a better form.

Setting Up A/B Tests in Klaviyo

Klaviyo's form builder has built-in A/B testing. Here's how it works:

  1. Go to Signup Forms in your Klaviyo account
  2. Open the form you want to test (or create a new one)
  3. Click "Add Variation" — Klaviyo creates a copy of your form
  4. Edit the variation to change the element you're testing
  5. Set the traffic split (50/50 is standard for most tests)
  6. Set the test duration (minimum 2 weeks; ideally 4 weeks, so each variation collects enough impressions for statistical significance)
  7. Publish and let it run

Klaviyo will automatically split your website traffic between the two form variations and track submission rates for each.

Important: only test ONE variable at a time. If you change the headline, the image, and the incentive all at once, you won't know which change caused the improvement. Isolate variables. Test one thing. Get a result. Then test the next thing.

What to Test (In Priority Order)

Not all form elements have equal impact on submission rates. Here's what to test first, ranked by typical impact.

1. The Incentive (Highest Impact)

The offer you make in exchange for the email address is the single biggest driver of signup rates. Test different incentives:

  • 10% off vs. 15% off vs. 20% off
  • Percentage off vs. dollar amount off ("$10 off" vs. "10% off")
  • Discount vs. free shipping
  • Discount vs. free product sample
  • Discount vs. exclusive content (guide, quiz results, early access)

What we typically find: 15% beats 10%. Free shipping beats 10% off for most stores. Dollar amounts beat percentages when the order value is high ("$25 off" sounds bigger than "10% off a $250 order"). And for some brands, non-discount incentives (exclusive access, free gift with purchase) outperform discounts entirely.

The incentive test should be your first A/B test. It usually has the biggest impact.

2. The Headline (High Impact)

The headline is the first thing visitors read on your popup. It sets the frame.

Test different approaches:

  • Benefit-focused: "Get 15% off your first order"
  • Curiosity-focused: "We have something for you"
  • Social proof: "Join 50,000+ subscribers"
  • Problem-focused: "Stop overpaying for skincare"
  • Direct: "Want 15% off?"

The benefit-focused headline (stating the incentive clearly) is the most reliable performer. But for brands with strong personality, curiosity-based headlines can win.

Keep headlines to 5-8 words. Don't try to fit your brand story into the headline. That's what the body copy is for.

3. The Timing/Trigger (High Impact)

When the form appears affects how many people see it and how receptive they are.

Test different triggers:

  • Immediate popup (appears after 3-5 seconds) vs. delayed popup (appears after 15-30 seconds)
  • Exit intent (appears when the mouse moves toward closing) vs. scroll-based (appears after scrolling 50% of the page)
  • Time-based vs. page-count-based (appears after viewing 2+ pages)

What we typically find: a slight delay (8-15 seconds) outperforms immediate popups because the visitor has had time to see your site and develop interest. Exit intent popups have lower submission rates but catch people who would have left anyway — they're incremental. Scroll-based triggers work well for content pages (blog posts).

4. Form Design and Layout (Medium Impact)

Test structural changes to the form:

  • One-step form (email field and submit on one screen) vs. two-step form (first screen is a yes/no question or button, second screen is the email field)
  • Popup vs. flyout (slides in from the side) vs. embedded (integrated into the page)
  • Full-screen takeover vs. small popup

Two-step forms consistently outperform one-step forms by 20-40%. The first step is a micro-commitment ("Yes, I want 15% off"), and once someone clicks "Yes," they're much more likely to complete the email field.

Full-screen takeovers have higher submission rates but also higher bounce rates from people who find them annoying. Test carefully.

5. The CTA Button (Medium Impact)

The button text matters more than you'd think.

Test:

  • "Subscribe" vs. "Get My Discount" vs. "Unlock 15% Off" vs. "Sign Me Up"
  • Button color (your brand color vs. a contrasting color)
  • Button size (larger buttons are easier to tap on mobile)

"Subscribe" is the weakest CTA in almost every test. It sounds like a commitment. "Get My 15% Off" or "Unlock My Discount" performs better because it reinforces what they're getting.

6. Images (Medium Impact)

Test with and without a product image on the form. Test different images: lifestyle shot vs. product shot vs. no image.

What we find: a relevant product image improves submission rates by 10-20% compared to a text-only form. But the wrong image (generic stock photo, unrelated imagery) can actually decrease performance.

7. Number of Fields (Lower Impact for Email-Only Forms)

For email-only forms, there's only one field, so this isn't testable. But if you're collecting additional data (name, phone number, preferences), test:

  • Email only vs. email + first name
  • Email only vs. email + SMS opt-in (phone number)

Each additional field reduces completion rate by 10-25%. Only add fields if the data they provide is worth the signups you'll lose.
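A quick back-of-envelope comparison makes that trade-off concrete. All numbers here are hypothetical, using the midpoint of the 10-25% range above:

```python
# Hypothetical: 10,000 monthly form impressions, a 4% completion rate
# for the email-only form, and a 20% relative drop in completions
# after adding a phone-number field (midpoint of the 10-25% range).
impressions = 10_000
email_only_rate = 0.04
completion_drop = 0.20

email_only_signups = impressions * email_only_rate
with_extra_field = impressions * email_only_rate * (1 - completion_drop)

print(email_only_signups)   # 400.0 signups per month
print(with_extra_field)     # 320.0 — 80 fewer signups per month
```

If the phone numbers from those 320 signups drive more revenue than the 80 lost email subscribers would have, the extra field earns its place. If not, keep the form email-only.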

Running a Valid Test

A/B testing is only useful if the results are statistically valid. Here's how to avoid false conclusions:

Sample size matters. You need at least 1,000 form impressions per variation to get a reliable result. For low-traffic sites, this means running tests for 4-8 weeks. Don't call a winner after 3 days and 200 impressions.
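You can sanity-check the 1,000-impression guideline with the standard two-proportion sample-size formula. This is a rough sketch with hypothetical rates (detecting a lift from 2% to 4% at 95% confidence and 80% power), not Klaviyo's internal math:

```python
import math

def sample_size_per_variation(p_base, p_variant):
    """Approximate impressions needed per variation to detect a change
    from p_base to p_variant with a two-sided two-proportion z-test
    (alpha = 0.05, power = 0.80)."""
    z_alpha = 1.96  # two-sided 95% confidence
    z_beta = 0.84   # 80% power
    p_bar = (p_base + p_variant) / 2
    numerator = (z_alpha * math.sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * math.sqrt(p_base * (1 - p_base)
                                      + p_variant * (1 - p_variant))) ** 2
    return math.ceil(numerator / (p_base - p_variant) ** 2)

# Hypothetical: baseline 2% submission rate, hoping to detect 4%
print(sample_size_per_variation(0.02, 0.04))  # ~1,140 impressions per variation
```

Note that smaller lifts need far more traffic: confirming a 3% vs. 3.5% difference takes tens of thousands of impressions per variation, which is why most stores should test big swings first.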

Statistical significance. Aim for 95% confidence before declaring a winner. Klaviyo's A/B test reporting shows this. If the confidence level is below 90%, keep running the test.
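Klaviyo reports confidence for you, but if you want to check the math yourself, a pooled two-proportion z-test is the standard approach. The submission counts below are hypothetical:

```python
import math

def ab_test_confidence(conv_a, n_a, conv_b, n_b):
    """Confidence (1 minus two-sided p-value) that two form variations
    have different submission rates, via a pooled two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = abs(p_a - p_b) / se
    # Standard normal CDF via the error function
    p_value = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
    return 1 - p_value

# Hypothetical: variation A got 45/1,500 submissions, B got 72/1,500
confidence = ab_test_confidence(45, 1500, 72, 1500)
print(round(confidence, 3))  # ~0.99, above the 95% threshold
```

With these numbers you could call B the winner; had B logged 55 submissions instead, confidence would sit well below 95% and the test should keep running.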

Watch for external factors. If you launch a sale midway through the test, the results will be skewed. Try to run tests during "normal" traffic periods. If something unusual happens (viral post, major sale, site outage), extend the test to collect clean data.

Mobile vs. desktop. Your form might perform differently on mobile and desktop. Klaviyo doesn't split test results by device type natively, but you can create separate forms for mobile and desktop and test each independently.

A Testing Calendar

Don't run tests randomly. Build a testing calendar:

Month 1: Test the incentive (biggest impact, sets the foundation)
Month 2: Test the headline and body copy
Month 3: Test the trigger timing
Month 4: Test one-step vs. two-step
Month 5: Test the CTA button and design elements
Month 6: Test imagery

After six months of systematic testing, your form will be significantly better than where it started. Then cycle back and retest — audiences change, and what worked six months ago might not be optimal today.

Beyond the Popup: Test These Too

Popups aren't your only signup form. Test these as well:

Embedded forms. The signup form in your footer, on your blog sidebar, or on a dedicated landing page. These have different dynamics than popups (lower visibility but no interruption).

Flyout forms. Slide-in forms that appear in the corner of the screen. Less intrusive than popups. Good for returning visitors who already dismissed the popup.

Landing page forms. Dedicated signup pages linked from ads or social media. These should be tested like any landing page: headline, body copy, CTA, layout.

Exit intent forms. Test a different message on your exit popup than your main popup. "Wait! Don't leave without your 15% off" can catch people who dismissed the initial popup.

The Bottom Line

Your signup form has a conversion rate, and that rate is improvable. Systematic A/B testing — one variable at a time, with sufficient sample sizes, measured to statistical significance — is the path to consistent list growth improvement.

Start with the incentive. Then the headline. Then the timing. One test per month. After six months, you'll have a form that converts 2-3x better than what you started with.

That means 2-3x the subscriber growth. 2-3x the flow entries. 2-3x the email revenue. From the same traffic.

If you want help optimizing your Klaviyo signup forms and overall list growth strategy, book a call with our team. We'll audit your current forms and show you the quick wins and the long-term testing plan.

Written by Mark Cijo

Founder of GOSH Digital. Klaviyo Gold Partner. Helping eCommerce brands grow revenue through data-driven marketing.

Book a free strategy call →

Want results like these for your brand?

Book a free call. We'll look at your data and show you what's possible.

Pick a Time

15 minutes. No pitch deck. Just your data and our honest take.