
A/B Testing

6 min read

The best-performing ad on your account is never the one you thought would win. A/B testing is how you replace opinion with data — systematically generating variations, running them in parallel, and letting performance decide. AIMS makes it easy to produce the volume of variants that meaningful testing requires.

What to test — in order of impact

Not all test variables are created equal. Prioritise variables with the highest potential impact on conversion:

| Variable | Impact level | Example |
| --- | --- | --- |
| Hook / opening (video) | Very high | "Don't buy this until you've read this" vs "Introducing our new formula" |
| Main visual / format | Very high | UGC-style video vs polished product showcase |
| Core offer / message | High | "Free shipping" headline vs "Save 30% today" |
| Call to action | High | "Shop now" vs "Get yours" vs "Claim 20% off" |
| Copy length | Medium | Short punchy copy vs longer benefit-led explanation |
| Colour scheme | Medium | Brand colours vs contrasting accent palette |
| Minor layout tweaks | Low | Logo position, font size adjustments |

Test big swings, not small tweaks

Changing a button from blue to green is not a meaningful test. Changing from a UGC-style video to a polished product ad is. In 2025, when CPMs have risen ~18% year-on-year, the brands winning on paid social are testing fundamentally different creative approaches — not micro-optimising within the same format.

One variable at a time

The cardinal rule: change one element at a time. If you test a different hook AND a different visual AND different copy simultaneously, you will never know which change caused the result. Every test should have a hypothesis: "I believe changing the hook from benefit-led to curiosity-gap will increase the 3-second hold rate by 15% for this audience." You can only prove or disprove that with an isolated test.

Setting up a test on Meta or TikTok

On Meta Ads Manager

  • Create a Campaign with Campaign Budget Optimization (CBO) enabled.

  • Within the campaign, create one ad set per audience (or use a single audience if you're isolating creative, not audience).

  • Within that ad set, upload 2–5 ad creatives (your variants). Meta will automatically allocate budget towards the better-performing creatives.

  • Alternatively, use Meta's native A/B Test tool (under Experiments) for a statistically cleaner split — this prevents the algorithm from picking a winner too early.

On TikTok Ads Manager

  • Use the Smart Creative feature, which accepts multiple creatives and automatically serves the best-performing to each audience segment.

  • For a cleaner manual test, duplicate the ad group and change only the creative. Keep budget identical across both groups.

How long to run a test

The most common testing mistake is calling a winner too early. A widely used rule of thumb is a minimum of 50 conversion events per variant (conversions, not clicks) before a result is trustworthy. For most DTC brands with a few hundred conversions per week, this means running tests for 14–21 days. If you call a winner after 3 days and 8 conversions per variant, your data is noise.
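To make "calling a winner" concrete, the sketch below runs a standard two-proportion z-test on two variants' conversion counts. This is an illustrative, stdlib-only helper (the function name and sample numbers are ours, not an AIMS or Meta feature); note how 8 conversions per variant cannot separate signal from noise:

```python
from math import sqrt, erf

def z_test_two_proportions(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates.

    conv_*: conversions per variant; n_*: impressions (or clicks) per variant.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)                  # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))    # standard error of the difference
    z = (p_a - p_b) / se
    # Convert |z| to a two-sided p-value via the standard normal CDF
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# The "3-day" trap: 8 vs 12 conversions on 400 impressions each
print(z_test_two_proportions(8, 400, 12, 400))    # not significant (p ≈ 0.36)

# The same 2% vs 3% rates with 60 vs 90 conversions: a real signal
print(z_test_two_proportions(60, 3000, 90, 3000)) # significant (p ≈ 0.013)
```

The point is that the *same* rate difference (2% vs 3%) flips from noise to signal only once enough conversions have accumulated.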

At minimum, run every test for 7 days to capture day-of-week variation (weekday vs weekend shopping behaviour differs significantly for most product categories). Allocate a minimum of £300–500 per variant before drawing conclusions.
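The 50-conversions-per-variant and 7-day floors above combine into a rough duration estimate. The helper below is hypothetical (the name and the even-budget-split assumption are ours):

```python
from math import ceil

def days_to_significance(weekly_conversions: int, num_variants: int,
                         min_per_variant: int = 50, min_days: int = 7) -> int:
    """Estimate how long a test must run to reach the minimum conversion
    count per variant, assuming conversions split evenly across variants.
    Never returns less than min_days, to capture day-of-week variation."""
    per_variant_per_day = (weekly_conversions / 7) / num_variants
    days = ceil(min_per_variant / per_variant_per_day)
    return max(days, min_days)

# ~100 weekly conversions allocated to a 3-variant test:
print(days_to_significance(100, 3))   # → 11 days

# A high-volume account (1000/week, 2 variants) still waits the 7-day floor:
print(days_to_significance(1000, 2))  # → 7 days
```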

The metrics that matter

  • Cost per Acquisition (CPA) — how much it costs you to get one sale or conversion. The primary performance metric.

  • Return on Ad Spend (ROAS) — revenue generated per pound/dollar spent on ads. Directly measures profitability.

  • Hook Rate (3-second view rate) — what percentage of people who see the ad watch at least 3 seconds. Critical for video ads. Target 30%+.

  • CTR (Click-Through Rate) — useful directionally but not a primary metric. High CTR with low conversion rate means your ad is interesting but your landing page or offer isn't converting.

Ignore likes, comments, and shares as performance metrics. These vanity metrics do not correlate with conversion performance. An ad with 500 comments saying "love this!" and a 4× ROAS is worse than an ad with zero comments and a 9× ROAS.
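For reference, the four metrics above are simple ratios over raw ad stats. The helper and figures below are illustrative only (not pulled from any real account or API):

```python
def ad_metrics(spend, revenue, conversions, impressions, clicks, video_views_3s=None):
    """Compute the core A/B test metrics from raw ad stats (illustrative helper)."""
    m = {
        "CPA": spend / conversions,   # cost per acquisition
        "ROAS": revenue / spend,      # return on ad spend (per pound/dollar)
        "CTR": clicks / impressions,  # click-through rate (directional only)
    }
    if video_views_3s is not None:
        # Hook rate: share of impressions that watched at least 3 seconds; target 30%+
        m["hook_rate"] = video_views_3s / impressions
    return m

print(ad_metrics(spend=500, revenue=3000, conversions=25,
                 impressions=40_000, clicks=800, video_views_3s=14_000))
# CPA £20, ROAS 6.0, CTR 2%, hook rate 35%
```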

Creative fatigue and refresh cadence

All creatives eventually burn out. On high-spend Meta accounts, creative fatigue sets in after an individual viewer has seen the same ad 3–5 times — their CTR drops, your CPM rises, and performance deteriorates. The best-performing brands produce 3–5 new creative variants per week per product. AIMS makes this volume achievable: generating 5 new static ad variants or 3 new video hooks takes minutes, not hours.
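One way to operationalise the fatigue pattern above is to flag a creative for refresh once average frequency enters the 3–5 view range and its CTR has fallen materially from peak. The thresholds below are illustrative defaults, not AIMS or Meta settings:

```python
def creative_is_fatigued(frequency, ctr_history, freq_threshold=3.0, ctr_drop=0.25):
    """Flag a creative for refresh when average frequency has passed the
    fatigue threshold AND the latest CTR sits more than `ctr_drop` below
    its peak. Thresholds are illustrative, based on the 3-5 view range."""
    if frequency < freq_threshold or len(ctr_history) < 2:
        return False
    peak = max(ctr_history)
    return ctr_history[-1] < peak * (1 - ctr_drop)

print(creative_is_fatigued(4.2, [0.021, 0.019, 0.012]))  # True: high frequency, CTR well off peak
print(creative_is_fatigued(1.8, [0.021, 0.012]))         # False: frequency still low
```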
