How to structure creative tests for maximum learning

In the previous article, we explored why creative testing is so valuable—it’s not just about finding a winning ad but about understanding your audience and refining your creative strategy over time. Now, let’s get practical. How do you actually set up a creative test that delivers meaningful insights? What should you expect along the way? And how can you act on the results to make smarter creative decisions?

This article is designed to help you understand the process so you know what to ask for, how to interpret results, and how to ensure your creative testing delivers actionable insights.

Why you need a strategy before you start testing

Creative testing isn’t about throwing different ads into a campaign and hoping something sticks. Without a clear plan, you risk wasting budget and ending up with confusing results. The goal is to learn why certain creative performs better—not just what works.

The key to effective testing is focusing on a single variable at a time. If you change too many things—like the creator, messaging, and visuals—you won’t know which adjustment caused the shift in performance.

Before you begin, define a clear learning objective. Instead of aiming for vague goals like “improve engagement,” consider more specific questions:

  • Will creator-led content increase click-through rates compared to traditional product shots?
  • Does focusing on audience benefits instead of product features lead to higher conversions?
  • Do shorter videos hold attention better than longer formats?

Having a well-defined question ensures you’re testing for actionable insights rather than just chasing numbers.
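If it helps to see this written down, here is one way to capture a learning objective as a simple, structured test plan. This is a Python sketch, not a required format; the field names and example values are ours, not taken from any platform.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class CreativeTest:
    """One creative test: a single question, a single lever, a handful of variants."""
    question: str        # the learning objective, phrased as a question
    lever: str           # the one variable being changed
    variants: List[str]  # the creative variations being compared
    primary_metric: str  # the number that will answer the question
    min_runtime_days: int = 7  # give the test time to stabilise before judging

# Example (values are illustrative only)
test = CreativeTest(
    question="Will creator-led content lift click-through rate vs product shots?",
    lever="creator type",
    variants=["creator-led video", "studio product shot"],
    primary_metric="click-through rate",
)
print(test.question)
```

Writing the question, lever, and metric down before launch also makes it obvious when a test is quietly trying to answer two questions at once.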

Choosing what to test

Once you’ve identified what you want to learn, the next step is to decide which creative element—or “lever”—to test. Pulling the right lever helps you pinpoint what influences performance.

Some common creative elements to test include:

  • Creator type: Testing different creators can reveal how authenticity, tone, or demographic representation affects engagement.
  • Messaging angle: Trying out benefit-driven versus feature-focused messages can show what resonates with your audience.
  • Opening hook: The first few seconds of a video are critical. Experimenting with different visuals or statements can help improve attention rates.
  • Visual style and production quality: High-production content isn’t always better—sometimes lo-fi, raw videos perform best.
  • Tone and emotion: Testing playful, serious, or aspirational tones can highlight how emotional appeal affects audience response.
  • Call-to-action (CTA): Adjusting the wording or placement of your CTA can significantly influence conversion rates.

Focus on testing one lever at a time so you can confidently attribute any performance differences to the change you made.

How to set up your creative test

The structure of your test plays a huge role in the reliability of your results. The right setup depends on whether you need quick optimisation of a live campaign or deeper learning you can carry into future ones.

Method 1: Multiple creatives in one ad set

This is the fastest and simplest approach. You place several creative variations into a single ad set, and the platform’s algorithm determines which ones to show more often based on early performance.

This method is best when you’re looking for quick optimisations in a live campaign: the platform favours high-performing ads early, allowing you to scale them faster.

However, there are some drawbacks. Because the algorithm prioritises ads that perform well early on, weaker variations may not get enough exposure to provide meaningful data. This approach is great for identifying immediate winners but less effective for understanding why one creative outperforms another.
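To see why that happens, here is a toy Python simulation of performance-weighted delivery. It is not any platform’s actual algorithm, just an illustration of how weaker variations end up with far fewer impressions when budget follows early results; the CTR figures are invented.

```python
import random

# Toy illustration only: NOT a real delivery algorithm. It shows how
# performance-weighted allocation leaves weaker variants with far fewer impressions.
random.seed(42)

true_ctr = {"creative_A": 0.030, "creative_B": 0.025, "creative_C": 0.010}
clicks = {name: 0 for name in true_ctr}
impressions = {name: 0 for name in true_ctr}

DAILY_IMPRESSIONS = 10_000
for day in range(7):
    # Allocate today's impressions in proportion to observed CTR so far
    # (with a small prior so nothing starts at exactly zero).
    observed = {n: (clicks[n] + 1) / (impressions[n] + 100) for n in true_ctr}
    total = sum(observed.values())
    for name, ctr_est in observed.items():
        share = int(DAILY_IMPRESSIONS * ctr_est / total)
        impressions[name] += share
        clicks[name] += sum(random.random() < true_ctr[name] for _ in range(share))

for name in true_ctr:
    print(f"{name}: {impressions[name]:,} impressions, "
          f"{clicks[name] / max(impressions[name], 1):.3%} observed CTR")
```

Run it and the weakest creative collects only a fraction of the impressions the strongest one does, which is exactly why this structure tells you little about why it lost.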

Method 2: One creative per ad set

For more accurate and reliable insights, use a structure where each creative variation sits in its own ad set with the same budget and targeting settings. This ensures that every creative gets equal opportunity to perform, providing clearer data on what’s driving differences in results.

While this method requires more time and budget to gather statistically significant data, it offers a deeper understanding of audience preferences and creative effectiveness. This approach is ideal if you’re looking to apply learnings to future campaigns, not just improve current performance.
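“Statistically significant” is worth pinning down. If two equally budgeted ad sets are being compared on click-through rate, a standard two-proportion z-test gives a quick sense of whether the gap is real or just noise. The numbers in this sketch are made up for illustration.

```python
from math import sqrt
from statistics import NormalDist

def ctr_z_test(clicks_a, imps_a, clicks_b, imps_b):
    """Two-proportion z-test: is the CTR gap between A and B likely real?"""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    p_pool = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / imps_a + 1 / imps_b))
    z = (p_a - p_b) / se
    # Two-sided p-value: the chance of seeing a gap this large by luck alone
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_a, p_b, p_value

# Illustrative numbers, not real campaign data
p_a, p_b, p_value = ctr_z_test(clicks_a=180, imps_a=12_000,
                               clicks_b=140, imps_b=12_000)
print(f"CTR A: {p_a:.2%}, CTR B: {p_b:.2%}, p-value: {p_value:.3f}")
# A p-value below roughly 0.05 suggests the difference probably isn't random noise.
```

If the p-value stays high, the honest answer is that the test has not yet separated the variants, which usually means more budget or more runtime, not a verdict.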

What to expect during the test

Creative tests follow a natural progression, and understanding the phases helps set realistic expectations:

  • Initial exploration (Days 1–2): The algorithm experiments with different audience segments. Results may be inconsistent, so avoid making early judgments.
  • Emerging trends (Days 3–5): High-performing creatives start gaining momentum as the platform homes in on engaged audiences.
  • Stabilisation (Days 5–7+): Performance begins to level out, revealing clear winners.

Sometimes, an ad that starts slowly will improve after a few days as the platform’s algorithm learns who responds best to it. Unless performance is drastically underwhelming from the start, give creatives enough time to stabilise before making decisions.

Interpreting your results and knowing what to do next

Once your test has run for a sufficient period, review the data holistically. Single metrics rarely tell the full story. Look at how different data points interact:

  • If you have high attention metrics (like thumb-stop rate) but low clicks, you’ve caught your audience’s eye but haven’t given them a compelling reason to engage further.
  • High video completion but low conversions means you’ve held their attention, but the message or offer isn’t motivating enough to prompt the next step.
  • Strong click-through rates paired with high bounce rates suggest the ad creates interest, but the landing page doesn’t meet expectations, causing people to drop off quickly.

Focus on understanding why certain variations performed better. Did a creator’s tone feel more relatable? Did shorter videos improve engagement but hurt message retention? Use these insights to refine future creative briefs.
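If you run a lot of tests, it can help to write these patterns down as explicit rules. The sketch below codifies the three diagnoses above; the threshold values are placeholders you would replace with your own benchmarks.

```python
def diagnose(thumb_stop_rate, ctr, completion_rate, conversion_rate, bounce_rate):
    """Rough diagnostic heuristics; thresholds are illustrative, not benchmarks."""
    notes = []
    if thumb_stop_rate > 0.30 and ctr < 0.01:
        notes.append("Hook works, but the ad gives no compelling reason to click.")
    if completion_rate > 0.50 and conversion_rate < 0.01:
        notes.append("Attention held, but the message or offer isn't prompting action.")
    if ctr > 0.015 and bounce_rate > 0.70:
        notes.append("Ad creates interest, but the landing page isn't meeting expectations.")
    return notes or ["No obvious mismatch; compare variants on the primary metric."]

# Example with invented metrics for a single creative
print(diagnose(thumb_stop_rate=0.42, ctr=0.004,
               completion_rate=0.35, conversion_rate=0.008, bounce_rate=0.55))
```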

Should you keep underperforming creative running?

There’s a common question when testing: Should you pause low-performing creatives or keep them running?

Keeping all creatives live during the early stages can be beneficial. The algorithm uses performance data to refine audience targeting, and sometimes, variations that underperform initially pick up momentum as the platform better identifies who to serve them to.

However, if a creative consistently underperforms beyond the stabilisation phase—especially across key metrics like engagement and conversion—it’s best to pause it and allocate budget to higher performers.

A balanced approach is to introduce new creative variations while phasing out weaker ones gradually. This prevents abrupt shifts that could disrupt learning and helps maintain campaign stability.
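A gradual phase-out can be as simple as a budget rule applied after the stabilisation window. This sketch assumes you track each creative’s share of spend and its performance relative to the current best performer; the specific thresholds and decay rate are placeholders, not recommendations.

```python
def next_budget_share(current_share, relative_performance, day,
                      stabilisation_day=7, decay=0.5, floor=0.05):
    """Gradually shift budget away from a weak creative instead of pausing it abruptly.

    relative_performance = this creative's primary metric / best creative's metric.
    """
    if day < stabilisation_day or relative_performance >= 0.8:
        return current_share                    # leave it alone while still learning
    reduced = current_share * decay             # halve its share at each review
    return 0.0 if reduced < floor else reduced  # pause once its share becomes negligible

# Example: a creative on 20% of budget, delivering 60% of the winner's CTR, on day 9
print(next_budget_share(current_share=0.20, relative_performance=0.60, day=9))  # 0.10
```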

Your audience is already telling you what works. Creative testing is how you listen—and improve.

It’s not just about choosing the best ad—it’s about uncovering what your audience responds to and why.

By starting with clear questions, testing one variable at a time, setting up your tests thoughtfully, and giving them time to stabilise, you’ll gain insights that go far beyond a single campaign. And the more you test, the better you’ll understand what creative resonates with your audience—allowing you to build stronger, more effective content moving forward.
