Meta’s Creative Testing Tool: Setup, Strategies, and Results
Meta recently launched a new creative testing feature that allows advertisers to A/B test individual ads.
It prevents Meta from optimizing delivery toward specific ads, so you get a clearer picture of how each ad in your test performs on similar ad spend. And since this is an A/B test, there is no overlap between ads (each person will only see one of the ads).
I recently gave it a try. In this article I will cover the following:
- How the creative testing feature works
- My experience using it
- Some complaints about how it could be better
- Creative testing methods
- Best practices and recommendations
Let’s get started…
How it works
You can launch a creative test from an existing or draft campaign. There are two important requirements:
1. Daily budget. This feature does not support lifetime budgeting.
2. Highest volume bid strategy. You can’t use the Cost per result goal, Bid cap, or ROAS goal bid strategies.
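These requirements are configured in Ads Manager, but if you happen to manage ad sets through the Marketing API, a quick programmatic sanity check might look like the sketch below. This is a minimal sketch under assumptions, not Meta’s documented workflow for creative tests: it assumes your budget lives at the ad set level (not a campaign budget), and the access token, ad set ID, and API version are placeholders.

```python
# Minimal sketch: check that an ad set uses a daily budget and the highest
# volume bid strategy (exposed as LOWEST_COST_WITHOUT_CAP in the Marketing API).
# The token, ad set ID, and API version are placeholders.
import requests

GRAPH_URL = "https://graph.facebook.com/v21.0"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"
ADSET_ID = "YOUR_ADSET_ID"

def meets_creative_test_requirements(adset_id: str) -> bool:
    resp = requests.get(
        f"{GRAPH_URL}/{adset_id}",
        params={
            "fields": "name,daily_budget,lifetime_budget,bid_strategy",
            "access_token": ACCESS_TOKEN,
        },
    )
    resp.raise_for_status()
    adset = resp.json()

    has_daily_budget = bool(adset.get("daily_budget"))  # lifetime budgets aren't supported
    uses_highest_volume = adset.get("bid_strategy") == "LOWEST_COST_WITHOUT_CAP"
    return has_daily_budget and uses_highest_volume

if __name__ == "__main__":
    print(meets_creative_test_requirements(ADSET_ID))
```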
When editing or creating an ad, scroll down to Creative Test (it should be immediately after the Ad Creative section).
Click Set Up Test. It should bring up a view that looks like this…
1. Select the number of ads you want to test.
You can create two to five test ads. These will be generated as copies of the ad you’re currently editing, and those copies are what gets tested. The current ad itself will not be part of the test.
2. Set your testing budget.
This is the share of your entire campaign or ad set budget set aside for this test. Meta recommends spending no more than 20% of your budget on the test to reduce its impact on your existing ads.
If your campaign or ad set doesn’t contain any other ads, more of your budget will be used for testing.
3. Define how long the test should run.
This seems to default to 7 days, but I don’t see any other guidance from Meta on this. Consider the budget being used, your expected cost per result, and the total volume that will be generated so that your test results are meaningful.
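To make that concrete, here’s a rough back-of-the-envelope calculation for estimating how many results each test ad can generate (my own sketch, not a Meta tool). All of the inputs are illustrative assumptions.

```python
# Rough estimate of results per test ad, based on the daily budget, the share
# reserved for the test, the test duration, the number of test ads, and an
# expected cost per result. All numbers below are illustrative.
def expected_results_per_ad(daily_budget, test_share, days, num_ads, expected_cpa):
    test_spend = daily_budget * test_share * days  # total spend reserved for the test
    spend_per_ad = test_spend / num_ads            # the test aims for similar spend per ad
    return spend_per_ad / expected_cpa             # estimated results per ad

# Example: $100/day budget, 20% reserved for the test, 7 days, 4 test ads, $4 CPA
print(round(expected_results_per_ad(100, 0.20, 7, 4, 4), 1))  # -> 8.8 results per ad
```

If the estimate comes back very low, consider extending the duration, increasing the budget, or testing fewer ads.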
4. Define metrics to compare results with.
Cost per result is a reasonable choice for most situations, but you can also choose CPC, CPM, or cost per standard event, custom event, or custom conversion.
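These comparison metrics are simple ratios of spend to delivery numbers. For reference, here’s a quick sketch of how each one is calculated; the inputs are made up.

```python
# Illustrative delivery numbers for one test ad (made up for this example)
spend = 70.00        # USD spent
impressions = 12_000
clicks = 180
results = 20         # e.g., sign-ups

cpm = spend / impressions * 1000   # cost per 1,000 impressions
cpc = spend / clicks               # cost per click
cost_per_result = spend / results  # cost per result

print(f"CPM ${cpm:.2f} | CPC ${cpc:.2f} | Cost per result ${cost_per_result:.2f}")
```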
5. Click OK.
You are only confirming the settings at this point; you haven’t published anything yet.
6. Edit your ad.
Since the test only created copies of your ad, you’ll need to edit each one before publishing.
Make sure the differences between ads are significant and not subtle textual or creative changes. Otherwise, any differences in results are more likely to be attributed to randomness. Consider different formats, creative themes, customer personas, and messaging angles.
Also note that the ad you initiated this test from is not included in the test. It will remain active or in draft status. If it’s still a draft, you can delete it before publishing.
7. Publish your ads.
After you’ve finished editing the ads you’ll use in your test, publish them.
Meta should allocate a similar ad spend to each ad in the test. Note that you may see differences on day one, depending on how long each ad takes to move through the review, approval, and preparation stages. If one ad becomes active before the others, it will receive the initial budget.
During the test, you can hover over the beaker icon next to the ad name to see the top results so far.
8. After testing.
Once the test is complete, your ads will continue to run, but Meta will no longer focus on maintaining similar ad spend across them. You also shouldn’t expect Meta to automatically spend the most on the ad that performed best in the test.
You can get full test results in the Experiments section.
I’ll share more information about my testing below.
My test
I created a test of five ads, focusing on format differences:
- Flexible format (9 images)
- Flexible format (9 videos)
- Carousel (9 images)
- Single image/video ad (different images or videos by placement)
- Carousel promoting two lead magnets (2 images)
The performance goal was “Maximize conversions,” with conversions being sign-ups. The ads primarily promoted free registration for The Loop, but one ad was a carousel promoting both The Loop and Cornerstone Advertising Tips.
Because the campaign optimized for registrations and I expected the cost per result to be in the $3-5 range, I invested $50 per day in this test. Admittedly, it’s a little early for me to be writing this post.
Below is an overview of the results from the Experiments section…
And the results table…
The best performer in my test was the flexible format version with images, and it wasn’t particularly close. While there’s always some randomness in the results, the clear worst performer was the carousel with two cards, each promoting a different lead magnet. It also shouldn’t be overlooked that the two top-performing versions were both flexible format, with the video version costing about $1 more per registration.
Meta also provides a visual breakdown by age…
And gender…
Functional complaints
My main complaint about the creative testing feature is that you can’t test existing ads. You must copy an existing (active or draft) ad, and only the duplicates become part of the test. What’s particularly confusing is that the ad you launch the test from isn’t included.
But it’s not just a matter of confusion. The inability to test existing ads also limits the value of the feature. If you have five ads in an active ad set, you may wonder why Meta allocates a higher percentage of your budget to one ad and a lower percentage to another. A test of those five ads with evenly distributed spend would help answer that question. But instead, you have to create new ads.
This means either adding duplicate ads to an existing ad set or creating a separate ad set for testing. A separate ad set makes the most sense if you’re replicating something that’s already running. But as you can imagine, it would be much easier to keep your current set of ads and simply launch a test of those existing ads.
This can also be a lot of extra work, because the test generates duplicates of a single ad that you then need to edit one by one (you can’t simply pull each existing ad into the test). In my example, I used flexible format and carousel versions, each containing nine creative variations. Having to rebuild all of that was a huge pain.
I’m hoping this is just an early version of creative testing and we’ll eventually be able to test existing ads.
Creative testing methods
There are several ways to take advantage of this feature for creative testing…
1. Test your existing ad copy and creative mix.
This can get messy quickly because you are generating duplicates. If you want to test an existing ad copy and creative mix, I encourage you to create a separate ad set for this purpose. In this case, I also recommend using Advantage+ Campaign Budget.
This is actually what I did in the example above. I didn’t test new ad copy and creative combinations. I created a new ad set within an existing campaign that contained only test versions of these ads.
2. Test new ad copy and creative combinations.
This is actually where this feature is very useful in its current form. It might change the way you create ad sets from scratch. It works like this…
When you’re ready to start a new campaign or ad set with new creative, start with a test. Begin with up to five very different ideas. Use different formats, creative themes, customer personas, and messaging angles so your ads don’t all produce similar results.
Once testing is complete, allow the first batch of ads to continue serving. But learn from the results and generate a new batch of ads to test. In this case, add these ads to the original ad set.
Using this approach, creative testing can become an ongoing process for individual ad sets. You learn from each test to inform how you develop your next batch of ads.
Best practices and recommendations
I encourage you to try Meta’s creative testing feature. I firmly believe it’s a huge improvement over how advertisers have had to approach creative testing until now. But I have some tips on how to get the most out of it…
1. Set up your test to produce meaningful results.
Consider your expected cost per result when determining your testing budget, test duration, and the number of ads to test. I always start with the old rule of thumb of 50 optimization events per week. If you can’t reach that, your results won’t mean much. You want the results to be clear enough that if you ran the test again, you’d get similar results.
Even with the test above, I might run it longer next time. I was happy to see that the top-performing ad on its own had close to 50 conversions.
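If it helps, here’s the back-of-the-envelope version of that rule of thumb: multiply your expected cost per result by 50 weekly events and by the number of test ads to get the weekly budget you’d want to reserve. Whether you apply the threshold per ad or per test is a judgment call; this sketch (with illustrative numbers) assumes per ad.

```python
# Weekly test budget needed to hit the 50-optimization-events rule of thumb,
# applied per test ad. These are illustrative numbers, not Meta guidance.
def weekly_test_budget_needed(expected_cpa, num_ads, events_per_ad_per_week=50):
    return expected_cpa * events_per_ad_per_week * num_ads

# Example: $4 expected cost per result, 3 test ads -> $600/week for the test
print(weekly_test_budget_needed(4, 3))  # -> 600
```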
2. Test ads that are meaningfully different.
If the ads in your test differ only slightly in text or creative, or if they all use the same format, you’re unlikely to get meaningful results. This matters even more given Meta’s emphasis on creative diversity in the age of Andromeda. The more different the ads are, the more meaningful your results will be.
3. Don’t micromanage test results.
Once testing is complete, your ads will continue to run regardless of the creative testing settings. This means Meta will no longer distribute your budget evenly between ads. And Meta may not allocate the most budget to the ad that performed best in the test (or the least budget to the one that performed worst).
The same is true if you test existing combinations of ad copy and creative. You may find that the top performers are different from those favored by Meta.
Resist the urge to overreact to these results, especially when they represent a small sample size. I fully expect some advertisers to get upset here because they assume that what happened during the test should predict what happens after it. It can be frustrating when Meta doesn’t allocate budget in a way that’s consistent with the test results.
4. Test and learn.
Your main goal with the creative testing feature shouldn’t be to fix the way Meta allocates budget. Instead, the goal is to learn from the results you get when the budget is evenly distributed.
What performed well? What didn’t? Apply those lessons to your next batch of ads.
Your turn
Have you tried this tool? What do you think?
Let me know in the comments below!