Before a good Facebook ad campaign comes to life, one must find the right ad ingredients. An ideal path? A/B testing Facebook ads.
Whether you’re a marketer, scientist, chef, or any other professional, I know you’re no stranger to testing your outputs before you officially launch them.
In the advertising world, it’s pretty common to hear the word “testing.”
Marketers with varying degrees of experience in creating, testing, and running ads on Facebook or in any other ad platform acknowledge its importance.
They test ad copy, creatives, headlines, call-to-actions (CTAs), audiences, and placements to help them find what works best based on their KPIs.
On Facebook, there are a few ways available to test your strategies. But there’s one that’s made just to find the best strategy, which is A/B testing Facebook ads.
What is A/B testing on Facebook?
When setting up your ads, you don’t pick the creative, copy, headline, and other parts of it just because your gut tells you to.
You pick them because you know they’ll work. And you know because you’ve gone through hours and days of testing to find which will get you the most results.
That’s what A/B testing Facebook ads is all about. You test your ad’s elements against each other to find out what variant will generate the most clicks, purchases, sign-ups, and other desirable actions for your ad.
You pick a variable to test, select the key metric (which will define the winner), and Facebook allocates equal budgets between the two ad variants you’re testing against each other.
After letting your test run long enough, you can conclude which variation works best. The difference between your variations’ results should be significant to ensure it isn’t down to random chance.
We all know chance isn’t the key to a successful campaign.
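To get a feel for what “significant” means here, you can run the numbers yourself. Below is a minimal sketch using a standard two-proportion z-test — a generic textbook method, not necessarily what Facebook uses internally — with made-up conversion numbers:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Check whether two conversion rates differ significantly.

    conv_a / conv_b: conversions per variant
    n_a / n_b: people reached per variant
    Returns the z statistic and a two-sided p-value.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical example: variant A converts 120/2,400, variant B 165/2,400
z, p = two_proportion_z_test(120, 2400, 165, 2400)
print(f"z = {z:.2f}, p = {p:.4f}")  # p below 0.05 suggests the lift isn't chance
```

A p-value below 0.05 is the conventional cutoff for calling a difference “real” rather than noise.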
Why should you A/B test your Facebook ads?
Facebook A/B tests are made to help you find your best ad ingredients. You can’t just throw in this and that because they look good to you. You need to find out if they’ll work on your audience, not on yourself. That's the best way to boost your conversion rate.
A/B testing Facebook ads is also crucial to lowering your advertising costs. When you know what works best for your target audience, the algorithm can generate your desired results at a lower cost.
Lastly, A/B testing helps you gather more data on your audience. Since you typically target several different audiences, one marketing strategy rarely works on all your customers.
Testing a variety of strategies on your audiences will help gather more information and reach your customers better.
They’re all different, so there should be different ways for you to reach them.
What ad elements can you A/B test?
You can A/B test every variable that makes up your Facebook ads, and here’s a list that covers most of what you can test at each level:
- Campaign objective
- Budget optimization (ad set budget optimization [ABO] vs. campaign budget optimization [CBO])
- Bidding method
- Primary text
- Creative (format, colors, messaging, style, etc.)
- Product sets for Carousel and Dynamic Product Ads
- Landing pages
Simply put, anything you can change in your campaign, ad set, and ad can be split tested.
When it comes to testing creatives, there’s more to it than simply testing two versions of an image or video. You need to be specific about which part of the creatives you’re testing.
Are you testing the colors, messaging, format, or style? Being specific about which element of the creative you’re testing gives you better insights into why a variant wins.
If you don’t specify which part of the creative you’re testing, you won’t find the formula for creatives that convert. The difference-maker could be a single word in your creative’s text, a specific color, and so on.
There’s also the case of testing audiences. Audience size can make a huge difference in split test results, and you need to keep that in mind when testing lookalike audiences (LLAs) or other audiences that differ significantly in size.
Test different LLA percentages to see what size works, but don’t let the gap between the percentages grow too large. Significant differences in audience size can contaminate your results.
Moreover, if you’re testing demographics (such as gender), location, age, and detailed targeting options, be wary about the differences in estimated audience size.
How to launch an A/B test on Facebook
There are three main ways to create an A/B test in Facebook Ads Manager:
1. Create a new campaign for split testing
Just like creating any new campaign, start by clicking on the green “Create” button. Then, choose your campaign objective and name your campaign. Think of an indicative name that will state it’s an A/B test and explain what you’re testing.
At the campaign level, make sure to toggle “Create A/B Test” on.
Next, set up your campaign, ad set(s), and ad(s) according to the element you want to split test and the budget you wish to allocate. We will discuss budget recommendations later in this article.
Once you click “Publish,” you will see the “Create A/B Test” popup. This is where you set up the actual test. Click “Get Started,” and select whether you want to make a copy of the campaign or ad set you’ve just created, or, alternatively, test it against an existing asset in your ad account.
Click “Next,” and you’ll be taken to the next stage, where you should choose the variable you wish to split test and the asset you wish to copy. The variables you can test are creative, audience, placement, or a custom variable:
Creative testing is done at the ad level, but to control the budget each ad receives, Facebook's split testing feature will duplicate your entire ad set. That’s why it’s best to have one ad per ad set for creative testing.
Audience and placement are tested at the ad-set level. This means your ad set will be duplicated in this case too.
However, if you wish to A/B test a different variable, select “custom” in the variable dropdown menu. This will duplicate the entire campaign you’ve just created and let you set up the test manually.
Click “Next” to review and publish your A/B testing campaign. In the Review screen, you can give your test a name (not to mix with the campaign name). Moreover, you need to select the metric you wish to use to select the “winner” of this test.
Lastly, choose a start and an end date for the test. We will discuss the recommended duration later on, but an interesting option could be setting the test to end earlier if a winner is found. This can save you redundant ad spending in case Facebook identifies a clear winner before the end date is reached.
Click “Duplicate Ad Set,” and your test is ready to roll. Once you’re done setting up your test using the Create A/B Test window, you should see your new ad set and its copy with an Erlenmeyer flask icon to its left.
Now you can edit the copy you created according to the element you wish to test and let the Facebook algorithm do its magic.
2. Duplicate an asset to A/B test it against the original
If you have a campaign, ad set, or ad in your ad account that has shown interesting results, you can split test one of its variables to optimize its performance. The easiest way to do this is by duplicating it and A/B testing the copy vs. the original.
There are two ways to do that:
- Clicking the “A/B Test” button: Tick the asset you are interested in testing and click the “A/B Test” button, which is located next to the “Create” button. Alternatively, you can click the button and select the asset on the next screen.
- Duplicating the asset: Hover over the desired asset and click “Duplicate.” In the popup that will appear, select New A/B Test and choose the variable you wish to split test. Then, click “Continue to Test Setup.”
In both cases, the next screen you’ll see would be the “Test Setup” screen. Here you can change the campaign you’ve selected for testing and select your variable (if you haven’t done so previously).
If you wish to test audiences or placements, you will need to select the ad set you wish to duplicate. In case you want to split test creatives, you will need to select the ad you want to test. However, note that Facebook will still duplicate the entire ad set, not just the ad.
If you set up the A/B test at the campaign level, selecting a custom variable will allow you to duplicate the entire campaign and set up the variants yourself.
Note: When split testing a CBO campaign, you will be notified that two new versions of the original campaign will be created. This is because Facebook needs to ensure that the budget will be distributed evenly for equal chances at conversions between the variants you want to test. Alternatively, you can turn off CBO in the original campaign.
Click “Next” to progress to the Test Settings stage. Here you can give your test an indicative name, schedule the start and end dates, and select the key metric that will determine the winner. You can select more than a single metric under “Additional Metrics.”
In this screen, Facebook will also estimate your Test Power. This is the likelihood of detecting a difference in your ads and declaring a winner. Facebook recommends setting up tests with a minimal likelihood of 80%.
Note that Test Power estimation is currently not available for web conversion metrics.
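If you’re curious what a power estimate like this involves, here’s a simple sketch using the normal approximation for comparing two conversion rates. This is a generic textbook calculation, not Facebook’s internal estimator, and the conversion rates below are made up for illustration:

```python
import math

def norm_cdf(x):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def estimated_power(p_a, p_b, n_per_variant):
    """Approximate power: the probability of detecting a real
    difference between conversion rates p_a and p_b at the 5%
    significance level, with n_per_variant people per variant."""
    se = math.sqrt(p_a * (1 - p_a) / n_per_variant
                   + p_b * (1 - p_b) / n_per_variant)
    z_crit = 1.96  # critical value for a two-sided 5% test
    return norm_cdf(abs(p_b - p_a) / se - z_crit)

# Made-up example: 2% vs. 2.5% conversion rate, 5,000 people per variant
power = estimated_power(0.02, 0.025, 5000)
print(f"Estimated power: {power:.0%}")
```

In this example, the estimate lands well below Facebook’s recommended 80%, which tells you the test would need a bigger audience or a longer run to reliably detect that difference.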
Once you click “Review Test,” you will be taken to the duplicated asset you have just created. Now you can adjust it according to what you wish to split test. You can also click “Edit Test Settings” in the right sidebar to adjust your test name, schedule, and key metric.
Once you’re ready, hit “Publish” and let your test run.
3. Split test existing assets
If you have existing campaigns, ad sets, or ads you wish to split test, you can do that in two ways:
Compare Existing Ads in the “Create A/B Test” popup
Click the aforementioned “A/B Test” button in the Ads Manager. At the top of the Test Setup screen, click “Compare Existing Ads.” This tool allows you to A/B test up to five assets at a time. Click “Next,” and the rest of the process is identical to the one described in the previous section.
Facebook Experiments tool
In the Business Manager, open the “All tools” menu by clicking the hamburger button on the left sidebar. Scroll down to “Analyze and report,” and then click “Experiments.”
The Experiments tool allows you to run different kinds of tests: A/B tests, brand surveys, Advantage Campaign Budget (formerly “Campaign Budget Optimization”) tests, and Cross-Channel Conversion Optimization tests. Under “A/B Test,” click “Get Started.”
This tool also allows you to test up to five different assets. These may be ad sets, campaigns, or campaign groups, which are collections of a few campaigns you can create to run a test at a larger scale.
After you select the assets for comparison, schedule the experiment, give your test a name, and set the key metric, click “Review Test Details.”
How long does A/B testing Facebook ads take?
Aside from the variables you want to test, another factor that will define the results of your split test is its duration.
Facebook lets you choose a duration between 1 and 30 days, and picking the right length for your experiment is crucial to its success.
While Facebook recommends you run the experiment for at least seven days, there are still factors that you need to consider when determining how long you should run an A/B test.
The right duration depends on what you want to achieve with your test, your experiment’s budget, the size of your audience, the number of ad variations, and the time it usually takes people to convert after they see your ad.
But if you want to start on solid footing, it’s good to follow Facebook’s recommendation.
Since your customers’ browsing and purchasing habits differ each day of the week, a seven-day test will let you cover all of them.
Another rule of thumb is that each variation should run for at least 5 to 7 days to generate useful results. The exact duration also depends on how long it usually takes your customers to convert.
But if you see one of your variations outperforming the others by a significant margin, you can conclude your experiment sooner.
How much should you spend on A/B testing Facebook ads?
The budget of your split test will be largely determined by your average cost per result, the number of variations you want to test, and how much you’re willing to spend on a test.
For instance, say your cost per result is $15, and you want to test four ad variations. A good starting point is to use ⅓ of your cost per result as the daily budget for each variation.
Note that since the budget is set at the ad-set level, if you wish to test creatives or any other element at the ad level, you better launch each variation in a different ad set to control the spending on each one.
That means the experiment’s budget should be at least $5 per variation. Running them for at least 7 days will bring us to $5 * 4 variations * 7 days = $140.
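That minimum-budget arithmetic generalizes neatly. Here’s a small sketch of the calculation (the ⅓-of-cost-per-result rule of thumb comes from the example above; the function name is my own):

```python
def minimum_test_budget(cost_per_result, n_variations, days=7,
                        budget_fraction=1 / 3):
    """Minimum split-test budget: each variation gets a daily budget
    of budget_fraction * cost_per_result and runs for `days` days."""
    daily_per_variation = cost_per_result * budget_fraction
    return daily_per_variation * n_variations * days

# $15 cost per result, 4 variations, 7 days -> $5 * 4 * 7 = $140
print(f"${minimum_test_budget(15, 4):.2f}")
```

Raising `budget_fraction` to ½ or 1 reproduces the more generous budgets discussed below.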
If you feel like that’s too much, I highly recommend starting with fewer variations. Test two variations for a week and gradually increase your experiment’s size as you see great results.
But if you can spend more than that minimum, there’s no issue with using ½ or even 1x of your cost per result as the daily budget for each variation.
A higher budget increases the chances of your test’s assets driving conversions and producing more valuable insights.
In the end, what matters most is allocating enough budget. You want your variations to have sufficient and equal budgets for even conversion chances and reliable results.
Tips to make your split tests more effective
When done right, A/B testing Facebook ads works wonders for advertisers, even the most seasoned ones.
You might be knowledgeable enough to know what should work for an ad account and what won’t, but you can’t entirely rely on previous results for future campaigns. So test, test, and test. But ensure that your experiments are effective.
Here are some best practices that’ll help you conduct effective tests:
1. Test one variable at a time
In case I haven’t stressed this enough, here it is again: keep your split tests to one variable at a time. A single variable per experiment makes it easier to gain valuable insights into what works.
Having more than one variable in your A/B test results in longer, more expensive tests that will not always give you useful results.
2. Use the right audience
Any audience you include in your split tests should be large enough not to get saturated while your test runs. Even if an audience seems big enough, keep in mind that Facebook divides it between your variants so that each person only sees one of the ads.
If the sample size is too small for your budget and test duration, your ads will encounter delivery issues that will increase your cost per result.
Moreover, when you’re testing elements other than audiences (e.g. creatives), you need to use an audience that you’ve already gotten profitable results from in your past campaigns. This is because you need to ensure that your variants are the ones making a difference, not the audience.
3. Make your hypothesis measurable
A key feature of A/B testing Facebook ads is your choice of metric. In each split test you conduct, Facebook asks how you want it to determine the winning asset (e.g. cost per result, cost per click, etc.).
In conducting any experiment, you first need to come up with the question you want to answer.
For instance, say you want to find the best placement for reaching a broad audience aged 21–25. You can narrow this hypothesis down to a metric like cost per 1,000 impressions (CPM) for each placement you want to test.
4. There should be a significant difference between variations
The point of A/B testing is to gain insights that can help you with your future campaigns. Testing helps you find what works between A, B, and C.
If you don’t make significant changes to the elements you’re testing under each variation, your split test could yield results that aren’t insightful enough to tell you what made a particular variation the winner because they all look the same.
You need to set a budget when split testing, and you should make the spend count by structuring your test in a way that it’ll produce results useful to your business in the future.
Skip A/B testing Facebook ads with Madgicx
Here at Madgicx, we offer valuable insights about what elements of your ads generate the most profit without the need to run multiple A/B tests.
Our data unification and visualization of your top performers and scalable assets offer you better insight into each of your audiences, creatives, and ad copies, even if they’re in separate campaigns and ad sets.
With our Audience Launcher, you can select from 100+ preset audiences for testing, including AI-powered ones, and launch them in a few clicks. To get unified performance data for each audience you launch, you can use our Audience Studio.
With our Creative Insights tool, you can easily identify your scalable and underperforming creatives.
Our Ad Copy Insights tool does exactly the same for your ad copies. All without running dozens of A/B tests to find the best creative and copy. Moreover, you can leverage our object recognition technology to tell which elements in your creatives and ad copies drive results.
Once you’ve rounded up your top-performing assets, you can easily scale them and double down on your success.
Time to split test your Facebook ads
Now that you’ve gotten the hang of what A/B testing Facebook ads is and how to make it an asset for finding what works, it’s about time you go all science-y and launch your experiments!
Please don’t forget these things while you set your test up:
- As much as possible, stick to one variable in each test.
- Make sure the ad variants have a major difference.
- Allocate enough budget to each variation.
- If you test creatives, for example, make sure to use a proven audience, and vice versa.
Once you’re all set, refrain from touching your ads while they’re in the lab. If you do touch them, you’ll contaminate the results, and they won’t be useful.
Now, put your lab (or marketer) coat on and get to it!
Don’t want to spend months and thousands of dollars before knowing what works? Madgicx lets you instantly see the elements in your ads that drive results so you can double down on them and reach your target ROAS way faster.
Natalie values content writing as a way to inform people and ask questions. To her, posing questions sparks her own and others’ curiosity and encourages learning.