Microsoft Advertising has launched a new feature called ‘Experiments’ that lets advertisers test their campaign changes confidently.
“Sometimes, it’s not immediately clear whether a new bidding strategy, setting, or feature is the best move for you. How can you make sure that you’re making the right decisions for your Microsoft Advertising campaigns without wasting precious time and resources? With experiments rolling out globally, you can now test out those campaign changes with full confidence,” state Subha Hari and Piyush Naik, Program Managers at Microsoft Advertising.
How is this useful to me?
With the ‘Experiments’ feature, you can create a duplicate version of your campaign that provides a controlled environment in which to monitor a change, without actually launching it.
In this way, you can run a true A/B test within a campaign and see if a particular update will work well for you and your business.
Some examples of changes you can test include:
- Ad Copy: Test different messages and calls to action in your ads.
- Landing Page URLs: See whether different landing pages perform better.
- Bidding Strategies and Modifiers: Allocate a percentage of your campaign budget to a smart bidding tactic, or try out different bid adjustments.
To get started with ‘Experiments’:
- Open the new Experiments tab on the Campaigns page in Microsoft Advertising, then select the campaign you wish to test.
- Choose the start date and end date.
- Set the experiment split: the percentage of the original campaign’s daily budget and ad traffic that you want to allocate to this experiment. (The Program Managers recommend a 50% split.)
- Check the experiment status to confirm your experiment campaign was created without errors.
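The experiment split above works like a simple budget and traffic division. A minimal sketch of the arithmetic (the budget figure is hypothetical; Microsoft Advertising performs this allocation for you):

```python
def split_budget(daily_budget: float, experiment_split_pct: float) -> tuple[float, float]:
    """Divide a campaign's daily budget between the original campaign
    and its experiment, given the experiment split percentage."""
    experiment_share = daily_budget * experiment_split_pct / 100
    original_share = daily_budget - experiment_share
    return original_share, experiment_share

# With a hypothetical $100 daily budget and the recommended 50% split,
# each campaign gets $50 of budget (and, analogously, half the ad traffic).
original, experiment = split_budget(100.0, 50)
print(original, experiment)  # 50.0 50.0
```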
Below are some tips from Microsoft Advertising:
- Run your experiment in A/A mode for two weeks: keep your original campaign and experiment campaign identical for the first two weeks after creating the experiment. This validates that the experiment performs the same as the original, so you can then run a true A/B test.
- After those two weeks, once you have validated that performance isn’t different in any statistically significant way, you can start the actual A/B test of your experiment.
- Once you make changes to your duplicate campaign, run the experiment for at least two weeks before comparing performance.
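The “statistically significant” check in the A/A step can be sketched as a standard two-proportion z-test on conversion rates. The numbers below are hypothetical, and Microsoft Advertising doesn’t prescribe a specific test; this just illustrates the idea:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, clicks_a: int,
                          conv_b: int, clicks_b: int) -> float:
    """Two-sided z-test for the difference in conversion rate between
    the original campaign (A) and the experiment campaign (B).
    Returns the p-value; a large p-value (e.g. > 0.05) means no
    statistically significant difference was detected."""
    rate_a = conv_a / clicks_a
    rate_b = conv_b / clicks_b
    pooled = (conv_a + conv_b) / (clicks_a + clicks_b)
    std_err = sqrt(pooled * (1 - pooled) * (1 / clicks_a + 1 / clicks_b))
    z = (rate_a - rate_b) / std_err
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical A/A-period data: 120 conversions from 4,000 clicks vs.
# 118 conversions from 3,950 clicks.
p_value = two_proportion_z_test(120, 4000, 118, 3950)
print(f"p-value: {p_value:.3f}")  # well above 0.05: no significant difference
```

If the A/A period passes a check like this, the duplicate campaign is behaving like the original, and differences observed later in the A/B phase can be attributed to the change you made.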