Compare ad performance within an ad group
A single ad group test compares the performance of the ads within one ad group. This is the most popular ad testing approach and is applicable to all PPC accounts. For other types of ad testing, see the related articles.
In Adalysis, you can test your existing ads based on their historical performance. You don't need to start any tests or create new ads. Instead, Adalysis tests existing ads automatically, wherever two or more ads compete. New ads will be included in tests automatically, for both text and image ads.
A single ad group test needs at least two active ads to run. For image ads, only ads of the same size are tested against each other. By default, Adalysis compares a standard set of metrics for every test. However, you can customize which metrics are compared.
Adalysis automatically runs single ad group tests wherever it finds competing ads. This is a major time-saver compared to manual approaches. However, you can also create manual tests.
Find your test results under Ad testing > Single ad group.
Your Adalysis account is synchronized with your PPC account every day at around 4am. Adalysis will then:
Scan your ad groups and calculate the date ranges during which the active ads within an ad group were running simultaneously, based on the creation date of each ad (a sketch of this overlap calculation follows below).
Run an ad test using the performance data of that common date range.
All dates and times are based on the time zone of your Google Ads or Microsoft Ads account.
You can see the statistically significant results (those compatible with your threshold settings) on the Automated daily tests tab. Test results from the previous day are removed.
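Adalysis doesn't publish its exact algorithm, but the scan step amounts to an interval intersection: the common date range starts at the latest ad creation date and ends at the earliest pause (or today). Below is a minimal sketch under that reading; the Ad dataclass and common_date_range helper are hypothetical names for illustration, not part of any Adalysis or Google Ads API.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Ad:
    name: str
    created: date           # hypothetical field: day the ad started serving
    paused: date | None     # None means the ad is still active

def common_date_range(ads: list[Ad], today: date) -> tuple[date, date] | None:
    """Widest window during which every ad in the group was serving at once."""
    start = max(ad.created for ad in ads)          # latest start wins
    end = min(ad.paused or today for ad in ads)    # earliest stop wins
    return (start, end) if start <= end else None  # None: the ads never overlapped

ads = [Ad("Ad A", date(2024, 1, 1), None),
       Ad("Ad B", date(2024, 2, 15), None)]
print(common_date_range(ads, today=date(2024, 6, 1)))
# (datetime.date(2024, 2, 15), datetime.date(2024, 6, 1))
```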
You can also manually run the tests yourself using different parameters. For example:
Specify a date range. Using the "common date range" option mimics how the automated tests run.
Override the current threshold values for this test run. (Your global threshold values will stay unchanged.)
Choose to see all test results irrespective of confidence level.
Here are some common reasons Adalysis users choose additional manual tests:
Compare test results with different thresholds.
Test different date ranges to compare against the automated tests.
Test a subset of ads within the ad group, e.g. only mobile-preferred ads.
See results that aren't statistically significant.
The single ad group test results show:
The date range used for the test.
The number of active ads found and tested for each ad group.
The algorithm's confidence for each test metric (see the sketch after this list). A confidence level of less than 90% is displayed as --.
The aggregate performance of all ads tested.
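Adalysis doesn't document which statistical test produces these confidence figures. As one plausible illustration, the confidence for a CTR comparison between two ads can be computed with a two-sided two-proportion z-test; the ctr_confidence helper below is hypothetical, not an Adalysis function.

```python
from math import sqrt
from statistics import NormalDist

def ctr_confidence(clicks_a: int, imps_a: int, clicks_b: int, imps_b: int) -> float:
    """Confidence (0..1) that two ads' CTRs truly differ (two-proportion z-test)."""
    p_a, p_b = clicks_a / imps_a, clicks_b / imps_b
    pooled = (clicks_a + clicks_b) / (imps_a + imps_b)
    se = sqrt(pooled * (1 - pooled) * (1 / imps_a + 1 / imps_b))
    z = abs(p_a - p_b) / se
    return 2 * NormalDist().cdf(z) - 1   # two-sided confidence level

conf = ctr_confidence(clicks_a=120, imps_a=3000, clicks_b=90, imps_b=3100)
print(f"{conf:.1%}")   # ~98.1%; anything under 90% would be shown as --
```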
Click on any ad group name to see:
1. The winner and loser ads for each metric:
One winner ad for each metric with enough data, highlighted in green.
One loser ad for each metric with enough data, highlighted in red.
Neutral ads (neither winners nor losers), highlighted in yellow.
Metrics without enough data won't show a winner or loser.
2. The confidence the algorithm has in the result.
3. The projected performance improvement if you pause the loser ad.
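The exact projection formula isn't documented. A simple first-order estimate, shown below, assumes the loser's impressions would instead be served at the winner's CTR; projected_extra_clicks is a hypothetical helper for illustration.

```python
def projected_extra_clicks(winner_ctr: float, loser_ctr: float,
                           loser_impressions: int) -> float:
    """Clicks gained if the loser's impressions were served at the winner's CTR."""
    return loser_impressions * (winner_ctr - loser_ctr)

# e.g. winner CTR 4.0%, loser CTR 2.9%, loser served 3,100 impressions
print(f"{projected_extra_clicks(0.040, 0.029, 3100):.0f} extra clicks")  # ~34
```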
Depending on your results, you may choose to update your ad groups:
Pause the losing ad. All changes in Adalysis are pushed immediately to Google Ads or Microsoft Ads. (A sketch of the underlying API operation follows this list.)
Create a new ad to replace the losing one.
Delete a test result or mark a test result as 'analyzed'. This changes the test result's color in the main list, so you can keep track of the results you've looked into.
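Adalysis performs the pause for you through its UI, so no code is needed. For context only, pausing an ad through the official google-ads Python client looks roughly like the sketch below, which follows the pattern of Google's published pause_ad example; the IDs are placeholders you would supply.

```python
from google.ads.googleads.client import GoogleAdsClient
from google.api_core import protobuf_helpers

def pause_ad(client: GoogleAdsClient, customer_id: str,
             ad_group_id: str, ad_id: str) -> None:
    """Set an ad's status to PAUSED via the Google Ads API."""
    service = client.get_service("AdGroupAdService")
    operation = client.get_type("AdGroupAdOperation")
    ad = operation.update
    ad.resource_name = service.ad_group_ad_path(customer_id, ad_group_id, ad_id)
    ad.status = client.enums.AdGroupAdStatusEnum.PAUSED
    client.copy_from(operation.update_mask,
                     protobuf_helpers.field_mask(None, ad._pb))
    service.mutate_ad_group_ads(customer_id=customer_id, operations=[operation])

client = GoogleAdsClient.load_from_storage("google-ads.yaml")  # your credentials
pause_ad(client, customer_id="1234567890", ad_group_id="111", ad_id="222")
```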
You can navigate through the test results using the left/right arrows, or click Back to return to the full list of test results.
Record notes about your ad testing progress or plans using the notes feature.