
Testing by label

Split-test sets of ads across ad groups or campaigns based on labels

Last updated 2 months ago

Multi ad group testing using labels helps you to split-test sets of ads across multiple ad groups or campaigns. Labels allow you to include and exclude ads as needed.

This approach can be useful for low-volume accounts that don’t accrue enough data to reach statistical significance with single ad group testing.

How to run multi ad group tests with labels

Manual tests

You can run multi ad group tests as often as you need. Go to Ad testing > Multi ad group > Test by labels:

  1. Select your campaigns and the date range for this test. If you choose a common date range, Adalysis calculates the range during which all ads within the test were active.

  2. Specify the ad labels you want to use for comparing your ad performance. You can also exclude ads in the Label contains none column.

  3. Optional: Select the banner sizes you want to compare.

  4. Click Find current winners.

  5. If you want Adalysis to run this test automatically, click Save & run this test daily.

Be careful when using a common test date range: if even one matching ad out of hundreds was enabled only a few days ago, the whole test will run on only the last few days of data.
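The warning above follows from how a common date range works: the range can only start once every ad in the test is active, so its start is effectively pulled forward to the most recently enabled ad. A minimal sketch of that idea (this is an illustration only, not Adalysis's actual implementation):

```python
from datetime import date

def common_date_range(selected_start, selected_end, ad_enabled_dates):
    """Illustrative sketch: a 'common' range is one during which every
    ad in the test was active, so the start moves forward to the most
    recent enabled date among the ads."""
    start = max([selected_start] + list(ad_enabled_dates))
    return start, selected_end

# One recently enabled ad shrinks the range for the whole test:
start, end = common_date_range(
    date(2024, 1, 1), date(2024, 3, 31),
    [date(2023, 11, 5), date(2024, 3, 25)],  # second ad enabled late
)
# start is 2024-03-25: only the last few days of data are used
```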

Daily test runs

Unlike single ad group tests, multi ad group tests must be defined before Adalysis can automate them. Save a manual test by clicking Save & run this test daily, or choose Define an automated test.

Understanding the test result data

Alongside each label's aggregate performance over the date range used, you'll see the confidence the algorithm has in each metric, plus performance boost projections based on pausing all ads with the losing label(s).

Review the details of each label you included or excluded, including how many ad groups and ads it applied to and the individual ads under each label. (Included labels are shown in blue; excluded labels in red.)

Bulk actions

You can pause all ads with the losing label(s) for a specific metric. Once you pause the ads, the test is marked as archived and you can always find it under the Archived test results tab.
