
A/B Testing Google Ads

We ran 847 A/B tests last year. Here's what moved the needle on bidding performance.

What Is A/B Testing in Google Ads?

A/B testing (also called split testing) is the practice of comparing two versions of a campaign element to determine which performs better. In Google Ads, you can test bid strategies, ad copy, landing pages, targeting settings, and more.

The process is simple:

  1. Create two versions (A and B) of the element you want to test
  2. Split your traffic evenly between both versions
  3. Run the test until you reach statistical significance
  4. Implement the winning version

While the concept is straightforward, the execution requires discipline, statistical rigor, and a systematic approach to avoid false conclusions.
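Step 2 above (splitting traffic evenly) is worth pausing on. Google Ads Experiments handles the split for you, but the principle is simple enough to sketch. The snippet below is an illustrative sketch only, assuming you have some stable visitor identifier (the `user_id` parameter here is hypothetical): hashing the id gives a deterministic 50/50 assignment, so the same visitor always lands in the same variant.

```python
import hashlib

def assign_variant(user_id: str) -> str:
    """Deterministically assign a visitor to variant A or B (50/50 split).

    Hashing a stable identifier keeps the assignment consistent across
    sessions, so a returning visitor never flips between variants.
    `user_id` is a stand-in for whatever identifier your tracking provides.
    """
    digest = hashlib.sha256(user_id.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Over many visitors, the split comes out close to even.
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"user-{i}")] += 1
print(counts)  # roughly 50/50
```

In practice you would let the platform's experiment feature do this; the point is that the split must be random with respect to user quality, or the comparison in step 3 is meaningless.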

The 5 Bidding Elements We Test Most

Based on 847 tests conducted across our client accounts, these five bidding elements deliver the most significant performance improvements:

1. Bid Strategy (Manual vs. Smart Bidding)

The debate between manual bidding and Google's automated Smart Bidding continues. Our testing reveals that the answer depends on your account maturity and data volume.

  • Manual CPC bidding: You set maximum cost-per-click bids for each keyword or ad group
  • Smart Bidding (Target CPA): Google's machine learning automatically adjusts bids in real-time toward your cost-per-acquisition target

When to test: If your account has 30+ conversions per month, test Smart Bidding. Below that threshold, manual bidding typically performs better.

2. Target CPA Adjustments

Once you've chosen a bid strategy, the next variable is your target CPA. Many advertisers set this once and never revisit it—a costly mistake.

We test incremental adjustments (±10%, ±20%) to find the optimal target that balances volume and efficiency.
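As a minimal sketch of what "incremental adjustments" means in practice, the helper below enumerates the ±10% and ±20% variants around a current target (the function name and output format are ours, for illustration):

```python
def cpa_test_targets(current: float, steps=(0.10, 0.20)) -> dict:
    """Enumerate target CPA test variants around the current target."""
    variants = {"baseline": current}
    for s in steps:
        variants[f"+{int(s * 100)}%"] = round(current * (1 + s), 2)
        variants[f"-{int(s * 100)}%"] = round(current * (1 - s), 2)
    return variants

print(cpa_test_targets(75.0))
# {'baseline': 75.0, '+10%': 82.5, '-10%': 67.5, '+20%': 90.0, '-20%': 60.0}
```

You test one variant at a time against the baseline, never all four at once, so each comparison stays a clean A/B.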

3. Bid Adjustments by Device

Mobile, desktop, and tablet traffic convert at different rates for most businesses. Device bid adjustments allow you to pay more (or less) for each device type.

Example: If mobile converts at 60% the rate of desktop, you might test a -40% mobile bid adjustment.
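The arithmetic behind that example is just the segment's conversion rate relative to the baseline. A minimal sketch (assuming equal value per conversion; if conversion values differ by segment, you would weight by value instead):

```python
def bid_adjustment_pct(segment_conv_rate: float, baseline_conv_rate: float) -> int:
    """Bid adjustment (%) that scales bids in proportion to relative conversion rate."""
    return round((segment_conv_rate / baseline_conv_rate - 1) * 100)

# Mobile converts at 60% of the desktop rate -> start by testing -40% on mobile.
print(bid_adjustment_pct(0.03, 0.05))  # -40
```

The same formula applies to any segment-level modifier: a region converting at twice the baseline rate suggests testing a +100% adjustment, and so on. Treat the output as a starting hypothesis to test, not a value to set and forget.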

4. Dayparting Bid Modifiers

Not all hours of the day are created equal. B2B advertisers often see better conversion rates during business hours (9am-5pm), while e-commerce may peak in evenings.

We test bid increases during high-performing hours and bid decreases (or pauses) during low-performing times.

5. Geographic Bid Multipliers

If you serve multiple locations, some geographic areas likely convert better than others. Geographic bid adjustments let you pay more for high-value locations.

Example: If New York converts at 2x the rate of other states, test a +100% bid adjustment for New York traffic.

Real Test Results from Stratagem Campaigns

Here are actual results from A/B tests we've conducted for clients. These aren't hypothetical examples—these are real campaigns with real budgets.

| Test | Original (Control) | Variant (Test) | Winner | Improvement |
|---|---|---|---|---|
| Manual vs. Smart Bidding | Manual CPA: $45 | Smart Bidding: $32 | Smart Bidding | -28% CPA |
| Desktop Bid Adjustment | 1.0x (baseline) | 1.3x (+30%) | 1.3x | +42% Conv. Rate |
| Evening Dayparting | 1.0x (6pm-9pm) | 1.5x (+50%) | 1.5x | +31% ROAS |
| Target CPA Adjustment | $75 target | $90 target (+20%) | $90 target | +67% Volume |
| Geographic Adjustment | Uniform bids | +50% top 3 states | +50% adjustment | +24% Total Conv. |

Key Insight: The Smart Bidding test shows a 28% CPA reduction, but this was for an account with 150+ monthly conversions. For accounts with fewer conversions, manual bidding often outperforms Smart Bidding.

How We Run 20+ Tests Per Month Per Client

Volume matters in A/B testing. The more tests you run, the faster you optimize. Here's our systematic approach to running 20+ tests per month for each client:

1. Testing Calendar (Planned 90 Days Ahead)

We maintain a rolling 90-day testing calendar for each client. This ensures we're always testing something and never waste time deciding what to test next.

  • Week 1: Bid strategy test (Manual vs. Smart)
  • Week 2: Device bid adjustment test
  • Week 3: Geographic multiplier test
  • Week 4: Dayparting test

2. Statistical Significance Calculator

We don't end tests based on gut feeling. Every test runs until it reaches 95% statistical confidence, with a minimum runtime of 14 days to smooth out day-of-week effects.

Our calculator considers conversion volume, conversion rate, and variance to determine when a test has reached significance.
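For readers who want to sanity-check results themselves, the core of such a calculator is a standard two-proportion z-test on conversion rates. This is a minimal sketch with illustrative numbers, not our production tool:

```python
from statistics import NormalDist

def significance(conv_a: int, clicks_a: int, conv_b: int, clicks_b: int):
    """Two-proportion z-test: returns (z, two-tailed confidence) for B vs. A."""
    p_a, p_b = conv_a / clicks_a, conv_b / clicks_b
    p_pool = (conv_a + conv_b) / (clicks_a + clicks_b)  # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / clicks_a + 1 / clicks_b)) ** 0.5
    z = (p_b - p_a) / se
    confidence = 2 * NormalDist().cdf(abs(z)) - 1
    return z, confidence

# Control: 40 conversions / 1,000 clicks; variant: 60 / 1,000.
z, conf = significance(40, 1000, 60, 1000)
print(f"z = {z:.2f}, confidence = {conf:.1%}")
if conf >= 0.95:
    print("Test is conclusive -- implement the winner")
```

With these illustrative numbers the difference clears the 95% bar; with half the traffic it would not, which is exactly why low-volume accounts need longer test windows.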

3. Automated Reporting

We use custom scripts to automatically pull test results into dashboards. This eliminates manual data entry and ensures real-time visibility into test performance.

4. Winner Implementation Within 48 Hours

Once a test reaches significance, we implement the winning variant within 48 hours. Speed matters—every day you delay implementation is money left on the table.

The $127,000 Mistake We Prevented with A/B Testing

Case Study: B2B SaaS Client

A client was convinced they needed to switch to fully automated Smart Bidding. Industry articles and Google representatives recommended it, and it seemed like the logical next step.

What we did: Instead of making the switch account-wide, we ran an A/B test on 50% of campaigns.

The result: Manual bidding with smart bid adjustments delivered a 23% better CPA than fully automated Smart Bidding.

Annual savings: $127,000 in wasted ad spend prevented by testing rather than assuming.

This case study illustrates why testing trumps assumptions—even when those assumptions are backed by "expert" advice.

Your 30-Day A/B Testing Roadmap

Ready to implement systematic A/B testing in your Google Ads account? Follow this 30-day roadmap:

  • Week 1: Bid Strategy Test – Compare manual vs. automated bidding for your highest-volume campaign.
  • Week 2: Device Bid Adjustment Test – Test increased bids for your best-converting device (usually desktop for B2B).
  • Week 3: Geographic Multiplier Test – Increase bids for your top 3 converting states/regions by 25-50%.
  • Week 4: Dayparting Test – Increase bids by 30-50% during your highest-converting hours.

By the end of 30 days, you'll have concrete data on four major optimization levers—and a systematic process for continuous testing.

Tools We Use for Split Testing

You don't need expensive tools to run effective A/B tests. Here's our tech stack:

  • Google Ads Experiments (Free): Built into Google Ads, this tool allows you to create campaign experiments that automatically split traffic.
  • Stratagem Proprietary Testing Dashboard: We've built custom dashboards that aggregate test results and calculate statistical significance automatically.
  • Statistical Significance Calculator: Free tools like Optimizely's calculator or VWO's calculator help determine when tests are conclusive.
  • Automated Alert System: Google Ads scripts that email us when tests reach significance or when performance drops unexpectedly.

"Before Stratagem, we made changes to our Google Ads based on hunches. Now we test everything. In six months, we've identified 11 winning optimizations that collectively reduced our CPA by 61%. A/B testing isn't optional—it's the difference between guessing and knowing."

Jennifer Park

VP of Marketing, CloudSecure

Get a Free A/B Testing Audit

Curious what A/B testing could uncover in your Google Ads account? Stratagem Systems offers a free A/B testing audit where we'll:

  • Analyze your current campaign structure
  • Identify 5-7 high-impact tests to run immediately
  • Estimate potential savings from optimization (typically $10,000+)
  • Provide a 90-day testing roadmap

Request your free audit today or learn more about our Google Ads management services.