Google Ads is probably your highest-ROAS channel. It says so right there in the dashboard — branded search returning 10x, Performance Max at 5x, the whole account looking like a money printer.
The problem is that Google is grading its own homework, and the methodology guarantees a good grade. That doesn't mean Google Ads isn't working. For most brands, it is. But the numbers Google shows you are structurally inflated in ways that distort your budget decisions — and unlike Meta's measurement problems, which have gotten a lot of attention since iOS 14, Google's inflation flies under the radar because the numbers look so good that nobody questions them.
This guide covers exactly how Google Ads overstates its contribution, what the reported ROAS actually measures, and how to independently verify whether your Google spend is driving real incremental revenue. (For the Meta side of this equation, see our guide on how to measure Meta Ads ROI. For a broader overview of measurement approaches, see MMM vs. attribution vs. incrementality testing.)
The branded search problem is worse than you think
This is the single biggest source of inflated ROAS in Google Ads, and most advertisers either don't know about it or actively ignore it because the numbers feel too good to give up.
Here's what happens. Someone sees your Meta ad, your TikTok video, or hears about you on a podcast. They get interested. Instead of clicking the ad, they go to Google and type your brand name. Your branded search ad appears at the top. They click it. They buy.
Google Ads takes full credit for that sale. ROAS looks incredible — often 8x, 10x, even 20x on branded campaigns. But the ad didn't cause the purchase. It intercepted a customer who was already coming to buy. The demand was created somewhere else entirely.
The proof is straightforward. If you pause branded search ads for two weeks, most of those clicks shift to your organic listing directly below. Google's own research has acknowledged this — incremental click studies consistently show that a large percentage of paid branded clicks would have happened organically. For many brands, the incremental lift from branded search ads is 10-20% of what Google's last-click attribution claims.
This means a branded search campaign reporting 15x ROAS might have a true incremental ROAS closer to 2x-3x. That's still profitable, but it completely changes how much you should be spending and whether that budget would be better deployed elsewhere. For a deeper dive on how this affects ecommerce brands specifically, see our DTC MMM guide.
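The arithmetic behind that adjustment is simple enough to sketch. All numbers below are illustrative assumptions, not measurements: the incrementality factor is the share of paid branded conversions that would not have happened organically, which you can only learn from a holdout test.

```python
# Hypothetical numbers: discount reported branded-search ROAS by an
# assumed incrementality factor (share of claimed conversions that
# were actually caused by the ad, per the 10-20% range above).
reported_roas = 15.0      # what the Google Ads dashboard shows
incrementality = 0.15     # assumption -- only a holdout test can tell you this

true_roas = reported_roas * incrementality
print(f"Incremental ROAS: {true_roas:.2f}x")
```

The point of the exercise is not the exact number but the order of magnitude: a 15x dashboard figure and a 2x-3x incremental figure lead to very different budget decisions.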
Performance Max is a black box by design
Performance Max campaigns combine Search, Shopping, Display, YouTube, Discovery, and Gmail into a single campaign type that Google optimizes automatically. The pitch is that Google's AI finds the best placements for your budget. The measurement problem is that you can't see where your money actually went.
Google reports aggregate ROAS for the entire Performance Max campaign, but the breakdown across placements is limited. Your 5x ROAS might be 15x on branded Shopping (intercepting existing demand, same problem as branded search), 2x on non-branded Search, and 0.3x on Display — but Google blends these into a single number that looks great.
The opacity is the point. If you could see that Display placements within Performance Max were running at negative ROI, you'd exclude them. Google doesn't want that — Display inventory needs buyers, and bundling it with high-performing Search and Shopping placements makes it palatable.
For MMM purposes, Performance Max creates a specific challenge: your Google spend is a single line item that mixes high-intent search with low-intent display. When your MMM shows a Google ROAS of 3x but your Google dashboard says 6x, the gap has two causes working in opposite directions: branded search inflating the reported number, and low-ROI Display inventory inside PMax dragging down the true one.
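A quick worked example makes the blending effect concrete. The placement-level split below is entirely assumed, since Google's reporting does not expose it; the point is how a spend-weighted average hides wildly different placement economics.

```python
# Assumed placement-level breakdown inside a single PMax campaign:
# placement name -> (spend, attributed revenue)
placements = {
    "branded_shopping":  (3_000, 45_000),   # 15.0x - intercepted demand
    "nonbranded_search": (4_000,  8_000),   #  2.0x
    "display":           (3_000,    900),   #  0.3x
}

total_spend = sum(spend for spend, _ in placements.values())
total_revenue = sum(rev for _, rev in placements.values())
blended_roas = total_revenue / total_spend

print(f"Blended PMax ROAS: {blended_roas:.1f}x")   # a healthy-looking single number
for name, (spend, rev) in placements.items():
    print(f"  {name}: {rev / spend:.1f}x")
```

With these assumed numbers the campaign reports roughly 5.4x overall, even though nearly a third of the budget is running at 0.3x.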
Conversion tracking inflation is baked into the system
Beyond branded search and PMax opacity, Google's conversion tracking itself has structural biases.
Data-driven attribution spreads credit generously. Google's default attribution model gives partial credit to every Google touchpoint in the conversion path. If someone saw a Display ad, clicked a non-branded search ad, and then converted through branded search, all three Google touchpoints get credit. But none of the non-Google touchpoints that influenced the journey — the podcast ad, the friend's recommendation, the Meta retargeting — exist in Google's data. Google's attribution model is comprehensive within Google and completely blind outside of it.
Conversion windows are long. The default click-through window is 30 days for many conversion actions. Someone clicks a Google ad on April 1st, does nothing, then returns directly on April 28th and buys. Google claims the conversion. Whether that click was actually the reason they came back a month later is unknowable — but Google counts it.
Enhanced conversions model what they can't track. Similar to Meta's modeled conversions, Google uses first-party data and modeling to estimate conversions it can't directly observe. This improves accuracy compared to having no signal at all, but it's still Google estimating Google's value using Google's methodology.
What Google ROAS actually tells you
Like Meta ROAS, Google ROAS is useful as a relative signal within Google's ecosystem. Campaign A at 6x ROAS is almost certainly outperforming Campaign B at 2x ROAS within Google Ads. The inflation affects both similarly, so relative rankings are meaningful.
Where it fails is exactly where most brands rely on it: absolute budget decisions and cross-channel comparisons. When Google says 6x and Meta says 3x, you cannot conclude Google is twice as effective. Google's number includes branded search inflation, long conversion windows, and data-driven attribution spreading credit across Google's own touchpoints. Meta's number includes view-through inflation and modeled conversions. They're measuring different things on different scales.
How to independently measure Google's actual contribution
Three approaches, same framework as measuring Meta — because the principle is identical: don't let any platform grade its own homework.
1. Marketing mix modeling. MMM estimates Google's contribution by examining the statistical relationship between your Google spend and your total sales over time. It doesn't use Google's conversion tracking at all. It just asks: when Google spend went up, did total sales go up proportionally, after accounting for other channels and seasonality?
For most brands, MMM produces a Google ROAS that's significantly lower than what Google Ads reports — often 40-60% lower. The gap is mostly explained by branded search inflation. When the model separates the effects of all your channels simultaneously, it recognizes that branded search volume is driven by your other marketing. The spend you're putting on branded search ads is largely intercepting demand that other channels created.
You can run this analysis for free with CheapMMM. Upload a CSV with weekly revenue and weekly spend per channel. If you can break Google spend into branded vs. non-branded, do it — that separation alone produces dramatically more useful output. For guidance on structuring your data, see our data preparation guide.
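As a sketch of what that input file looks like, here is a minimal weekly CSV with Google split into branded and non-branded. The column names are assumptions for illustration; check the data preparation guide for the exact schema the tool expects.

```python
# Minimal sketch of a weekly MMM input file. Column names and figures
# are assumed -- the key structural choice is one row per week, one
# spend column per channel, with Google split branded vs. non-branded.
import csv
import io

rows = [
    {"week": "2024-04-01", "revenue": 120_000,
     "google_branded": 4_000, "google_nonbranded": 9_000,
     "meta": 15_000, "tiktok": 5_000},
    {"week": "2024-04-08", "revenue": 131_000,
     "google_branded": 4_200, "google_nonbranded": 9_500,
     "meta": 16_000, "tiktok": 5_000},
]

buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=list(rows[0].keys()))
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
```

Keeping branded and non-branded Google spend in separate columns is what lets the model estimate their very different incremental contributions.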
2. Branded search holdout test. Before running a full geo holdout, start with the test that produces the single most valuable insight per unit of effort: pause your branded search campaigns for two weeks and measure what happens.
Track three things during the test: total branded clicks (paid + organic combined), total conversions from branded traffic, and total revenue. If total branded conversions barely move — with organic picking up most of the paid clicks — then your branded search spend was mostly intercepting organic demand. If conversions drop meaningfully, branded search is doing real work.
This test terrifies most paid search managers because it means watching Google Ads ROAS collapse temporarily. But the question it answers — "am I paying for clicks I'd get for free?" — is worth more than any dashboard metric.
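The readout from that test reduces to two numbers: how much of the paused paid traffic organic absorbed, and how much total branded conversions actually fell. A toy before/during comparison, with all figures assumed:

```python
# Assumed two-week totals before and during a branded-search pause.
before = {"paid_clicks": 8_000, "organic_clicks": 5_000, "conversions": 650}
during = {"paid_clicks": 0, "organic_clicks": 12_100, "conversions": 620}

# Share of the paused paid clicks that shifted to the organic listing.
recovered_clicks = during["organic_clicks"] - before["organic_clicks"]
recovery_rate = recovered_clicks / before["paid_clicks"]

# Change in total branded conversions -- the number that actually matters.
conversion_drop = 1 - during["conversions"] / before["conversions"]

print(f"Organic absorbed {recovery_rate:.0%} of the paused paid clicks")
print(f"Total branded conversions fell {conversion_drop:.1%}")
```

In this hypothetical, organic absorbs roughly 89% of the paid clicks and conversions fall under 5%, which would say the branded spend was mostly interception. A large conversion drop would say the opposite.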
3. Geo holdout incrementality test. The most rigorous approach. Suppress all Google Ads (or specific campaign types) in a set of test markets for 4-8 weeks while keeping them running in control markets. Compare sales between groups. The difference is Google's causal contribution.
For non-branded search and Shopping, geo holdouts typically confirm that Google is contributing real incremental value — often 60-80% of what Google's attribution claims. For branded search, the incremental contribution is usually far lower than reported. The combined picture is a Google Ads ROAS that's real and positive, but not as high as the dashboard suggests.
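The geo-holdout readout is a simple difference between matched market groups. The numbers below are assumed, and a real test would need markets matched on baseline sales (or sales expressed per capita) before the comparison is valid:

```python
# Assumed results from matched market groups over the test window.
control_sales = 500_000   # markets where Google Ads kept running
test_sales = 455_000      # markets where Google Ads was suppressed
suppressed_spend = 30_000 # spend that would have gone to the test markets

# The sales gap is Google's causal contribution in those markets.
incremental_revenue = control_sales - test_sales
incremental_roas = incremental_revenue / suppressed_spend

print(f"Incremental revenue: ${incremental_revenue:,}")
print(f"Incremental ROAS: {incremental_roas:.1f}x")
```

That incremental ROAS, not the dashboard number, is the figure to compare against your marginal cost of capital when deciding whether to scale spend.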
The Google-Meta interaction most brands miss
Google and Meta don't operate independently, and measuring either one in isolation gives you the wrong answer for both.
The most common pattern: Meta prospecting creates awareness. A percentage of those people go to Google and search your brand name. Google's branded search ad captures the click and takes credit. In Google's reporting, branded search looks like your best-performing campaign. In Meta's reporting, those conversions don't exist — Meta lost tracking at the brand search step.
An MMM surfaces this interaction by looking at all channels simultaneously. When Meta spend increases and branded search conversions increase in the same periods, the model recognizes the correlation and attributes the demand creation to Meta rather than to the branded search ad that intercepted it.
The practical implication is that brands that evaluate Google and Meta independently — using each platform's own reporting — systematically overspend on branded search and underspend on prospecting. MMM corrects this by providing a single, cross-channel view where each dollar of spend is attributed once. For a detailed walkthrough of what to do with these numbers, see our guide on how to interpret MMM results.
What to do with Google's numbers in the meantime
Use Google ROAS for within-platform optimization. Which campaigns, ad groups, and keywords are performing best relative to each other? Google's reporting is fine for this. Use it to manage bids, test ad copy, and allocate within your Google budget.
Break out branded vs. non-branded. This is the single highest-leverage change you can make to your Google Ads reporting. When branded and non-branded are combined, branded search's inflated ROAS pulls up the average and masks the true performance of your non-branded campaigns. Separate them in both your Google reporting and your MMM input data.
Don't use Google ROAS for cross-channel decisions. Moving budget from Meta to Google because Google reports higher ROAS is one of the most common and most expensive mistakes in digital marketing. The ROAS numbers aren't comparable. Use an MMM for cross-channel allocation.
Question Performance Max ROAS. If PMax is your highest-ROAS campaign, ask yourself what percentage of that is branded Shopping and branded Search — demand that would have converted anyway. If you can't answer that question from Google's reporting (and you usually can't), that's the problem.
A realistic framework for Google Ads measurement
Use platform reporting daily for campaign management. Trust the relative rankings within Google. Don't trust the absolute numbers for budget decisions.
Run an MMM quarterly with CheapMMM to get an independent view of Google's actual contribution vs. other channels. Feed it weekly data with Google spend broken into branded and non-branded if possible. Compare the MMM ROAS to Google's reported ROAS — the gap tells you how much inflation is in your current numbers.
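That gap is easiest to track as a single inflation factor. Numbers assumed for illustration:

```python
# Assumed figures: express the dashboard-vs-MMM gap as an inflation factor.
google_reported_roas = 6.0   # from the Google Ads dashboard
mmm_roas = 3.2               # from the independent model

inflation_factor = google_reported_roas / mmm_roas
print(f"Reported ROAS is {inflation_factor:.2f}x the independently measured ROAS")
```

Watching this factor quarter over quarter tells you whether changes in your account (more branded spend, more PMax) are making the dashboard more or less trustworthy.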
Run a branded search holdout test at least once. The answer to "what happens when I pause branded search" is the single most valuable data point in your entire measurement practice. Most brands discover they can cut 30-50% of branded search spend and reinvest it in channels that create demand rather than intercept it.
This isn't about distrusting Google. It's about recognizing that every platform has a structural incentive to overcount its own contribution — and building a measurement practice that accounts for that.