Every marketing dashboard tells the same story: campaigns are working, ROAS looks healthy, the numbers are green. But there's a question most businesses can't answer honestly: Would those customers have bought anyway?
The uncomfortable truth, backed by hundreds of experiments: 30–70% of platform-attributed conversions would have happened without any ads at all.
This isn't a rounding error. It's one of the largest inefficiencies in modern business, and it's why companies like Uber, Airbnb, and P&G have fundamentally restructured how they measure marketing.
The Problem With Platform-Reported ROAS
ROAS, or Return on Ad Spend, divides revenue attributed to ads by ad spend. Simple enough. The problem is the word "attributed."
When a customer sees a Facebook ad Monday, searches your brand on Google Tuesday, and buys Wednesday, both Meta and Google claim credit for that sale. Your spreadsheet shows two conversions. Your bank account shows one.
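To see how the math breaks, here is a minimal sketch using hypothetical order-level exports from two platforms. The order IDs are invented, but the pattern is what deduplicating platform reports against actual orders typically reveals.

```python
# Hypothetical order IDs each platform claims credit for (illustrative data only).
meta_attributed = {"A101", "A102", "A103", "A104"}
google_attributed = {"A103", "A104", "A105"}

# What the two dashboards add up to...
dashboard_total = len(meta_attributed) + len(google_attributed)   # 7 "conversions"

# ...versus the distinct orders your bank account actually saw.
actual_orders = meta_attributed | google_attributed               # 5 orders
overlap = meta_attributed & google_attributed                     # claimed twice

print(f"Dashboard conversions: {dashboard_total}")
print(f"Actual orders:         {len(actual_orders)}")
print(f"Double-counted:        {len(overlap)}")
```

Even this naive dedupe only catches double-counting; it still says nothing about whether either ad caused the purchase.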
This isn't theoretical. A meta-study of 432 display ad experiments found platforms claimed credit for vastly more activity than they actually caused, with median actual lifts of only 17% in site visits and 8% in conversions. A Marketing Science Institute study found measurement errors of 62–115% depending on funnel stage.
Mechanism of Distortion
- View-through attribution gives credit when users merely saw an ad, whether or not it influenced them.
- Retargeting takes credit for high-intent shoppers who were already going to buy. Studies consistently show 50–70% of retargeted users would have converted anyway.
- Multi-platform overlap means the same conversion gets claimed by multiple channels simultaneously.
Apple's iOS 14.5 privacy changes, which introduced App Tracking Transparency (ATT), inadvertently revealed how fragile this measurement always was. Successful attribution dropped to 6.5% of pre-ATT levels. The numbers marketers had trusted for years simply vanished, not because advertising stopped working, but because the measurement was never as solid as it appeared.
What Happens When Brands Actually Turn Off the Ads
The most compelling evidence against ROAS-based decision-making comes from companies that paused their advertising and watched what happened.
"We turned off two-thirds of our ad spend and basically saw no change in our number of rider app installs... a lot of installs we thought had come through paid channels suddenly came through organic."
— Kevin Frisch, Former Head of Performance Marketing, Uber
Uber discovered that $100 million of its $150 million annual digital ad budget was wasted. They also ran incrementality tests on Meta ads and found they weren't bringing in new riders, saving an additional $35 million. Total waste eliminated: $135 million.
Airbnb's Revelation
Airbnb's experience was even more dramatic. When COVID forced the company to cut marketing spend by 58%, reducing performance marketing to near-zero, traffic recovered to 95% of 2019 levels with almost none of that spend. CEO Brian Chesky stated plainly: "The pandemic showed us we could reduce marketing cost and still maintain most of our traffic."
Airbnb had spent over half a billion dollars on performance marketing that produced essentially no incremental outcome. By Q4 2021, after permanently shifting strategy toward brand marketing, the company reported its strongest-ever results.
The eBay Experiment That Shook the Industry
The most influential study on ROAS misattribution remains the eBay experiment, published in Econometrica in 2015. eBay was spending $51 million annually on paid search. Researchers ran a controlled experiment, pausing brand keyword ads on Yahoo! and Microsoft.
The result was devastating: 99.5% of the traffic from paid brand ads reached eBay through organic search anyway. Zero measurable short-term benefit.
This finding has been reinforced by recent data. The largest public incrementality benchmark study (225 tests, 2024–2025) found Google branded search delivered a median incremental ROAS of just 0.70x, the lowest-performing channel tested. Meanwhile, separate analysis found branded search reported platform ROAS of $11.20 while delivering actual incremental ROAS of just $2.20, a 5x overstatement.
What Incrementality Actually Measures
Incrementality answers one question: "Would this conversion have happened without the ad?"
Unlike attribution, which traces paths and assigns credit (correlation), incrementality establishes causation through controlled experimentation:
- Platform ROAS = total attributed revenue ÷ ad spend
- Incremental ROAS = revenue that exists only because of the ad ÷ ad spend
The gold-standard methodology is the randomized controlled trial. A target audience is split: the test group sees ads, the control group doesn't. The difference reveals the true incremental lift. Meta, Google, and TikTok all offer these studies within their platforms.
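To make the distinction concrete, here is a minimal sketch of how a holdout test turns into an incremental ROAS figure. The group sizes, conversion counts, order value, spend, and the assumption that the platform attributes every test-group conversion are all hypothetical.

```python
# Hypothetical lift-test results (all numbers illustrative).
test_users,    test_conversions    = 500_000, 6_000   # saw ads
control_users, control_conversions = 500_000, 5_200   # held out, no ads

avg_order_value = 80.0     # revenue per conversion
ad_spend        = 50_000.0

# Conversion rates in each group.
test_rate    = test_conversions / test_users
control_rate = control_conversions / control_users

# Conversions that exist only because the ads ran.
incremental_conversions = (test_rate - control_rate) * test_users
incremental_revenue     = incremental_conversions * avg_order_value

# Assume the platform claims every test-group conversion (an optimistic platform view).
platform_revenue = test_conversions * avg_order_value

platform_roas    = platform_revenue / ad_spend
incremental_roas = incremental_revenue / ad_spend

print(f"Platform ROAS:    {platform_roas:.2f}x")     # looks great
print(f"Incremental ROAS: {incremental_roas:.2f}x")  # what the ads actually earned
```

With these illustrative numbers the dashboard shows 9.6x while the experiment shows 1.28x, which is exactly the kind of gap the studies above describe.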
The most important finding: there's no consistent relationship between platform-reported ROAS and actual incrementality. ROAS isn't just inflated; it's uncorrelated with the metric that actually matters.
Where the Gap Is Widest
The pattern across multiple studies is consistent:
- Prospecting and upper-funnel channels are systematically undercredited
- Retargeting and branded search are systematically overcredited
Facebook retargeting shows reported ROAS of $5.30 versus actual incremental ROAS of just $1.75, a 3x overstatement. Meanwhile, prospecting delivers 240% more incremental revenue per dollar than retargeting.
What This Means for Your Business
The scale of the measurement problem is staggering. Industry estimates suggest 30%+ of digital ad spend is wasteful or unproductive.
Avinash Kaushik, formerly Google's Head of Strategic Analytics, has been blunt: "ROAS is a navel-gazing advertising-centric metric. It is not a business metric."
The Practical Path Forward
- Run holdout tests on your highest-spend channels, especially branded search and retargeting, where the gap is widest; a simple way to read such a test is sketched after this list.
- Expect surprises: upper-funnel channels are likely undervalued, retargeting is likely overcredited.
- Align your organization: shifting from flattering ROAS dashboards to sobering incrementality data requires executive buy-in.
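As a starting point for the first bullet, here is a minimal difference-in-differences sketch of a geo holdout on branded search. Every figure is hypothetical, and a real test would need matched markets, a long enough window, and significance checks.

```python
# Hypothetical 4-week revenue totals ($k) before and during a brand-search pause.
# Ads stayed on in control markets the whole time.
test_pre,    test_pause    = 1_640.0, 1_551.0   # markets where brand ads were paused
control_pre, control_pause = 1_430.0, 1_424.0   # comparable markets, ads still running

paused_spend = 45.0  # brand-search spend saved in the test markets ($k)

# Difference-in-differences: how much worse did test markets do than control,
# relative to their own baselines?
test_change    = test_pause - test_pre           # -89.0
control_change = control_pause - control_pre     # -6.0
revenue_lost   = control_change - test_change    # 83.0 attributable to the pause

incremental_roas = revenue_lost / paused_spend
print(f"Revenue lost by pausing: ~${revenue_lost:.0f}k")
print(f"Incremental ROAS of brand search: {incremental_roas:.2f}x")
```

In this invented scenario, brand search earns about 1.84x incrementally, far below what an attribution dashboard claiming every branded click would report.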
As marketing budgets compress, the pressure to prove genuine impact, not just attributed credit, will only intensify. The brands that measure what actually works will effectively outspend competitors not by spending more, but by wasting less.