Official Whitepaper

The $100 Billion Measurement Illusion

ROAS vs. Incrementality in Digital Marketing: Why the metric most marketers trust is systematically misleading them.

Simran Kohli, Founder, EncubIQ
Published February 05, 2026
Reading Time 15 Minutes

Vol. 01 Edition

High-Stakes Performance Series

Executive Summary

"30–70% of platform-attributed conversions would have happened without any ads."

The metric most marketers use to evaluate advertising performance, Return on Ad Spend (ROAS), systematically overcounts conversions, misattributes credit, and drives billions in wasted spend. This whitepaper examines the evidence from hundreds of incrementality experiments and landmark case studies to reveal the true scale of this measurement gap.

Key Findings

  • Platform-reported ROAS can overstate true incremental value by 5–10x for certain channels, particularly branded search and retargeting.
  • Major brands including Uber, Airbnb, and P&G discovered hundreds of millions in wasted ad spend when they tested incrementality.
  • There is no consistent relationship between platform-reported ROAS and actual incrementality.

Why This Matters

As marketing budgets compress, falling from 11% of company revenue pre-pandemic to 7.7% in 2024, the pressure to prove genuine impact will only intensify. Brands that continue optimizing for ROAS are systematically misallocating capital. This whitepaper provides the framework and evidence to make the case for incrementality.

1. The ROAS Problem

1.1 How Platform Attribution Creates Illusions

ROAS (Return on Ad Spend) divides total revenue attributed to ads by the cost of those ads. The problem is the word "attributed." When a customer sees a Facebook ad on Monday, searches your brand on Google Tuesday, and buys Wednesday, both Meta and Google claim that sale. Your spreadsheet shows two conversions; your bank account shows one.
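
The double-counting is easy to see in arithmetic. The sketch below uses hypothetical dashboard figures (the platform names are real, the numbers are invented for illustration) to show how summing per-platform attributed revenue overstates what the bank account actually received.

```python
# Toy illustration (hypothetical numbers): each platform attributes the same
# conversions through its own click- or view-through window, so summing
# dashboard figures counts a single sale more than once.
platform_reported = {
    "Meta": 120_000,    # claims the Monday ad view
    "Google": 115_000,  # claims the Tuesday branded-search click
}
actual_revenue = 130_000  # what the bank account shows

dashboard_total = sum(platform_reported.values())
overstatement = dashboard_total / actual_revenue

print(f"Dashboards claim ${dashboard_total:,}; actual revenue ${actual_revenue:,}")
print(f"Overstatement factor: {overstatement:.2f}x")
```

With these invented figures, the dashboards together claim $235,000 against $130,000 of real revenue, an overstatement of roughly 1.8x before any view-through inflation is even considered.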

| Study | Finding | Implication |
| --- | --- | --- |
| Johnson, Lewis & Nubbemeyer (432 GDN experiments) | Median lifts of 17% in site visits, 8% in conversions | Platforms claim 5–10x their actual impact in many cases, often reporting credit for activity the ad did not causally drive |
| Marketing Science Institute (Facebook RCTs) | Measurement errors of 115% upper-funnel, 62% lower-funnel | Even "reliable" lower-funnel data is wrong by 62% |
| Stella 225-test benchmark | No correlation between ROAS and incrementality | ROAS cannot predict true value |

1.2 The Mechanisms of Inflation

View-Through Attribution: Platforms claim credit when users merely saw an ad, even if it had no impact on their decision. A user who would have purchased anyway gets counted as an ad-driven conversion simply because an ad appeared somewhere in their browser.

Retargeting Overcounting: Retargeting systematically takes credit for high-intent shoppers who were already going to buy. Incrementality tests consistently show 50–70% of retargeted users would have converted without seeing a retargeting ad.

Cross-Platform Duplication: The same conversion gets claimed by multiple platforms simultaneously. A single purchase can be attributed to Meta, Google, TikTok, and email marketing — each taking full credit.

1.3 The iOS 14.5 Revelation

"$10 Billion: Meta's estimated revenue loss from iOS 14.5 privacy changes, revealing how fragile ROAS measurement always was."

Apple's iOS 14.5 privacy changes inadvertently exposed this fragility. Successful device-level attribution on iOS dropped to just 6.5% of pre-ATT levels. The numbers marketers had trusted for years simply vanished, not because advertising stopped working, but because the measurement was never as solid as it appeared.

1.4 A Note on Platform Accountability

The industry has seen increasing scrutiny of platform measurement practices. While platforms maintain their measurement methodologies are sound, the consistent gap between platform-reported metrics and independent incrementality tests suggests marketers should not rely on platform attribution as their sole source of truth.

2. What Happens When Brands Test Incrementality

The most compelling evidence against ROAS-based decision-making comes from major brands that paused their advertising and measured what actually happened.

2.1 The eBay Experiment

eBay was spending $51 million annually on paid search. Researchers paused brand keywords on Yahoo! and Microsoft. The result: 99.5% of the traffic from paid brand ads reached eBay through organic search anyway. Brand keywords delivered zero measurable short-term benefit.

2.2 Uber: $135 Million in Discovered Waste

"We turned off two-thirds of our ad spend and basically saw no change in our number of rider app installs... total waste discovered: $135 million."

- Kevin Frisch, Former Head of Performance Marketing, Uber

2.3 Airbnb: The COVID Revelation

When COVID-19 forced Airbnb to cut total marketing spend by 58% in 2020, reducing performance marketing to near zero, traffic recovered to 95% of 2019 levels despite the absence of performance spend.


Airbnb had spent $1.14 billion on marketing in 2019, with over half a billion going to performance marketing that produced essentially no incremental outcome. The company permanently shifted strategy toward brand marketing, and by Q4 2021 reported its strongest-ever results.

2.4 Procter & Gamble and JPMorgan Chase

P&G cut $200 million in digital ad spend in 2017. The result: reach increased 10%, and the company posted a 15% rise in net earnings ($2.3 billion). JPMorgan Chase reduced its programmatic ad placements from 400,000 websites to just 5,000, a 98.75% reduction, with no deterioration in performance metrics.

3. How Incrementality Measurement Works

Incrementality answers one fundamental question: "Would this conversion have happened without the ad?"

3.1 The Core Distinction

| Metric | Formula | What It Measures |
| --- | --- | --- |
| ROAS | Total attributed revenue ÷ Ad spend | Correlation (who saw ads before buying) |
| Incremental ROAS (iROAS) | Revenue that would NOT have occurred without the ad ÷ Ad spend | Causation (what the ad actually caused) |
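
To make the distinction concrete, here is a minimal Python sketch of the two formulas, using hypothetical retargeting-campaign numbers (the 60% "would have bought anyway" share is an assumption for illustration, in line with the 50–70% range cited earlier).

```python
def roas(attributed_revenue: float, spend: float) -> float:
    """Platform-reported ROAS: correlational -- counts every attributed sale."""
    return attributed_revenue / spend

def iroas(incremental_revenue: float, spend: float) -> float:
    """Incremental ROAS: causal -- counts only revenue the ad created."""
    return incremental_revenue / spend

# Hypothetical campaign: the platform attributes $500k of revenue, but a
# holdout test shows 60% of those buyers would have converted anyway.
spend = 100_000
attributed = 500_000
incremental = attributed * (1 - 0.60)  # only $200k truly caused by ads

print(f"ROAS:  {roas(attributed, spend):.1f}x")   # looks great on the dashboard
print(f"iROAS: {iroas(incremental, spend):.1f}x") # the number that matters
```

The same campaign reports a 5.0x ROAS but only a 2.0x iROAS: identical spend, identical dashboard, very different capital-allocation decision.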

3.2 Measurement Methodologies

Randomized Controlled Trials (Lift Studies): The gold standard. A target audience is randomly split: the test group sees ads, the control group does not. The difference in conversion rates reveals the true lift.
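
The lift arithmetic itself is simple. A sketch with hypothetical counts (all figures invented for illustration):

```python
# Minimal lift-study readout: randomly split the audience, expose only the
# test group, then compare conversion rates. All counts are hypothetical.
test_users, test_conversions = 100_000, 2_600
control_users, control_conversions = 100_000, 2_000

test_rate = test_conversions / test_users            # 2.6%
control_rate = control_conversions / control_users   # 2.0%

absolute_lift = test_rate - control_rate             # conversion-rate delta
relative_lift = absolute_lift / control_rate         # lift vs. baseline
incremental_conversions = absolute_lift * test_users # conversions ads caused

print(f"Relative lift: {relative_lift:.0%}")
print(f"Incremental conversions: {incremental_conversions:.0f} of {test_conversions}")
```

Note what this exposes: of the 2,600 conversions a platform would happily attribute to the campaign, only 600 were actually caused by it; the other 2,000 are the baseline the control group reveals.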

Geo-Experiments: Markets are divided into test and control regions (e.g., using Google's CausalImpact or Meta's GeoLift). This works across all channels, including offline media like TV.
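
Tools like CausalImpact and GeoLift fit a synthetic counterfactual for the test regions; the simplest version of the same idea is a difference-in-differences, sketched below with hypothetical regional sales figures.

```python
# Hedged sketch of a geo-experiment readout as a difference-in-differences:
# compare test vs. control regions, before vs. during the campaign.
# All sales figures are hypothetical.
pre  = {"test_regions": 1_000_000, "control_regions": 1_000_000}
post = {"test_regions": 1_150_000, "control_regions": 1_050_000}

test_change = post["test_regions"] - pre["test_regions"]           # +150k
control_change = post["control_regions"] - pre["control_regions"]  # +50k (seasonality, baseline growth)
incremental_sales = test_change - control_change                   # what the ads actually added

print(f"Incremental sales from geo test: ${incremental_sales:,}")
```

Subtracting the control regions' change strips out seasonality and baseline growth, which is what makes this design workable even for channels with no click-level tracking, such as TV.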

Media Mix Modeling (MMM): Uses aggregate time-series data to separate baseline sales from marketing-driven sales, providing a cross-channel strategic perspective.
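
At its core, MMM is a regression that splits observed sales into a baseline and a spend-driven component. The deliberately tiny sketch below fits a one-channel linear model to six weeks of invented data; real MMMs add adstock, saturation curves, seasonality, and many channels, so treat this only as the shape of the idea.

```python
# Toy media-mix decomposition (hypothetical weekly data, one channel):
# regress sales on ad spend to separate baseline from marketing-driven sales.
spend = [10, 20, 30, 40, 50, 60]        # weekly ad spend ($k)
sales = [120, 138, 165, 178, 205, 221]  # weekly sales ($k)

n = len(spend)
mean_x = sum(spend) / n
mean_y = sum(sales) / n

# Ordinary least squares in closed form for a single regressor.
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(spend, sales)) \
        / sum((x - mean_x) ** 2 for x in spend)
baseline = mean_y - slope * mean_x  # sales the model expects at zero spend

print(f"Estimated baseline sales: ${baseline:.0f}k/week")
print(f"Estimated marginal sales per $1k of spend: ${slope:.2f}k")
```

The baseline term is the strategic payoff: it is the model's estimate of what sells with no advertising at all, which is exactly the quantity platform attribution silently claims for itself.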

3.3 The Measurement Triad

The emerging industry consensus is a "triad" approach: Multi-touch attribution for tactical signals, Media Mix Modeling for strategic planning, and incrementality testing for causal ground truth that calibrates both.

A Note on Lead Quality: Incrementality isn't just about quantity; it's about quality. Conversions that were "bound to happen" often represent your highest-value existing customers. Advertising to them doesn't create growth; it merely subsidizes transactions that were already occurring.


Unlock the Full Framework

Get the detailed Channel-by-Channel iROAS benchmarks, the Industry Waste tables, and our step-by-step Implementation Guide.

Free for marketing leaders.

7. Conclusion

ROAS-optimized portfolios systematically overfund low-incrementality channels while starving the prospecting and awareness channels that actually drive new revenue. The brands that measure what actually works will outspend competitors not by spending more, but by wasting less.


8. Selected References

1. Blake, T., Nosko, C., & Tadelis, S. (2015). Consumer Heterogeneity and Paid Search Effectiveness. Econometrica.

2. Johnson, G. A., Lewis, R. A., & Nubbemeyer, E. I. (2017). Ghost Ads: Improving the Economics of Measuring Online Ad Effectiveness. Journal of Marketing Research.

3. Association of National Advertisers (ANA). (2023). Programmatic Media Supply Chain Transparency Study.

4. Gartner (2024). CMO Spend Survey.

5. Stella. (2025). Incrementality Benchmarks Report.

6. Haus. (2025). Meta Incrementality Analysis.

Build a Framework for Actual Growth

Stop optimizing for platform metrics. We help businesses implement the measurement triad: Attribution, MMM, and Incrementality testing.