
Incrementality Testing & Marketing: A Skai Perspective

Skai™

February 4, 2019

Incrementality Testing: A Solution for Marketing Measurement

Publishers are always launching new campaign types, optimizations, ad formats, and targeting capabilities. Advertisers have an unprecedented opportunity to convey messages to their audiences. Yet, they face growing pressure to quickly identify the methods that drive business goals. Could incrementality testing be the next big solution to marketing measurement?

Differing methods of attribution and measurement add complexity. Varying indicators of value and revenue can tell conflicting stories based on sourcing. How can brands make informed, data-driven decisions to forecast the impact of different advertising investments on their business?

While brands seek more precise methods of proving efficacy, incrementality testing has become one of the hottest topics in online advertising. Skai has insights gleaned from over five years' worth of incrementality measurement across platforms.

How Best To Measure Efforts?

A prime example of the issue facing marketers is understanding the effect of retargeting strategies. Another is the industry-wide buzz around video ads, which makes marketers vigilant about the revenue driven by their investment in video creation.

Similarly, the rise of mobile advertising has raised suspicions that many attribution tools, in fact, misrepresent these investments. Advertisers are reasonably tasked with determining the incrementality of new and emerging channels against tried-and-true methods. However, brands remain uncertain of the true value of different targeting strategies, channels, and ad formats.

To date, the ad industry has been unable to provide clarity. The problem is exacerbated by the lack of a unifying approach to measurement. Among these challenges are walled-garden data sources, upper-funnel metrics, cross-device approximation, and online vs. offline activity. Marketers often report having trouble measuring the business impacts of specific tactics. As with any measurement modeling, data quality can make or break results.

Find the Business Impact

Incrementality testing in marketing compares results between a test group and a control group. Using this method, marketers can isolate the affected variables, clearly assess immediate business impact, and formulate data-driven actions to take.

With incrementality testing, advertisers can better understand whether KPIs are a direct result of their own campaigns or of extraneous effects. Advanced AI technology helps these tests achieve statistical significance even for low-volume, small-budget campaigns.

Also, a cross-channel testing methodology provides measurement and insights into the indirect effects of the tactic. This is what is generally referred to as the ‘halo effect’. Incrementality testing can easily track business implications using consistent, comparable and coordinated data sources.
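To make the test-versus-control comparison concrete, here is a minimal sketch in Python of how incremental lift might be computed from the two groups' results. The function name, figures, and scaling factor are illustrative assumptions for this post, not Skai's methodology.

```python
# Minimal sketch of computing incremental lift from a test/control comparison.
# All names and figures below are illustrative assumptions, not Skai's methodology.

def incremental_lift(test_revenue: float, control_revenue: float, scale_factor: float = 1.0) -> float:
    """Lift = (test - scaled control) / scaled control.

    scale_factor adjusts for groups of unequal size (e.g. 2.0 when the
    test group is twice as large as the control group).
    """
    baseline = control_revenue * scale_factor
    return (test_revenue - baseline) / baseline

# Example: the test group drove $120k in revenue, the control group $50k,
# but the test group covers twice as much of the addressable market.
lift = incremental_lift(test_revenue=120_000, control_revenue=50_000, scale_factor=2.0)
print(f"Incremental lift: {lift:.1%}")  # -> Incremental lift: 20.0%
```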

Incrementality testing is something that Skai's data science team has developed over time. Our methodology leverages best practices for testing, adapted to a variety of specific lift-measurement use cases.

Our team consists of data scientists knowledgeable across verticals, with experience scaling incrementality testing across platforms and strategies. If your team faces challenges related to testing, attribution, or measurement, we’re eager to help and would love to connect!

Incrementality Testing: 5 Gaps of Attribution Measurement

Attribution uses models to measure advertising investments. A huge advantage of attribution modeling is data continuity, meaning that a brand doesn’t need to alter marketing plans mid-stream in order to achieve actionable insights. In fact, a well-formulated model can account for regular fluctuations and realignments. Data can be narrowed down to individual keywords and ads, and even cross-channel consumer journeys.

While attribution has proven a largely scalable solution across verticals, it also poses a new set of reporting challenges.

Incrementality Testing vs Attribution #1 – Attribution is only as good as the data

Advertisers must be able to account for the full consumer journey in order to track ads and their impact. But consumer journeys can be difficult to follow, especially those which occur across multiple channels and devices!

An ideal attribution model identifies individual users across devices and channels, but in cases when this isn’t easily accomplished, data quality is compromised. When the measurement is questionable, modeling is inaccurate at best.

Many advertisers have reason to believe that mobile advertising is, in fact, more effective than their reporting indicates, but it is undervalued due to insufficient device tracking. Linear measurement often overlooks the large volume of conversions that begin on mobile but finish on desktop devices. This imprecision is particularly evident with mobile-heavy publishers such as Facebook, Snap, Pinterest, and Twitter. Similarly, a lack of sufficient data to measure video impact means that video promotions are regularly undervalued.

In those scenarios, incrementality testing proves particularly valuable by focusing on individual investments. It can directly measure the impact upon overall business results and eliminate the need to identify and measure each step in the consumer journey.

Incrementality testing accomplishes cause-and-effect measurement and takes out the guesswork.

Incrementality Testing vs Attribution #2 – Attribution models are subjective and non-transparent

Any path-to-conversion with multiple steps can create uncertainty regarding how and how much each step contributed to the final purchase. Competing attribution models paint conflicting stories about how each step ties to the end action.

Today, many companies even recruit AI teams to develop in-house attribution models based on machine learning. While this can result in greater efficiency and robust data sets, it leads to even more subjective models.

As advertisers seek the newest and most advanced attribution modeling available, older models quickly become outdated and redundant. In recent years, the breadth of attribution models available has given data teams additional challenges, and marketers struggle to coordinate disparate reporting across industries, companies, teams, and time periods.

In instances where attribution’s subjectivity is a cause for concern, incrementality testing lends additional assurance. By testing a specific investment in the customer journey, incrementality testing can measure its direct impact. Incrementality testing can also measure halo effects on the ecosystem of investments without making assumptions.

Incrementality Testing vs Attribution #3 – Attribution is not an island

A key benefit of attribution is that it accounts for ongoing business fluctuations, such as big budget changes, holidays, and special events. But this holistic analysis, assessing an array of inputs, leaves room for ambiguity regarding the actual value of individual investments, and it tends to mistake correlation for causation. For companies affected by seasonality, advertisers must determine whether growth results from their campaigns or from external factors.

This gap frequently manifests in upper-funnel actions and initial user interactions. For instance, attribution often credits a high volume of impressions to display ads. In 2017, The New York Times reported that Chase saw almost indistinguishable results between two very different promotions: one served display ads on 400,000 websites, the other on only 5,000 sites. The fact that a campaign across 400,000 sites could drive the same results as one across 5,000 suggests a disconnect in the actual effect of those display ads.

[Image: Chase incrementality example]

Incrementality testing in marketing can help combat ambiguity by isolating individual investments and standardizing extraneous parameters. This measurement is agnostic of influences such as seasonality, geography, and cross-marketing.

Incrementality Testing vs Attribution #4 – Attribution falls short with traditional media

Successful attribution requires comprehensive data for engagement and conversion actions. Funnels with offline components—especially traditional formats such as TV, radio, and billboards—are complicated by the inability to measure impressions and engagement. In these cases, extractable statistical insights cannot be conclusively connected to specific consumers. Isolating offline promotions’ true impact is virtually impossible.

This gap often means that conversions are inadequately attributed to offline ads, understating the impact of offline channels. In this case, incrementality testing enables advertisers to measure the difference between a test group that has been exposed to the offline ad and a control group that has not.

Incrementality Testing vs Attribution #5 – Attribution falls short with future investments

Attribution modeling attempts to identify causality between conversion actions and preexisting investments. However, with the addition of new publishers and advertising methods to the consumer funnel, marketers need to consider the value of new investments. Attribution models are historical in nature and therefore lack sufficient data to perform these calculations. Advertisers are left to make investments and measure the impact afterward rather than in advance.

In 2017, Skai launched full support for Pinterest campaigns. While some advertisers adopted immediately, others questioned the value of investing in a new channel. But by using small test budgets, clients were able to quantify the effect of introducing a new platform into their consumer journeys.

Using an incrementality test, Skai clients Belk and iCrossing determined that Pinterest advertising increased their online ROAS 2.9x and in-store ROAS 31.4x!

[Image: Skai incrementality testing]

Incrementality testing can be used to evaluate future investments. It can use small budgets to assess the value of the new investment with statistical confidence.

Conscious of these gaps, advertisers tasked with making decisions based on attribution face even more questions. How can attribution be directly translated to user actions? Can models be adjusted during the course of a promotion? Can advertisers determine which scenarios justify switching models?

We have seen a shift over the last three years in the way marketers evaluate attribution modeling and results. A decade ago, marketers considered full funnels measurable and their data accurate and actionable. Today, however, they feel comparatively limited, with mobile in particular disrupting measurement and necessitating new solutions. Proactive advertisers now often combine attribution with incrementality testing, using it to validate attribution measurement and adjust modeling to better capture true ad performance.

Incrementality Testing: Selecting the Right Split

The three stages of an incrementality test are:

  1. Preparation – Split some part of your addressable market into A and B groups
  2. Intervention – Expose one of the groups to a new variable, allowing enough time for any difference to become apparent
  3. Measurement – Examine the performance of groups A and B pre- and post-intervention to understand the impact (a simplified sketch of these stages follows below)
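As a rough illustration of these three stages, the sketch below splits a list of markets into groups A and B and compares pre- and post-intervention revenue for each. The column names and date format (market, date, revenue, ISO-format date strings) are hypothetical, and the comparison is a simplified calculation, not the statistical procedure Skai's data science team uses.

```python
import random
import pandas as pd

def run_incrementality_test(daily: pd.DataFrame, intervention_date: str, seed: int = 42):
    """daily: one row per market per day with hypothetical columns market, date, revenue."""
    # 1. Preparation: randomly split the markets (e.g. cities or DMAs) into groups A and B.
    markets = sorted(daily["market"].unique())
    random.Random(seed).shuffle(markets)
    group_a = set(markets[: len(markets) // 2])  # group A will receive the intervention
    daily = daily.assign(group=daily["market"].map(lambda m: "A" if m in group_a else "B"))

    # 2. Intervention: group A is exposed to the new variable from intervention_date onward.
    #    (The media change itself happens in the ad platforms, not in this code.)
    daily = daily.assign(period=daily["date"].map(
        lambda d: "post" if d >= intervention_date else "pre"))

    # 3. Measurement: compare pre- vs. post-intervention revenue for both groups.
    summary = daily.groupby(["group", "period"])["revenue"].sum().unstack("period")
    change = summary["post"] / summary["pre"]  # post/pre revenue ratio per group
    lift = change["A"] / change["B"] - 1       # relative lift of the test group vs. control
    return summary, lift
```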

Incrementality Test vs. A/B Testing: The Crucial Difference

[Image: Incrementality testing vs. A/B testing]

So far, this might all sound similar to traditional A/B testing, which tests things like subject lines, images, or landing pages to see which variant performs best.

But the truth is, an incrementality test is very different.

In incrementality testing, we measure the impact of the test on business-level metrics such as revenue, new customers, or site visitors. Traditional A/B tests, by contrast, are often about media optimization, looking for impact on more specific campaign performance metrics such as CTR or attributed conversion rate. Because we cannot rely on attribution to understand business-level impact, the performance data used to measure an incrementality test must not depend on attribution.

Measuring impact on business metrics also means we have to be very thoughtful as to how we set up our split test. In an incrementality test, the split into groups A and B should be done in such a way that the intervention performed on one group will have little to no impact on the other group. If not, the results of the test may totally miss or greatly exaggerate the impact of the intervention. In other words, you need a clean split with minimal crossover.

Let’s examine three of the most common split types used in traditional A/B testing. Which, if any, are appropriate for an incrementality test?

  • Auction split
  • Audience split
  • Geo split

We'll evaluate each type of split against the criteria mentioned above, as well as more general criteria that are important for all kinds of A/B test splits:

  • Ability to measure impact without relying on attribution
  • Ability to intervene in one group without impacting the other
  • Good correlation between performance metrics in both groups
  • The randomness of the split
[Image: Measurement splits for incrementality testing]

Auction Split

How it works: An auction-based split randomly assigns a user to group A or B in real-time, i.e. when they are about to be exposed to an ad.

Pros: Theoretically, this allows for a totally random split. This is ideal from a statistical point of view and should lead to a good correlation between the groups.

Cons: An auction-based split has one potential flaw: the random assignment occurs at every auction, so the same user can be exposed to advertising from both groups A and B.

Right for incrementality testing?: NO! This flaw rules out such an approach for any kind of incrementality testing. The probability is high that intervention in one group will have an impact on the other. Furthermore, since there’s no clean separation of users between groups A and B, there’s really no value in looking at performance data without attribution. There’s no way to associate unattributed conversions or revenue with either group A or B.
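To see the problem in action, here is a toy simulation (the user counts and auction volumes are made up) of per-auction assignment; with enough ad requests, nearly every active user ends up exposed to both groups.

```python
import random

# Toy simulation of an auction-based split: each ad request is assigned to
# group A or B independently, so the same user can land in both groups.
random.seed(0)
users = [f"user_{i}" for i in range(1_000)]
exposures = {u: set() for u in users}

for _ in range(10_000):                       # 10,000 ad auctions
    user = random.choice(users)               # a user triggers an auction
    exposures[user].add(random.choice("AB"))  # fresh random assignment every auction

contaminated = sum(1 for groups in exposures.values() if groups == {"A", "B"})
print(f"{contaminated / len(users):.0%} of users were exposed to both groups")
```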

Audience Split

How it works: An audience split assigns users to groups A and B randomly but reproducibly such that the same user will always be assigned to the same group. This is generally done using hashed cookies or other forms of a user identifier.
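A common way to make such an assignment reproducible is to hash a salted, stable user identifier, as in the generic sketch below. The salt and identifier format are hypothetical, not a description of any particular publisher's implementation.

```python
import hashlib

def assign_group(user_id: str, salt: str = "incrementality-test-01") -> str:
    """Deterministically map a user identifier to group A or B.

    Hashing the salted identifier gives a pseudo-random but reproducible
    split: the same identifier always lands in the same group.
    """
    digest = hashlib.sha256(f"{salt}:{user_id}".encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

assert assign_group("cookie-123") == assign_group("cookie-123")  # stable across calls
```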

Pros: Like an auction-based split, this also creates a very random split of two well-correlated groups.

Cons: There are many limitations when it comes to incrementality testing. First, the split is only as good as your testing technology's ability to identify unique users, which is trickier in today's multi-screen, app-filled world. Cookie-based audience splits are likely to assign multiple devices or browsers from the same user to different groups, and true audience-based splits are only possible for publishers with a high percentage of cross-device logins. In order to measure impact without relying on attribution, you need to be able to assign transactions to either group A or B based on the user identifier alone, without requiring a preceding click or impression.

Facebook is able to make this assignment for transactions recorded by its pixel, but it is not transparent: it doesn't expose the user-level audience assignments that would allow third-party technologies to evaluate performance. A further weakness of audience-based splits is that they cannot be used to measure offline impact, such as in-store or call-center transactions, or offline ads such as TV and radio, because it's very difficult to reliably connect online user identifiers to offline transactions.

Right for incrementality testing?: NO! Given the significant number of drawbacks to this type of split, this is also not the best approach.

[Image: Geography split for incrementality testing]

Geo Split

How it works: A geo-based split assigns users to groups using the ability to geo-target in both traditional and digital marketing campaigns. Geo splits generally work at the city or DMA level—cities or DMAs are randomly assigned to groups A and B.

Pros: Geo splits significantly simplify measurement since one can easily look at both online and offline transactions by geo without having to perform attribution. They have the best potential to reduce the chance of intervention on one group influencing the other. A further advantage is that a geo-based split is highly transparent. You can easily evaluate the results of the test against multiple data sources, even those that weren't considered in the planning of the test. It's the only approach that allows you to measure halo effects such as the impact of investment in one channel on the revenue attributed to another.

Cons: Geo splits are less random than audience- or auction-based splits, but using a split methodology that creates balanced and well-correlated groups overcomes this problem.

Right for incrementality testing?: Yes…if done correctly! At Skai, we've been using the geo-based approach for A/B testing and incrementality testing for more than four years. We've applied machine learning approaches to build our own algorithm that creates geo splits with balanced and well-correlated groups. The geo-based approach can run successful tests that yield meaningful results and stand up to analytical scrutiny.
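As an illustration of what "balanced and well-correlated" can look like in practice, here is a brute-force sketch that tries many random 50/50 splits of a DMA list and keeps the one whose two halves have the most correlated historical revenue. It is a deliberately simple stand-in for explanatory purposes with an assumed data layout, not Skai's algorithm.

```python
import random
import numpy as np
import pandas as pd

def best_geo_split(history: pd.DataFrame, n_trials: int = 1_000, seed: int = 7):
    """history: rows = dates, columns = DMAs, values = daily revenue (assumed layout).

    Tries random 50/50 splits of the DMAs and returns the split whose two
    groups have the most correlated aggregate revenue over the pre-period.
    """
    rng = random.Random(seed)
    dmas = list(history.columns)
    best_split, best_corr = None, -1.0
    for _ in range(n_trials):
        rng.shuffle(dmas)
        group_a, group_b = dmas[: len(dmas) // 2], dmas[len(dmas) // 2:]
        corr = np.corrcoef(history[group_a].sum(axis=1), history[group_b].sum(axis=1))[0, 1]
        if corr > best_corr:
            best_split, best_corr = (list(group_a), list(group_b)), corr
    return best_split, best_corr
```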

While this is certainly not an exhaustive list of splits, you won’t find another type that lends itself as well to an incrementality test as a geo split does.

What to Do Next?

Interested in learning more about how Skai can help you better test, execute, and orchestrate your digital marketing efforts? Contact us today to set up a discussion.

Book a Meeting with Skai
