The primary goal for anyone using marketing attribution today is to understand core advertising effects. Are you reaching the right people? Are your ads effective? Companies working toward this goal face some key measurement questions: namely, what data do you have available, and how can that data be transformed into actionable insights? If history has taught us anything, it's that this is a pursuit of constant improvement rather than perfection.
Technology changes and business evolves, creating opportunities for new measurement methods that address modern needs. As we see these changes, many marketers are left wondering where things are headed.
First, let’s remember how we got here. Economic growth in the U.S. after World War II came with a boom in advertising and subsequent demand for measurement from executives. In 1948, Neil Borden coined the term “marketing mix” as a way to help marketers understand the mixture of “ingredients” available to them. Marketing mix models gained commercial adoption beginning in the 1980s with the availability of scanner data in consumer goods companies — marketers still use these models today because they provide useful measures of macro drivers of sales.
In the 2000s, the internet transitioned from a dial-up novelty full of speculative websites to an economic engine powered by ads and digital commerce. Advertisers had a brand-new channel to reach consumers in new ways. For the first time, companies could target individuals with specific ads based on browsing history.
By 2010, Google’s dominance in the search market paved the way for more robust tracking. At the same time, the emergence of smartphones and social media completely transformed the way people interact with devices. More behaviors were being tracked and stored as data than ever before.
In 2006, multi-touch attribution (MTA) emerged to take advantage of the massive amounts of newly available user-level data. This marked a turning point in marketing measurement and finally allowed marketers to track the impact of media on individuals. Before this, measurement was almost exclusively done in aggregate, leaving decision makers to think and plan at the market or national level.
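To make the idea concrete, here is a minimal sketch of one common MTA heuristic, U-shaped (position-based) attribution, which splits conversion credit across the ordered touchpoints a converting user saw. The function name and the 40/20/40 weights are illustrative conventions, not any vendor's specific model.

```python
from collections import defaultdict

def position_based_credit(path):
    """U-shaped multi-touch attribution: 40% of conversion credit to the
    first touch, 40% to the last, and 20% spread evenly over the middle.
    `path` is the ordered list of channels a converting user was exposed to."""
    credit = defaultdict(float)
    n = len(path)
    if n == 1:
        credit[path[0]] = 1.0
    elif n == 2:
        credit[path[0]] += 0.5
        credit[path[1]] += 0.5
    else:
        credit[path[0]] += 0.4
        credit[path[-1]] += 0.4
        for channel in path[1:-1]:
            credit[channel] += 0.2 / (n - 2)
    return dict(credit)

# One user's ordered touchpoints, e.g. reconstructed from an event log.
print(position_based_credit(["display", "search", "email"]))
# display and email each receive 0.4 of the credit; search receives 0.2
```

Contrast this with the aggregate-only world described above: a marketing mix model could tell you display drove sales overall, but only user-level attribution can apportion credit within a single customer's journey.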
By 2015, advances in data aggregation and machine learning opened the door to unifying online and offline advertising and conversions. This gave marketers a cross-channel view of their spending to improve media synergy and overall efficiency. In 2018, the unified approach was further enhanced when attribution models incorporated propensity. As a result, marketers finally could disentangle targeting and efficacy to understand the incremental effects of media without bias.
Marketing measurement methods have changed significantly over the past 40 years, but the principles and objectives remain the same. Marketers want to reach the right customers at the right time with the right media to eventually make a sale. While this might feel like an oversimplification, it ultimately drives progress. In the 1980s and 1990s, marketing mix models (MMMs) shaped the industry. In the 2000s, digital advertising and MTA challenged conventional wisdom and forced measurement to evolve.
This raises the question: Why is advertising so difficult today if we have such advanced measurement tools? Before online tracking, measurement was difficult because consumer data was scarce and expensive to collect. Data has since become ubiquitous, and the challenge now is knowing what to do with it. From display ads and search to Facebook, Instagram, YouTube, and over-the-top media services, the volume and complexity of channels are expanding rapidly. Without advanced machine learning and AI, marketers will never be able to keep pace with technology.
As MTA gains widespread adoption and more data becomes available for tracking consumer behavior, marketers must keep an eye on the future and be ready to adopt new methods. Even today, advertisers struggle to measure ad effectiveness because of complexity and biases that are difficult to parse out of large data sets. As we look toward the future of attribution, we expect to see these issues addressed in a variety of possible ways.
At Conversion Logic, we’re focused on building a next-generation platform that can quickly integrate new data, channels, and methods. Currently, we’re identifying natural biases in the data to account for propensity and measure the incremental effects of advertising. That is, we account for activity bias (active online users are more likely to see ads, visit sites, etc.), targeting bias (active users are more likely to be targeted), and self-selection bias (users who decide to search/click themselves). We believe these methods address key obstacles with current MTA approaches and lay the foundation for the next wave of available methods.
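One standard way to correct for biases like these is inverse-propensity weighting (IPW). The sketch below is a generic illustration of that technique under simplified assumptions (a single binary exposure and a pre-fitted propensity score), not Conversion Logic's actual model; the function and field names are hypothetical.

```python
def ipw_lift(users):
    """Inverse-propensity-weighted estimate of an ad's incremental lift.

    Each user is a tuple (exposed, converted, propensity), where
    `propensity` is the modeled probability that this user would be
    exposed, given covariates such as online activity and targeting
    criteria. Weighting exposed users by 1/p and unexposed users by
    1/(1-p) counteracts activity and targeting bias: active, targeted
    users both see more ads and convert more often anyway."""
    exp_num = exp_den = ctl_num = ctl_den = 0.0
    for exposed, converted, p in users:
        if exposed:
            exp_num += converted / p
            exp_den += 1 / p
        else:
            ctl_num += converted / (1 - p)
            ctl_den += 1 / (1 - p)
    # Weighted conversion rate of exposed minus unexposed = incremental lift.
    return exp_num / exp_den - ctl_num / ctl_den

# Toy data: (exposed, converted, propensity). With uniform propensity,
# IPW reduces to a simple difference in conversion rates.
users = [(True, 1, 0.5), (True, 0, 0.5), (False, 0, 0.5), (False, 0, 0.5)]
print(ipw_lift(users))  # 0.5
```

The estimate is only as good as the propensity model behind it, which is why identifying the activity, targeting, and self-selection biases in the raw data comes first.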
We have also partnered with Inscape and LiveRamp to integrate TV tracking and an identity solution that bridges offline and online data. Using automated content recognition, Inscape can track when an ad is actually viewed across more than 10 million opted-in TVs rather than monitoring spot-level data. LiveRamp matches PII data with anonymous identities such as cookies and device IDs to provide access to information from over 170 million consumers, covering 95% of adults with greater than 95% accuracy when a match occurs. This level of resolution in consumer data is the future of marketing measurement. More complete information about individual ad exposure and purchase decisions, paired with the power of machine learning, will significantly enhance measurement accuracy and the quality of insights.
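The basic shape of this kind of identity match can be sketched as a join on a shared hashed-PII key. This is a simplified illustration of the general pattern, not LiveRamp's actual API or matching logic; all names and records here are made up, and real systems hash or tokenize PII before it ever leaves the source.

```python
import hashlib

def hash_email(email):
    """Normalize and hash an email so raw PII is never exchanged."""
    return hashlib.sha256(email.strip().lower().encode()).hexdigest()

# Offline CRM records, keyed by hashed email (illustrative data).
crm = {hash_email("jane@example.com"): {"customer_id": "c-102", "ltv": 480}}

# Online identity graph: hashed email -> anonymous cookie/device IDs.
graph = {hash_email("jane@example.com"): ["cookie-9f2", "device-a71"]}

def link_identities(crm, graph):
    """Join offline records to online identifiers via the shared key,
    so ad exposures on a device can be tied to offline purchases."""
    linked = {}
    for hashed_email, record in crm.items():
        for anon_id in graph.get(hashed_email, []):
            linked[anon_id] = record
    return linked

print(link_identities(crm, graph))
```

Once exposures and conversions resolve to the same identity, the cross-channel measurement described above becomes possible at the individual level.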
Ultimately, we believe attribution is going to inch closer and closer to truly measuring the incremental effect of advertising on consumers. While it’s impossible to run completely controlled experiments on every campaign, channel, and time period, we can use big data and machine learning to produce more accurate measures of propensity and incrementality than ever before. In the end, marketers want to know what works and what doesn’t so they can drive the ROI of media consistently over time. Through relentless innovation today, measurement experts are aggregating data and developing models that will drive accuracy and insights tomorrow.