Last year was a turning point for digital attribution. The biggest question concerns the future of the cookie as a reliable, future-proof cornerstone of tracking, and what the knock-on effect will be for vendors that have built solutions around it. We need to rethink our approach based on the data we have and how this translates into long-term solutions for our clients. Read on to learn how to get the best results from imperfect data.
How The Attribution Landscape Changed
Last year, the market was at a point where we were making sense of volumes of disparate data and starting to use it to activate, report and optimise our media buys in real time. We had raw data that we could export to create our own custom attribution solutions; in the best case we were able to automate this and feed the results into our bidding strategies. This is what we were offering our clients at Merkle.
However, a series of major events throughout the year made us revise our long-term strategy. GDPR was of course the catalyst, and Google were the first to act. The removal of their DoubleClick Data Transfer files, which were the most granular data we had on our digital activity, has hampered the creation of bespoke digital attribution models for clients.
There were further developments in the market that made us rethink our approach: Apple's release of Intelligent Tracking Prevention 2.0 (now 2.2), with Mozilla announcing that Firefox will follow suit. Display has suffered the most; the most recent ITP update means that display ads are no longer trackable on Safari. This doesn't just impact the measurement of this activity but also frequency capping and A/B testing.
The Current State Of The Digital Market: Platform Limitations and Data Gaps
So how reliable are impressions going to be moving forward? As it stands, we are losing more than 20% of impressions through ITP, which makes it hard to build accurate models to inform our media buys. The market has shifted, and I wouldn't be surprised to see other browsers follow suit. There are still clear individual winners from the loss of the cookie: the "walled gardens" such as Google, Amazon and Facebook, which rely on logged-in users rather than cookies. But that doesn't answer the question of how these channels fit together.
This has, of course, impacted all third-party vendors offering a granular option. As it stands, no tool on the market covers all bases (deterministic cross-device measurement, impressions plus viewability, direct bidding connectors, forecasting). But that doesn't mean we give up and go back to reporting on last click; we have to be smarter in our approach.
A Three-Pronged Approach to Attribution
With these two key changes in mind, we have pivoted our approach to digital attribution to one that we feel should hold up through these inevitable and continued changes in the market. We have decided to take a three-pronged approach, which allows us to answer the key questions from our clients: Am I investing in the right channels? Which channel has the most impact? If I dialled up spend, what would the impact be? Am I valuing the right channels? Am I optimising to the right data?
It starts with a top-level view of your channel mix utilising cost and conversion data. We use a media mix modelling approach with additional factors, such as lag time, to give an overall view of your channel mix and forecast accordingly. This is ideal for monthly budget adjustments and campaign planning. For in-flight optimisation and bid adjustments, we complement this by using the attribution solutions available for automated bidding within media buying platforms. We can also create custom solutions for clients that want more control, but these will be limited to click data.
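To make the modelling step concrete, here is a minimal sketch of what a media mix model with a simple lag ("adstock") transformation might look like. The channel names, spend figures, decay parameter and synthetic conversions are all illustrative assumptions, not a client setup or our production code.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Illustrative weekly data: spend per channel. In practice this would
# come from your cost reports.
rng = np.random.default_rng(42)
weeks = 52
spend = {
    "search": rng.uniform(10_000, 20_000, weeks),
    "social": rng.uniform(5_000, 15_000, weeks),
    "display": rng.uniform(2_000, 8_000, weeks),
}

def adstock(x, decay=0.5):
    """Carry over a fraction of last week's effect to model lag time."""
    out = np.zeros_like(x)
    out[0] = x[0]
    for t in range(1, len(x)):
        out[t] = x[t] + decay * out[t - 1]
    return out

# Apply the lag transformation, then fit a simple linear response model
# against (synthetic) weekly conversions.
X = np.column_stack([adstock(s) for s in spend.values()])
y = 500 + X @ np.array([0.03, 0.02, 0.01]) + rng.normal(0, 50, weeks)

model = LinearRegression().fit(X, y)
for channel, coef in zip(spend, model.coef_):
    print(f"{channel}: ~{coef:.4f} conversions per unit of adstocked spend")
```

The fitted coefficients give a rough marginal response per channel, which is the kind of output that feeds monthly budget scenarios; a production model would also account for seasonality, saturation curves and external factors.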
Underpinning all of this are ongoing tests, which enable continuous evaluation of the channels most impacted by the loss of impressions (e.g. Social and Display) and ensure this is accounted for when actioning insights uncovered by the detailed view. We do this by leveraging regression modelling to gain additional insights where data cannot be stitched together, and by A/B testing the impact of actioning those insights to verify the hypotheses.
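As a sketch of the verification step, a two-proportion z-test is one simple way to check whether an A/B (for example, geo-split) test shows a real uplift. The conversion and visitor counts below are hypothetical.

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical geo-split results: [test group (insights actioned), control].
conversions = [540, 480]
visitors = [24_000, 23_500]

stat, p_value = proportions_ztest(conversions, visitors)
print(f"z = {stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("Uplift is statistically significant at the 5% level.")
else:
    print("No significant difference; keep testing before acting.")
```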
A test-and-learn method is the most durable approach we can provide given the state of the market, and it will be the least impacted by the ongoing loss of cookie and impression data. This is just the start of the attribution journey. Through testing and further analysis, we can move you away from optimising for conversions alone and towards optimising for customer lifetime value. But this isn't an easy feat: there are usually many blockers that can obstruct the journey, and it takes the right people invested internally, the right technology, and the right expertise to make sense of it.
How To Get The Best From Your Data
First, a tech stack mapping is fundamental to ensure we are aware of all the tools at your disposal. We check platform links, taxonomy, cost imports and naming conventions to make sure that what we report on is correct.
Second, start small. If you are new to attribution and don't know where to begin, we recommend taking small steps, starting with one channel at a time. Starting with your Paid Search, for instance, is a good way to drive quick wins in a limited period of time. Refer here to see how to do it.
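To illustrate the kind of quick win a single-channel review can surface, here is a toy comparison of last-click versus position-based credit across a handful of Paid Search click paths. The campaign names, paths and 40/20/40 weighting are illustrative assumptions.

```python
from collections import defaultdict

# Toy click paths ending in a conversion (oldest click first).
paths = [
    ["generic_search", "brand_search"],
    ["shopping", "generic_search", "brand_search"],
    ["brand_search"],
    ["generic_search", "shopping", "brand_search"],
]

last_click = defaultdict(float)
position_based = defaultdict(float)  # U-shaped 40/20/40 weighting

for path in paths:
    last_click[path[-1]] += 1.0
    if len(path) == 1:
        position_based[path[0]] += 1.0
    elif len(path) == 2:
        # With only two touches, split the credit evenly.
        position_based[path[0]] += 0.5
        position_based[path[1]] += 0.5
    else:
        position_based[path[0]] += 0.4
        position_based[path[-1]] += 0.4
        for mid in path[1:-1]:
            position_based[mid] += 0.2 / (len(path) - 2)

for campaign in sorted(set(last_click) | set(position_based)):
    print(f"{campaign:15s} last-click: {last_click[campaign]:.1f}   "
          f"position-based: {position_based[campaign]:.2f}")
```

Even on this toy data, generic and shopping campaigns that get zero credit under last click pick up meaningful value under a position-based view, which is exactly the kind of insight that changes bids quickly.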
Then, expand your horizons: cross-channel management and optimisation is the next step.
Further to that, you can use different platforms together, for example combining Facebook Attribution insights with Google Analytics data. Or you can opt for a third-party vendor if you think the built-in solutions are not ideal for you.
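One hedged sketch of what "combining platforms" can mean in practice: joining conversion counts from each platform's export by campaign and looking at where they disagree. The campaign names and figures below are made up for illustration.

```python
import pandas as pd

# Hypothetical exports; in practice these come from each platform's reports.
fb = pd.DataFrame({
    "campaign": ["summer_sale", "brand", "retargeting"],
    "fb_conversions": [120, 45, 300],
})
ga = pd.DataFrame({
    "campaign": ["summer_sale", "brand", "generic"],
    "ga_conversions": [95, 60, 40],
})

# An outer join keeps campaigns that only one platform reports on.
combined = fb.merge(ga, on="campaign", how="outer").fillna(0)
combined["delta"] = combined["fb_conversions"] - combined["ga_conversions"]
print(combined.sort_values("delta", ascending=False))
```

Large deltas between the two views are a useful prompt for investigation: they often point to tracking gaps, differing attribution windows, or the impression-loss issues discussed above.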
If you would like to know more, please get in touch with our experts.