Nevertheless, data sharing will carry on, albeit with far more guardrails than anyone in the ad industry was subject to during the “open programmatic” years from 2010 to 2020 (RIP).
While marketers have already begun their pivot to this new reality by climbing aboard the bandwagon of customer data platforms (CDPs), publishers are unfortunately still waiting at the depot without a ticket.
Managing data overhead
CDPs have grown in importance for one main reason: companies want to resolve all the data expressed through their websites, apps, databases and other owned channels into a unified profile. There are other use cases, but that's the main one. If a CDP can't match a marketer's own data sources to its destinations, it's not really a CDP.
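That resolution step is, at its core, a merge keyed on a shared identifier. A minimal sketch of the idea, with made-up field names and a hashed email standing in for whatever join key a real CDP would use:

```python
# Toy identity-resolution pass: fold records from different owned
# channels into one profile per shared key. All fields and data are
# illustrative, not any vendor's actual schema.

def unify(records):
    """Merge channel records sharing a 'hashed_email' key into one profile."""
    profiles = {}
    for record in records:
        key = record["hashed_email"]
        profile = profiles.setdefault(key, {"hashed_email": key, "sources": []})
        profile["sources"].append(record["source"])
        for field, value in record.items():
            if field not in ("hashed_email", "source"):
                profile.setdefault(field, value)  # first value seen wins
    return profiles

records = [
    {"hashed_email": "a1b2", "source": "web", "last_page": "/pricing"},
    {"hashed_email": "a1b2", "source": "app", "push_opt_in": True},
    {"hashed_email": "c3d4", "source": "crm", "tier": "gold"},
]

unified = unify(records)
# unified["a1b2"] now combines the web and app signals in one profile
```

Real systems layer on fuzzy matching, identity graphs and conflict resolution, but the destination is the same: one profile per person across owned channels.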
What many of them don’t provide is the ability to match those audiences to outside partners, including publishers. For marketers, working with walled gardens both large and small requires managing the significant overhead of integrating with each publisher individually.
It’s not trivial to piece together an audience across many different publishers. The biggest walled gardens are already quite good at supporting this type of action, but smaller publishers have yet to get their act together. Faced with a lack of options, many fall back on the practice of sharing log files. This is questionable from the standpoint of both consumer privacy and publisher self-interest, since the way those log files are merged with a buyer’s data is outside the publisher’s full control.
No surprise that we are now witnessing the birth of a small generation of new “CDP-adjacent” systems that offer more nuanced control over how intercompany data exchange can happen.
Retiring the ‘cookie match’
Importantly, these new systems won’t rely on the tried-and-true (but now critically endangered) cookie matching that has been the currency of digital advertising for so long. Rather, they’ll be embedded within walled gardens, where they will target and resolve audiences using those gardens’ preferred identity systems.
This is a tricky technical problem, but the industry is working on it. One promising proposal from the IAB Tech Lab seeks to create a standardized way for publishers to signal the nature of their audiences in the bid request to prospective buyers. In the long tradition of inscrutable ad tech terminology, the title of this proposal is very dense: “Taxonomy and Data Transparency Standards to Support Seller-defined Audience and Context Signaling”.
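In practice, the proposal has publisher-declared segments ride along in the bid request's user data objects, each tagged with an identifier saying which taxonomy the segment IDs come from. A rough sketch of what a buyer might parse; the structure follows OpenRTB conventions, but treat the specific names and values as illustrative:

```python
# Illustrative OpenRTB-style bid request fragment carrying
# seller-defined audience segments. The "segtax" extension identifies
# which taxonomy the segment IDs refer to; all values here are made up.
bid_request = {
    "user": {
        "data": [
            {
                "name": "publisher.example",       # hypothetical seller domain
                "segment": [{"id": "704"}, {"id": "779"}],
                "ext": {"segtax": 4},              # taxonomy identifier
            }
        ]
    }
}

def seller_defined_segments(request):
    """Collect (taxonomy, segment id) pairs a buyer could target against."""
    pairs = []
    for data in request.get("user", {}).get("data", []):
        segtax = data.get("ext", {}).get("segtax")
        for seg in data.get("segment", []):
            pairs.append((segtax, seg["id"]))
    return pairs
```

The point of the standard is exactly this legibility: a buyer can read the seller's declared segments without a cookie match, because the taxonomy, not a synced ID, carries the meaning.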
Publishers could use a proposal like this within their own environment to take a list of encrypted IDs from a brand’s data environment and align it with their audience without exposing more data than necessary.
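At its simplest, that match is a set intersection over blinded identifiers. A toy sketch of the idea, using a shared salt and hashed emails; real deployments lean on clean rooms or private set intersection, since salted hashes alone can be attacked with a dictionary, and every name below is hypothetical:

```python
import hashlib

def blind(raw_id, salt):
    """Hash an identifier with a shared salt so neither side handles raw IDs."""
    normalized = raw_id.lower().strip()
    return hashlib.sha256((salt + normalized).encode()).hexdigest()

SALT = "campaign-shared-salt"  # agreed out of band; illustrative only

brand_ids = {blind(e, SALT) for e in ["ana@example.com", "bo@example.com"]}
publisher_ids = {blind(e, SALT) for e in ["bo@example.com", "cy@example.com"]}

# The publisher learns only which of its own users overlap with the
# campaign list, not the brand's full customer file in the clear.
overlap = brand_ids & publisher_ids
```

The design choice worth noticing is scope: each side contributes only blinded IDs for this one match, rather than handing over a log file and hoping the merge happens responsibly downstream.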
Plenty of technical problems remain to be ironed out. For instance, one irony of this emerging reality is that in order to share data using these new systems, companies will have to house their data with multiple vendors. To solve for that we’re going to need some level of interoperability between these vendors.
If this seems excessively convoluted, consider how far we’ve come.
The dirty secret of the programmatic industry’s first 10 years is that the amount of data shared via cookie matches has often exceeded what was necessary for the partnership, not to mention what was wise from a consumer privacy standpoint. That’s because the tools haven’t existed for companies to match data sets directly in a way that is controlled and privacy-preserving.
It’s early days for data connectivity systems, but I’m increasingly confident that the next wave of programmatic will correct the “oversharing” of the past and help us build a far more protected data environment without sacrificing advertising effectiveness or reach.