Does the ‘Big’ in Big Data Matter?
By Sony Joseph, VP, Data Products, FreeWheel

I recently had the opportunity to participate in a panel discussion at the Broadcasting & Cable Advanced Advertising Summit called “Big Changes From Big Data,” along with big data experts from ABC, 4C Insights, Discovery Communications, and iSpot.tv. In a compelling conversation on a complex topic, one thing remained clear: it’s not a lack of scale, variety, or richness of data that is holding the industry back, nor is it deficient technology. The heart of the challenge is how the premium video economy can align on a shared context around the data and tap into it as an ecosystem. Here’s my current take on the state of big data in the industry.

Balkanization of data

Today, data is siloed not only across the various partners within the ecosystem, but also fragmented by device and environment within each entity. Sharing data among the various players is hard due to a lack of trust, limited transparency, and concerns about data leakage. Because different partners are trying to overcome these challenges in disparate ways, such as bilateral data sharing or building local DMPs, we end up with suboptimal outcomes and everyone leaving money on the table.

The industry as a whole would truly benefit if we could find the means or mechanisms to share data in a fully trusted and decentralized manner that respects the rights of each data owner, greasing the data wheels across the entire ecosystem. This would enable advertisers to buy media more effectively across all devices and environments and increase the value of Publishers’ ad inventory by orders of magnitude.

Harmonizing data across the ecosystem

Does using specialized data sets create impact at scale? The market tells us that harmonizing the interpretation of data, and its value, gets harder as the data gets more sophisticated. Most ad spend continues to revolve around simple, easy-to-measure data points like demo and reach. As a result, the incentive to innovate is dramatically reduced, both on the Publisher side, around newer types of data, and on the advertiser side, around how to act on that data, since not everyone values or interprets it the same way.

One of FreeWheel’s strategic pillars is unifying data sets and currencies to make it easier for the ecosystem to cooperate and to enable the re-aggregation of scale across all types of data. This creates the ability to transact on multiple data sets, currencies, and audience segments, streamlining the associated workflow and delivering the in-depth insights needed to better understand the audience and add value to inventory.

Bridging the science of data with the art of content + context

How can we map internal organizational structures around information flow to create the right blend of deep data science and the understanding of content and context that has been built over time? Getting this right is critical: it allows both sides to understand and speak each other’s language, rather than creating ivory towers of siloed data science practice hidden within the organization.

At FreeWheel, we are testing this hypothesis by embedding our data scientists at the juncture of various functions. This could unleash the power of data not only across the targeting lifecycle, but also across content execution. And in the process, this structure will uncover those priceless intersections where ad message, content, and context resonate.

The bottom line is yes, the ‘big’ in big data matters. We believe the ‘big’ in big data is going to trigger another layer of innovation in advanced targeting. FreeWheel’s current focus is on removing existing barriers for the premium video ecosystem. Stay tuned for future posts diving deeper into the big data conversation.
