The Snowflake Marketplace turns three in June – a milestone for both the company and the data-as-a-service industry as a whole. Through investments in both its sharing protocol and its marketplace, Snowflake now plays an important role in the way many data teams not only manage but also explore and adopt external data.
We talk with Tom Gray, Principal of Financial Services Data Collaboration at Snowflake, about what the top companies are doing differently on its marketplace and how a new set of technologies is transforming the way data providers market their products. Tom was formerly a vice president at FactSet.
Snowflake launched its marketplace almost three years ago. How has the way data providers approach the marketplace changed?
When we launched, a lot of folks in the industry were skeptical. They viewed the marketplace as yet another marketing opportunity. A lot of the data marketplaces at that point were essentially data catalogs: they showed folks where data lived, but still relied on existing mechanisms like flat files or APIs to ingest the content. What made Snowflake different – and I think what continues to make it different – is the ability to ingest that data seamlessly. Data sharing remains the backbone of what we’re delivering.
One thing that’s surprised me is that companies have embraced putting sample data onto the Snowflake Marketplace. That’s a huge cultural shift for an industry that has historically been very protective of its content. More and more large firms are embracing the idea that making actionable, accessible sample content available to prospects can substantially accelerate their sales cycle.
Let’s think about the top 10% of data companies succeeding on Snowflake. What are they doing that is unique?
The most successful companies are starting to tell stories with their data. Buyers do not just want to hear about the history or coverage of your dataset; they want to understand how to find value in your product. Our successful providers are not just explaining the use cases in their listings; they are providing actual SQL queries that potential buyers can use to explore those use cases in practice.
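As a concrete illustration of the kind of ready-to-run query a provider might attach to a listing, here is a minimal sketch. The table (`daily_prices`), columns, and use case (30-day rolling volatility) are all hypothetical; the Python simply assembles and prints SQL that a buyer could paste into a Snowflake worksheet.

```python
# Hypothetical example: packaging a use-case query for a marketplace
# listing. All table and column names are illustrative, not taken from
# any real provider's schema.

SAMPLE_QUERY = """
SELECT
    ticker,
    trade_date,
    close_price,
    STDDEV(close_price) OVER (
        PARTITION BY ticker
        ORDER BY trade_date
        ROWS BETWEEN 29 PRECEDING AND CURRENT ROW
    ) AS rolling_30d_volatility
FROM shared_db.public.daily_prices
WHERE trade_date >= DATEADD(day, -90, CURRENT_DATE)
ORDER BY ticker, trade_date;
""".strip()


def listing_snippet(title: str, query: str) -> str:
    """Pair a use-case title with its query, ready to drop into a listing."""
    return f"-- Use case: {title}\n{query}"


if __name__ == "__main__":
    print(listing_snippet("30-day rolling volatility", SAMPLE_QUERY))
```

The point is less the specific SQL than the packaging: each use case in the listing ships with a query the buyer can run against the sample share immediately, with no pipeline work in between.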
At Snowflake, we’re really focused on helping reduce that time-to-value. We’ve partnered with companies like Dataiku, ThoughtSpot and Sigma Computing to let data buyers explore data in the marketplace through basic visualizations – not just tables. The acquisition of Streamlit furthers our ability to provide an overlay on content, letting providers tell richer stories by visualizing their data in helpful and differentiated ways.
You spent eight years selling data at FactSet. If you could wave a magic wand, which part of the data sales cycle would you change?
Arguably the biggest hurdle for consumers is licensing. If you polled most of the financial services firms that are buying bulk datasets, the most common complaint you would hear is about the licensing process. One area where we see a lot of innovation is monetization: the ability to buy data without having to go through an intensive sales process and negotiation. Many providers are realizing that if you put a fair price on the open market, companies will buy more data, faster.
A big part of this is simplification. Instead of offering 30 different use cases across 30 different packages, the smart data providers are trying to simplify their offerings. One offering I like a lot is the idea of a “data playground” – where data providers offer multiple datasets as a single package.
New routes-to-market do not just improve the experience for existing buyers; they often open up new markets altogether. Are you seeing new segments or personas starting to explore and buy external data on Snowflake?
A lot of the demand is still coming from existing buyers. In financial services, that might mean “market data” leaders who have bought data for years and who many of the providers know well.
But we’re also seeing new buyers, particularly on the IT and technology side, since that’s where Snowflake grew up. With that, you have CTOs and Heads of Architecture whom a data seller would not normally interact with. That access has helped providers scale across the enterprise: not only do they have the business buyers they know, but now they have access to the folks who influence the technology budget as well.
There’s obviously upside for providers in improving the sales and onboarding process, but it’s the buyers who see much of the near-term benefit. Can you talk a bit about why data buyers prefer consuming data via sharing versus more traditional forms of delivery?
Honestly, our customers have been our best advocates. Some of the largest institutional asset managers – firms like NatWest – are going to their providers and saying: “Hey, we’re sick of managing these different APIs or going to this file server to manage a nightly upload.” The work that goes into that – and the costs associated with the data engineering and pipeline-building it requires – goes away with these new models.
But at the end of the day, this helps data providers. Faster delivery allows for quicker time-to-value, which allows for more upsell opportunities. It lets them extend and grow their data feed business substantially faster than they otherwise could.
How has sharing and the marketplace changed the way data teams on Snowflake go about buying and using external data?
The ability to trial content more quickly has been pretty transformative. In my previous life, that time-to-insight was typically a huge barrier. A customer would need to stand up a team that could learn your data feed loader, then load the files, read the schema and build the database – all before anyone could even start to query the data. What used to take 90 days now takes 30. It definitely allows our customers to test, explore and potentially buy much more data than they could have before.
A lot of these technologies have lowered the barriers for companies to bring data products to market. Are you seeing non-traditional companies starting to market data products on the platform?
Absolutely. One of the things Snowflake offers is the ability to test fast. One of our banking customers is a great example: they chose to start providing their pension fund holdings in a Snowflake Share so people could use that data within the platform. We’re also seeing some of our retail partners exploring how they might productize the datasets they collect, either by aggregating and anonymizing a dataset or by sharing sensitive information via a clean room.
When companies explore data monetization, there’s a tendency to measure endlessly but never cut. The ease of delivery means that you can test out a dataset with a customer before you go full bore into the marketplace. If you haven’t sold data before, that’s a huge advantage.
What’s the biggest barrier for data providers to start building on the marketplace and sharing data on Snowflake? And what can they do to get over it?
One of the most common barriers is fear of the unknown. If you’ve sold flat files for 20 years, and someone comes in and says “change that,” it can be scary. I always say that Snowflake is simply another option – it does not mean traditional methods go away entirely. A lot of our job is just educating people about what data sharing is and how it can mimic some of the previous delivery models.
There are also sometimes competitive concerns. Companies are afraid that competitors will see a dataset they’ve built and replicate their offering. We have some creative ways of encrypting content or offering listings privately. But when it comes to these concerns, I always ask a simple question: if your competitor is in Snowflake and you’re not, who do you think our account rep will bring up?
To learn more about how Bobsled can allow you to share data into Snowflake instantly, schedule a demo.