Unpacking Snowflake's Evolution on DevOps
Over the years, Snowflake has revolutionized how businesses leverage data. The Snowflake Cloud Data Platform’s unmatched scalability, flexibility, and power are transformative for organizations of all sizes.
However, even the most advanced data platform needs the right setup and processes to unlock its full potential. That’s where DevOps and DataOps come in. These frameworks guide the tools and processes you need to orchestrate software and data engineering workflows for peak performance at scale.
At Snowflake Summit this year, Snowflake announced new capabilities that serve as building blocks for DevOps. Rithesh Makkenna, Snowflake’s Global Head of Partner Sales Engineering, recently joined me to discuss the growing need for DevOps and DataOps and what Snowflake and DataOps.live are doing about it.
If you don’t feel like watching the webinar, here are the highlights from that conversation, including various tools to implement DevOps and DataOps, the solution Snowflake uses, and actions you can take to optimize your Snowflake instance.
The Role of DevOps in Snowflake Workflows
DevOps and DataOps are the foundation for building efficient and scalable data operations.
- DevOps, a blend of software development (Dev) and IT operations (Ops), applies principles like continuous integration and continuous delivery (CI/CD) to streamline software development workflows and improve collaboration.
- DataOps, short for data operations, extends these concepts to data processes to ensure faster, more reliable delivery of data products.
As use cases become more complex and organizations need to build more data products, engineering teams need applications to implement DevOps and DataOps.
During our talk, Rithesh and I discussed how organizations leveraging Snowflake produce more data products than ever. Initially, most Snowflake users just wanted a better way to produce tables and views of their data. Today, people are increasingly looking for more complex outputs, such as machine learning models, containers, or React applications.
Data products now have more components—instead of just a table, a modern data product might have a SQL interface, an application interface, and an API. In essence, by making data so accessible, Snowflake spurred an explosion of innovation that demands a new level of orchestration and collaboration.
This is where DataOps.live makes its mark. At this point, it's important to understand what DataOps.live isn't: the platform doesn't do data ingestion, transformation, sharing, or governance itself. Instead, it orchestrates all those pieces. It serves as the end-to-end control plane, automating workflows and integrating them seamlessly into Snowflake's architecture. That way, your team can ship data products faster, more consistently, and with fewer bottlenecks.
Why Use DataOps.live for DevOps and DataOps on Snowflake
There are plenty of options for implementing DevOps and DataOps on Snowflake. Tools like Apache Airflow, Jenkins, and Azure DevOps are highly flexible. You could absolutely use them for Snowflake environment management, infrastructure as code, native app building, Streamlit apps, and containerization, but you'd have to write all the code yourself.
It’s hard to watch data engineering teams turn themselves inside out to essentially become software developers. Given my software engineering background, I saw the need for a unified developer experience that would make the process easier.
“If you want a gazebo, don't buy a pile of wood, nails, and a hammer; buy a gazebo. If your goal is to throw a party, then spend your time on throwing the party, not building a gazebo. It’s the same thing if you're a data engineering team—be a great data engineering team, don't try and be a software development team.”
Justin Mullen and I co-founded DataOps.live with a vision to create a place where it didn’t matter what you were building—whether you were working in SQL or other code or trying to build a container service or a native application.
With DataOps.live, everything looks and feels the same, dramatically reducing the technical barrier to building data products. We started out building capabilities predominantly for data engineers, including CI/CD and observability. More recently, we’ve built DataOps.live Create for business users. Nontechnical users in lines of business—say finance, HR, logistics, or supply chain—can use Create to build a Snowflake Native App in ten minutes.
Snowflake and DataOps.live: A Deep & Strong Partnership
It’s no secret that Snowflake recognizes DataOps.live as an Elite-level technology partner, one of only 19 worldwide. DataOps.live is a natural fit for Snowflake users because we build specifically for Snowflake.
What’s less known, Rithesh told me, is that Snowflake’s appreciation for DataOps.live is more than theory. “Not many people are aware of how we, specifically in the Sales Engineering department, use DataOps.live at scale for the SE organization globally.” Learn more about why Snowflake chose DataOps.live for this business-critical project and how the resulting Snowflake Solutions Center has revolutionized workflows for 1400 sales engineers.
“DataOps.live delivered in a massive, massive way and we couldn't imagine where we'd be now without their partnership.”
— Vernon Percival Tan, Senior Manager frostbyte Industry Solutions, Snowflake
As a Snowflake power user and the only DataOps solution provider geared specifically for Snowflake, we also provide input into product capabilities. Snowflake’s engineers often come to us to learn more about DevOps and DataOps use cases and share design documents with us.
For example, if you paid attention to announcements at Snowflake Summit 2024, you may have learned about a new Command-Line Interface (CLI). The new CLI enables developers to automate tasks and workflows, a building block for DevOps. We’ve been using this CLI since pre-1.0 and wrapping our Snowflake Native App creator around it.
We love these features as a starting point for DevOps. Organizations that need orchestration to implement scalable DataOps can then come to us.
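To make the building-block idea concrete, here is a hedged sketch of how the Snowflake CLI might slot into a CI pipeline. The workflow name, secret names, and file paths are placeholders invented for this illustration, and the exact `snow` subcommands may vary by CLI version; treat this as a starting point, not a definitive recipe.

```yaml
# Illustrative GitHub Actions job: apply schema changes and deploy a
# Snowflake Native App with the Snowflake CLI on every push to main.
# Secret names and file paths are placeholders for this sketch.
name: snowflake-deploy
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    env:
      SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
      SNOWFLAKE_USER: ${{ secrets.SNOWFLAKE_USER }}
      SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
    steps:
      - uses: actions/checkout@v4
      - name: Install Snowflake CLI
        run: pip install snowflake-cli-labs
      - name: Verify connection
        run: snow connection test
      - name: Apply schema changes
        run: snow sql -f migrations/schema.sql
      - name: Deploy native app
        run: snow app run
```

Automating these steps in CI, rather than running them from a laptop, is exactly the kind of repeatable workflow the new CLI is meant to enable.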
Actions You Can Take to Maximize Your Snowflake Instance
Over the course of our discussion, Rithesh and I touched on several actionable steps teams can take to make data workflows more efficient and scalable. Take a look:
1. Automate Change Management
Poorly managed changes are a silent killer. As Rithesh put it, “Even though it looks as if managing schemas is the simplest thing to do, that is where the majority of organizations face complexity in change management.”
To tackle this, adopt automated tools like Snowflake’s new Database Change Management (DCM) functionality for schema management to reduce errors and ensure consistency across your environment.
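To make this concrete: Snowflake's declarative change management centers on `CREATE OR ALTER`, which lets you keep the desired end state of an object in version control instead of hand-writing one-off ALTER scripts. A minimal sketch, with table and column names invented for illustration:

```sql
-- Declarative schema definition: rerunning this statement creates the
-- table if it is missing, or alters it in place to match this definition.
CREATE OR ALTER TABLE analytics.sales.orders (
    order_id    NUMBER NOT NULL,
    customer_id NUMBER,
    order_date  DATE,
    total_usd   NUMBER(12, 2)
);
```

Checking a file like this into Git and applying it from an automated pipeline gives you repeatable, reviewable schema changes across environments.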
2. Adopt Unified Developer Experiences
Data products have grown so complex that you can end up needing a team of five just to cover the language skills required to maintain them.
With a unified developer experience on top of Snowflake that switches between languages, it’s much easier to hand off projects and collaborate without needing all those niche technical skills.
3. Leverage Generative AI
Generative AI has been a game-changer, dramatically reducing technical barriers because people can do so much more with less technical expertise. However, it can also be an unexploded bomb: while you can spot an error in a table at a glance, large language models (LLMs) can make bad data look plausible.
Adopt app development tools that leverage LLM technology wisely. Used correctly, this technology can empower your business users to build their own simple data products so you can reserve data engineers’ efforts for bigger and harder problems.
4. Focus on Collaboration Across Teams
Bad logging and chaotic communication are a recipe for disaster. Streamlining processes and automating workflows builds transparency and makes collaboration the default.
Make sure the tools you use actively promote cross-team collaboration with data engineering, software engineering, and business users.
Surprise surprise, orchestration with DataOps.live is the strongest and most straightforward way to accomplish all of this and more. As Rithesh says, “they’re enabling the DataOps perspective specifically focused on Snowflake components, unlike any of the open-source platforms on the market today.”
If you’re still curious about how Snowflake’s new DevOps features align with DataOps.live, watch my full conversation with Rithesh.
Looking Ahead to 2025
As we wrap up 2024, it’s clear that the pace of innovation in data operations isn’t slowing down anytime soon. Organizations are building more data products, adopting more sophisticated workflows, and expecting faster results.
Looking ahead to 2025, I see an even greater need for DataOps that orchestrates collaboration between IT and business, automates the heavy lifting, and helps diverse teams focus on delivering value. DataOps.live will continue to focus on what we do best: enabling Snowflake users to orchestrate their data workflows efficiently and reliably.
If you’re already using Snowflake, now is the perfect time to assess how your processes stack up. And we're here to help! Talk to one of our experts today.
