DataOps.live Professional Edition (NEW)
Purpose-built environment for small data teams and dbt Core developers.
DataOps.live Enterprise
DataOps.live is the leading provider of Snowflake environment management, end-to-end orchestration, CI/CD, automated testing & observability, and code management, wrapped in an elegant developer interface.
Spendview for Snowflake FREE
Change the way your business makes decisions around data with a unified and harmonized view of your Snowflake spend.

Pricing and Additions

See what's included in our Professional and Enterprise Editions.

Getting Started
Docs: New to DataOps.live

Start learning by doing. Create your first project and set up your DataOps execution environment.
Join the Community

Find answers to your DataOps questions, collaborate with your peers, share your knowledge!
#TrueDataOps Podcast

Welcome to the #TrueDataOps podcast with your host Kent Graziano, The Data Warrior!
Technology Partners
Resource Hub
On-demand resources: eBooks, white papers, videos, webinars.

Learning Resources
A collection of resources to support your learning journey.

Customer stories
Events
Connect with fellow professionals, expand your network, and gain knowledge from our esteemed product and industry experts.
#TrueDataOps.org
#TrueDataOps is defined by seven key characteristics or pillars:

Academy
Enroll in the DataOps.live Academy to take advantage of training courses. These courses will help you make the most out of DataOps.live.
Blogs
Stay informed with the latest insights from the DataOps team and the vibrant DataOps Community through our engaging DataOps blog. Explore updates, news, and valuable content that keep you in the loop about the ever-evolving world of DataOps.
In the News

Stay up-to-date with the latest developments, press releases, and news.
About Us

Founded in 2020 with a vision to enhance customer insights and value, our company has since developed technologies focused on DataOps.
Careers

Join the DataOps.live team today! We're looking for colleagues on our Sales, Marketing, Engineering, Product, and Support teams.

Build, Test, and Deploy Data Products and Applications on Snowflake

Supercharge your data engineering team. Build 10x faster and lower costs by 60% or more.

14+M Jobs Orchestrated
155K Data Regression Tests Run Per Day
26+ Orchestrators: Talend, Fivetran, Matillion, dbt, etc.

"DataOps.live is the leading Unified Control Plane to manage, observe, and improve your Snowflake Data Cloud."

Developer Experience

 


What are data professionals saying?

“I speak to our biggest customers all over the globe frequently, and many of them ask me about agility and governance. They ask questions like... how can I do CI/CD for Snowflake so I can deliver value faster? DataOps.live are at the leading edge of the DataOps movement and are amongst a very few world authorities on automation and CI/CD within and across Snowflake.”

Kent Graziano, Former Chief Technical Evangelist at Snowflake

“You can’t carry on doing the same things and expect different results. We wanted to move the needle further on the dial and become a more agile data-driven business, which led to a pioneering data mesh and true DataOps approach as our way forward.”

Omar Khawaja, Global Head of BI, Roche Diagnostics

“DataOps.live and Snowflake got us from vision to production in a frankly terrifyingly short time. Most importantly, they have done that while the organization, the systems, and the data are constantly changing. Creating clarity and repeatability in this hugely dynamic context is their unique advantage!”

David Bath, VP of Platforms, OneWeb

"DataOps is at the heart of our Data Mesh implementation. Everything from ingestion, transformations, DQ, access policies, and data product governance is orchestrated by DataOps. DataOps is a must for any company wishing to implement Data Mesh. A distributed architecture requires a tool like DataOps to allow the regular data engineer to easily navigate the complexity of CI/CD, code release, and orchestration."

Read the Roche case study ->

Paul Rankin, Head of Data Management Platforms, Roche Diagnostics

Developer experience

Build data products 10x faster

Improve your developer experience with a one-click, cloud-hosted IDE offering full support for:

  • Code Development (SQL, Scala, Java, Python, Snowpark, Streamlit, dbt, etc.)
  • Code Validation
  • Testing and review
  • Curated libraries and resolved version conflicts

Start immediately, even for the most complex use cases, with a development environment that mirrors the runtime environment.

 


Data product manifest

DataOps.live provides a data product registry and a Data Product Manifest object for each data product. The manifest is an organizing document that aggregates the essential attributes of a data product: it lists the entities in the data product, gives a detailed schema description for each entity, captures known relationships among entities, records Service Level Indicators, and much more.


Enterprise Gitflow standardized

DataOps.live uses 100% standard Gitflow processes to keep the developer experience simple. Use it standalone, or fully mirror our backend Git repository with your enterprise source code repository (GitHub, Bitbucket, GitLab, etc.).

Full branch-and-merge Git flows for dev, test, prod, and feature branches.


Maximize agility with concurrent data product development

Any number of developers can work independently in their own safe sandboxes without stepping on each other's toes. 

Parallel development massively increases developer efficiency and speeds up the creation of new data products or enhancements.


Automate regression testing for EVERY deployment

Run automated regression testing to ensure the quality of the flowing data at every point. 

Add metadata reporting to every data pipeline to monitor data quality KPIs over time.
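As a sketch of the idea, a regression check can be as simple as comparing per-table metrics between a baseline environment and a candidate environment. The helper, table names, and tolerance below are illustrative, not part of the DataOps.live test framework:

```python
def regression_check(baseline: dict, candidate: dict, tolerance: float = 0.0) -> list:
    """Compare per-table row counts between a baseline (e.g. prod) and a
    candidate (e.g. feature-branch) environment; return a list of failures."""
    failures = []
    for table, expected in baseline.items():
        actual = candidate.get(table)
        if actual is None:
            failures.append(f"{table}: missing in candidate environment")
        elif abs(actual - expected) > tolerance * expected:
            failures.append(f"{table}: expected ~{expected}, got {actual}")
    return failures

# Hypothetical row counts gathered from each environment's metadata.
baseline = {"orders": 10_000, "customers": 2_500}
candidate = {"orders": 10_000, "customers": 2_400}
print(regression_check(baseline, candidate, tolerance=0.01))
```

Running the same check on every deployment turns data quality into a pipeline gate rather than an after-the-fact audit.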


See the developer experience in action

Environments

Build and rebuild your DEV, TEST & PROD environments

Snowflake is the first fully programmable Data Cloud, allowing companies to build code and configuration templates external to Snowflake and then “run” that code on the Snowflake Data Cloud. 

Manage the build and deployment of your data products and applications on Snowflake with the same agility and governance you do for software applications.


Manage all Snowflake objects as code and configuration

Beyond tables and views, you can also lifecycle-manage all 30+ additional object types in Snowflake using code and configuration.

Automatically add new data warehouses with a few lines of code.
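For illustration, "a few lines of code" can be as small as a templated DDL statement. The helper below is a hypothetical sketch that emits standard Snowflake `CREATE WAREHOUSE` syntax; it is not the platform's actual configuration format:

```python
def warehouse_ddl(name: str, size: str = "XSMALL", auto_suspend: int = 60) -> str:
    """Render standard Snowflake DDL for a new virtual warehouse
    from a handful of configuration values (names are illustrative)."""
    return (
        f"CREATE WAREHOUSE IF NOT EXISTS {name} "
        f"WAREHOUSE_SIZE = '{size}' "
        f"AUTO_SUSPEND = {auto_suspend} "
        f"AUTO_RESUME = TRUE;"
    )

print(warehouse_ddl("REPORTING_WH", size="SMALL"))
```

Because the warehouse is defined in code, the same definition can be reviewed, versioned, and applied identically across dev, test, and prod.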


Something always goes wrong!

It's not IF but WHEN. A new feature breaks, bad data arrives, you name it. 

Our new declarative approach to building Snowflake environments allows you to roll the Snowflake data and infrastructure backward if there is a major failure. 

React to failures with speed. Roll back changes immediately without impacting the data, or recover from complete failures by rebuilding in a fresh Snowflake tenant in minutes or hours instead of days, weeks, or months.
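One way such a rollback can be implemented on Snowflake is zero-copy cloning combined with Time Travel's `AT` clause. The helper below is an illustrative sketch of that pattern, not the platform's actual mechanism:

```python
def rollback_clone_sql(database: str, restore_point: str) -> str:
    """Render standard Snowflake DDL that rebuilds a database as it existed
    at restore_point, using Time Travel with zero-copy cloning.
    The database name and restore point here are hypothetical."""
    return (
        f"CREATE OR REPLACE DATABASE {database}_RESTORED "
        f"CLONE {database} "
        f"AT (TIMESTAMP => '{restore_point}'::TIMESTAMP_TZ);"
    )

print(rollback_clone_sql("ANALYTICS", "2024-06-01 00:00:00 +0000"))
```

Because the clone is zero-copy, the restored database is available almost immediately, regardless of data volume.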


Check out environments and modeling

Modeling

Model and transform your data with ease

The “T” in ELT is critical to every data pipeline. Transformation lets you model and reshape raw data into a form you can actually work with.

Use our enhanced dbt-based modeling engine with advanced security and backward compatibility with all your existing dbt models.


Embed governance inline with your modeling

Snowflake provides industry-leading features that ensure the highest levels of governance for your account, users, and all the data you store and access in Snowflake.

  • Column-level Security
  • Row-level Security
  • Object Tagging
  • Tag-based Masking Policies
  • Data Classification

Drive advanced Snowflake Governance with built-in enhanced support for Grants, Row Access Policies, Dynamic Masking Policies, TAGs, and so much more.

 


Model | TEST | Model | TEST

Create advanced data transformation flows and interleave transformation jobs, testing jobs, and other data processing jobs in the same pipeline.


Automate Data Vault 2.0 modeling

Embedded support for dbt-vault macros lets you automate your Data Vault 2.0 modeling.

Extend this even further with our native support for VaultSpeed.


Orchestration

Orchestrate every part of your data pipelines

Leverage the power of advanced Orchestration capabilities for ALL your data integration and data movement platforms with enhanced connectors for your most popular tools.

Learn how orchestration enables self-service data at OneWeb: watch the video.

 


Schedule your enterprise pipelines

Employ the power of our advanced scheduling engine and operational management capabilities to remove disconnected dependencies between systems.

 

Simplify building complex DAG workflows

Build complex DAG routing flows and logic to embed the rules of what runs, when it runs, what it depends on, and what depends on it.
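The underlying idea (each job declares what it depends on, and the engine derives a valid run order) can be sketched with Python's standard-library topological sorter; the job names below are hypothetical:

```python
from graphlib import TopologicalSorter  # stdlib since Python 3.9

# Each job maps to the jobs it depends on; the sorter derives an order
# in which every job runs only after all of its dependencies.
pipeline = {
    "ingest": [],
    "transform": ["ingest"],
    "test_transform": ["transform"],
    "docs": ["transform"],
    "publish": ["test_transform"],
}
run_order = list(TopologicalSorter(pipeline).static_order())
print(run_order)
```

A real orchestrator adds scheduling, retries, and parallel execution of independent branches (here, `docs` and `test_transform` could run concurrently), but the dependency-driven ordering is the same.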


Observability

 

 

Comprehensive observability

Change the way your organization makes decisions about data with complete visibility. Get a unified and harmonized view of relevant metadata for better data reliability. Empower every business owner with complete control and understanding of all elements of their data products, building trust. 


Advanced alerting and integration

Rich webhooks for alerting and automation allow advanced integration with hundreds of other tools and platforms. Alert engineers, create tickets, and post chat messages.

Two-way integration also allows DataOps to be fully controlled from other platforms via standard REST APIs.
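As a sketch of the common webhook pattern (the payload fields, header convention, and secret below are illustrative, not the DataOps.live API), an alert body is typically signed so the receiver can verify it before creating a ticket or posting a chat message:

```python
import hashlib
import hmac
import json

def build_alert(event: str, pipeline: str, status: str, secret: bytes):
    """Build a JSON webhook body plus an HMAC-SHA256 signature the
    receiving system can verify before acting on the alert."""
    payload = {"event": event, "pipeline": pipeline, "status": status}
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return body, signature

body, sig = build_alert("pipeline.failed", "nightly_build", "failed", b"shared-secret")
# An HTTP client would POST `body` with the signature in a header
# (e.g. X-Signature: sha256=<sig>) to the ticketing or chat endpoint.
print(sig)
```

The receiver recomputes the HMAC over the raw body with the shared secret and compares it to the header, rejecting anything that does not match.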


Observability for data product owners

  • Assess the cost performance of Data Products
  • Monitor the usage and quality of Data Products
  • Run impact assessments and audit reporting on Data Products
  • Manage data teams and data domain budgets
  • Evaluate the complexity of data initiatives
Coming soon

Observability for data product developers

  • Monitor the health of data pipelines
  • Monitor and optimize infrastructure costs
  • Investigate the blast radius of data issues
  • Access a detailed breakdown of data pipelines

 


Observability for Data Engineers

  • Get access to rich metadata in your development environment
  • Investigate issues and errors
  • Increase code reusability
  • Build better Data Products faster
  • Compare logs between job runs

Get a customized tour of DataOps.live

Architecture

Maximum security

“It's all about the architecture!” These were the first words we ever heard about Snowflake from Bob Muglia in London in 2017. And he was right: it IS all about the architecture, and the same is true of DataOps.

We separate cloud and on-premises. Everything in the cloud is Kubernetes… everything on the client is a simple Docker image that runs behind the client’s firewall. Zero data leakage to the cloud.

Leverage our cloud/on-premises delivery model for maximum data security.

 

Inheriting policies and governance

Enterprise inheritance structures enable governance and policy enforcement while preserving local autonomy and agility.

 
 

Grounded in pure #DEVOPS (#TRUEDATAOPS)

Grounded in core DevOps capabilities, with only the changes necessary to support the nuances of building, testing, and deploying data platforms holding terabytes of data and more.

Register for all the latest thinking at www.truedataops.org.


Request a demo

Speak with a DataOps.live expert today.

Spendview for Snowflake

Change the way your business makes decisions around data with a unified and harmonized view of your Snowflake spend.