DataOps.live is the leading provider of Snowflake environment management, end-to-end orchestration, CI/CD, automated testing & observability, and code management, wrapped in an elegant developer interface. The result: faster development, parallel collaboration, greater developer efficiency, data assurance, simplified orchestration, and data product lifecycle management.

“I speak to our biggest customers all over the globe frequently, and many of them ask me about agility and governance. They ask questions like... how can I do CI/CD for Snowflake so I can deliver value faster? DataOps.live are at the leading edge of the DataOps movement and are amongst a very few world authorities on automation and CI/CD within and across Snowflake.”

Kent Graziano,
former Chief Technical Evangelist at Snowflake

Automate – Snowflake Object Lifecycle Engine

Manage your Data Cloud the same way you manage your Public Cloud (AWS, Azure, or GCP). Automate your data environments and infrastructure with code. Snowflake has become the first fully programmable Data Cloud, allowing companies to build code and configuration templates external to Snowflake and then “run” that code on the Snowflake Data Cloud. Build and rebuild environments with ease. React to failures with speed – roll back changes immediately without impacting the data – or recover from complete failures and rebuild in a fresh Snowflake tenant in minutes or hours instead of days, weeks, or months.

Build and rebuild your data environments!

Drive your data platform from a set of code and configuration files so that you can manage the build and deployment of your data applications and data products with the same agility and governance you have for software applications.
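
As a rough illustration of environments-as-code (a sketch only, not DataOps.live's configuration syntax), the snippet below drives the build or rebuild of a Snowflake environment from a small declarative description using the snowflake-connector-python library; the database, schema, and warehouse names are hypothetical.

```python
# Sketch: build (or drop and rebuild) a Snowflake environment from a
# declarative config. Config shape, names, and helper are illustrative only.
import snowflake.connector

ENVIRONMENT = {                       # hypothetical declarative config
    "database": "ANALYTICS_DEV",
    "schemas": ["RAW", "STAGING", "MARTS"],
    "warehouse": {"name": "DEV_WH", "size": "XSMALL"},
}

def build_environment(conn, env, rebuild=False):
    """Create (or recreate) every object described in the config."""
    cur = conn.cursor()
    if rebuild:
        cur.execute(f'DROP DATABASE IF EXISTS {env["database"]}')
    cur.execute(f'CREATE DATABASE IF NOT EXISTS {env["database"]}')
    for schema in env["schemas"]:
        cur.execute(f'CREATE SCHEMA IF NOT EXISTS {env["database"]}.{schema}')
    wh = env["warehouse"]
    cur.execute(
        f'CREATE WAREHOUSE IF NOT EXISTS {wh["name"]} '
        f'WAREHOUSE_SIZE = {wh["size"]} AUTO_SUSPEND = 60'
    )

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...", role="SYSADMIN"
)
build_environment(conn, ENVIRONMENT, rebuild=True)
```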

Manage all Snowflake Objects with Code

Beyond tables and views, you can lifecycle-manage all of the roughly 30 additional object types in Snowflake using code and configuration. Automatically add new data warehouses with a few lines of code.

Something always goes wrong!

A new feature breaks, bad data arrives, you name it. It is not IF but WHEN. Our new declarative approach to building Snowflake environments allows you to roll your Snowflake data and infrastructure back when there is a major failure.

Automate – Modeling and Transformation

The “T” in ELT is a critical element of every data pipeline: it is where raw data is modeled and transformed into the design patterns you need to work with it.

Need to model and transform your data?

No problem. Just use our enhanced dbt-based modeling engine with advanced security and backward compatibility with all your existing dbt models.

Embed governance in your modeling

Drive advanced Snowflake Governance with built-in support for Grants, Row Access Policies, Dynamic Masking Policies, and TAGs.
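
For context, these governance controls map onto native Snowflake SQL. The sketch below (illustrative object, column, and role names, issued here via snowflake-connector-python rather than DataOps.live's modeling syntax) shows what a grant, a dynamic masking policy, a row access policy, and a tag look like at the SQL level.

```python
import snowflake.connector

# Illustrative governance statements; names are made up for the example.
GOVERNANCE_SQL = [
    # Grants: give the ANALYST role read access to the marts schema
    "GRANT USAGE ON SCHEMA analytics.marts TO ROLE analyst",
    "GRANT SELECT ON ALL TABLES IN SCHEMA analytics.marts TO ROLE analyst",
    # Dynamic masking policy: hide email addresses from non-privileged roles
    """CREATE OR REPLACE MASKING POLICY analytics.marts.mask_email
       AS (val STRING) RETURNS STRING ->
       CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END""",
    """ALTER TABLE analytics.marts.customers MODIFY COLUMN email
       SET MASKING POLICY analytics.marts.mask_email""",
    # Row access policy: a real policy would usually consult a mapping table
    """CREATE OR REPLACE ROW ACCESS POLICY analytics.marts.emea_only
       AS (region STRING) RETURNS BOOLEAN ->
       CURRENT_ROLE() = 'GLOBAL_ADMIN' OR region = 'EMEA'""",
    """ALTER TABLE analytics.marts.customers
       ADD ROW ACCESS POLICY analytics.marts.emea_only ON (region)""",
    # Tags: classify the column so downstream governance tooling can find it
    "CREATE TAG IF NOT EXISTS analytics.marts.pii_level",
    """ALTER TABLE analytics.marts.customers MODIFY COLUMN email
       SET TAG analytics.marts.pii_level = 'high'""",
]

conn = snowflake.connector.connect(account="my_account", user="my_user", password="...")
cur = conn.cursor()
for statement in GOVERNANCE_SQL:
    cur.execute(statement)
```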

Model | Test | Model | Test

Create advanced data transformation flows and interleave transformation, testing, and other data processing jobs in the same pipeline.

Full Data Vault 2.0 Support

Automate your Data Vault 2.0 modeling with embedded support via dbtvault macros.

Orchestrate

Orchestrate every part of your pipeline

Leverage the power of advanced Orchestration capabilities for ALL your data integration and data movement platforms with enhanced connectors for your most popular tools.

Enterprise Pipeline Scheduling

Employ the power of our advanced scheduling and management capabilities to remove disconnected dependencies between systems.

Complex DAG Workflows

Build complex DAG routing flows and logic to embed the rules of what runs, when it runs, what it depends on, and what depends on it.
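
A minimal sketch of the underlying idea, using Python's standard graphlib rather than any DataOps.live API: each job declares its upstream dependencies, and a topological ordering determines what runs and when. Job names and the runner are placeholders.

```python
# Sketch: DAG-style job routing. Declare which jobs depend on which, then run
# them in dependency order. A real pipeline engine adds parallelism, retries,
# and alerting on top of this.
from graphlib import TopologicalSorter  # Python 3.9+

# "what runs, what it depends on, and what depends on it"
PIPELINE = {
    "ingest_orders":    [],                    # no upstream dependencies
    "ingest_customers": [],
    "transform_sales":  ["ingest_orders", "ingest_customers"],
    "test_sales":       ["transform_sales"],
    "publish_marts":    ["test_sales"],        # only runs after tests pass
}

def run_job(name: str) -> None:
    print(f"running {name}")                   # placeholder for the real work

for job in TopologicalSorter(PIPELINE).static_order():
    run_job(job)
```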

Test & Observe

Observability and automated testing are crucial practices from the DevOps world: they are what give stakeholders trust and reliability in the applications built that way. The data world has long suffered without these capabilities.

Automated regression testing

Run automated data regression testing to assure the quality of the flowing data at every point. Add metadata reporting to every data pipeline to monitor data quality KPIs over time.
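
As a simplified illustration of automated regression testing with metadata capture (not DataOps.live's test framework; the table names, checks, thresholds, and results table below are hypothetical):

```python
# Sketch: run simple data-quality checks against a Snowflake table, record
# each measurement so data-quality KPIs can be tracked over time, and fail
# the pipeline if any check does not pass.
import datetime
import snowflake.connector

CHECKS = [
    ("row_count", "SELECT COUNT(*) FROM analytics.marts.sales",
     lambda n: n > 0),
    ("null_keys", "SELECT COUNT(*) FROM analytics.marts.sales WHERE order_id IS NULL",
     lambda n: n == 0),
    ("dup_keys", "SELECT COUNT(*) - COUNT(DISTINCT order_id) FROM analytics.marts.sales",
     lambda n: n == 0),
]

conn = snowflake.connector.connect(account="my_account", user="my_user", password="...")
cur = conn.cursor()
failures = 0
for name, sql, passes in CHECKS:
    value = cur.execute(sql).fetchone()[0]
    ok = passes(value)
    failures += 0 if ok else 1
    # record the measurement as metadata for KPI monitoring over time
    cur.execute(
        "INSERT INTO analytics.meta.test_results (run_at, check_name, value, passed) "
        "VALUES (%s, %s, %s, %s)",
        (datetime.datetime.now(datetime.timezone.utc), name, value, ok),
    )

if failures:
    raise SystemExit(f"{failures} regression check(s) failed")  # fail the pipeline
```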

Advanced Alerting and Integration

Rich webhooks for alerting and automation allow for advanced integration with hundreds of other tools and platforms. Alert engineers, create tickets, and post chat messages. Two-way integration also allows DataOps.live to be fully controlled from other platforms via standard REST APIs.
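
A minimal sketch of outbound webhook alerting, assuming a generic incoming-webhook endpoint (the URL and payload shape are placeholders, not a specific DataOps.live integration):

```python
# Sketch: when a pipeline job finishes, POST a JSON payload to a chat or
# automation tool's incoming webhook. Any webhook-capable tool works this way.
import requests

WEBHOOK_URL = "https://hooks.example.com/services/T000/B000/XXXX"  # placeholder

def alert(pipeline: str, job: str, status: str, log_url: str) -> None:
    payload = {
        "text": f"Pipeline '{pipeline}' job '{job}' finished with status {status}",
        "pipeline": pipeline,
        "job": job,
        "status": status,
        "log_url": log_url,
    }
    response = requests.post(WEBHOOK_URL, json=payload, timeout=10)
    response.raise_for_status()   # surface delivery failures to the caller

alert("nightly_load", "transform_sales", "FAILED", "https://example.com/logs/1234")
```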

(Coming soon) Role & Security Explorer

Use our native Role & Security Explorer for Snowflake to generate full security exports.

Develop & Deploy

Gitflow Deployment and Management

Full environment build management for dev, test, prod, and feature-branch environments to support branch-and-merge gitflows. Run standalone, or fully mirror our backend git repository with your enterprise source code repository.
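
One way to picture branch-driven environments (a sketch under assumed naming conventions, not the DataOps.live scheme): the current git branch selects the target database, so dev, test, prod, and feature branches each build into their own isolated environment.

```python
# Sketch: derive the target Snowflake database from the current git branch.
# The mapping and naming convention below are hypothetical illustrations.
import re
import subprocess

def current_branch() -> str:
    return subprocess.run(
        ["git", "rev-parse", "--abbrev-ref", "HEAD"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()

def target_database(branch: str) -> str:
    fixed = {"main": "ANALYTICS_PROD", "develop": "ANALYTICS_DEV", "test": "ANALYTICS_TEST"}
    if branch in fixed:
        return fixed[branch]
    # feature branches get their own throwaway database, e.g. ANALYTICS_FB_JIRA_123
    slug = re.sub(r"[^A-Za-z0-9]+", "_", branch).strip("_").upper()
    return f"ANALYTICS_FB_{slug}"

print(target_database(current_branch()))
```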

Increase Agility with Concurrent Data Development

Any number of developers can work independently in their own safe sandboxes without stepping on each other's toes, massively increasing developer efficiency.

Architecture

“It's all about the architecture!!” These were the first words we ever heard about Snowflake from Bob Muglia in London in 2017. And he was right – IT IS all about the architecture – and the same is true of DataOps. We separate cloud and on-premises. Everything in the cloud is Kubernetes… everything on the client is a simple Docker image that runs behind the client’s firewall. Zero data leakage to the cloud.

Maximum Security

Leverage our cloud / on-premises delivery model for maximum data security.

Inheriting Policies and Governance

Enterprise inheritance structures enable governance and policy enforcement while preserving local autonomy and agility.
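
As a conceptual sketch of inheritance (illustrative keys only, not DataOps.live configuration): project-level settings are layered over enterprise-wide defaults, so centrally mandated policy applies everywhere while teams retain their local choices.

```python
# Sketch: deep-merge project overrides over enterprise defaults so that
# governance settings are inherited and local settings still apply.
from copy import deepcopy

ENTERPRISE_DEFAULTS = {
    "masking": {"pii_columns": "mask"},          # mandated centrally
    "warehouse": {"auto_suspend_seconds": 60},
    "allowed_regions": ["eu-west-1"],
}

PROJECT_OVERRIDES = {
    "warehouse": {"size": "SMALL"},              # local choice
    "schedule": "0 2 * * *",                     # local choice
}

def inherit(parent: dict, child: dict) -> dict:
    """Deep-merge child settings over parent settings (child wins on conflicts)."""
    merged = deepcopy(parent)
    for key, value in child.items():
        if isinstance(value, dict) and isinstance(merged.get(key), dict):
            merged[key] = inherit(merged[key], value)
        else:
            merged[key] = value
    return merged

effective_config = inherit(ENTERPRISE_DEFAULTS, PROJECT_OVERRIDES)
print(effective_config)
```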

DevOps (#TrueDataOps) to the core

Grounded in core DevOps capabilities, with only the necessary changes added to support the nuances of building, testing, and deploying data platforms with terabytes of data and more.

GET STARTED TODAY

Download Eckerson Deep Dive

This report examines four leading DataOps platforms: DataKitchen, DataOps.live, Zaloni, and Unravel. It describes each product, highlights its key differentiators, and identifies target customers for each. From these profiles, readers will gain a better understanding of the range of DataOps offerings and discover which products are best suited to their needs.

"DataOps.live is a great choice for companies that need orchestration of complex data pipelines around Snowflake, especially if they rely on a lot of IoT data. Its hybrid approach means you can use native development and testing tools for some data pipeline processes, and orchestrate any 3rd party tools as required."

DOWNLOAD NOW
