Roche

Case Study

Doing now what patients need next


The DataOps.live platform is helping data product teams at this global pharmaceutical giant to orchestrate and benefit from next-generation analytics on a self-service data and analytics infrastructure built on Snowflake and other tools, using a data mesh approach.

Challenges
  • How can data management and analytics be improved to empower teams and advance the company's purpose?
Solution
  • The DataOps.live platform enables a key capability of the self-service data and analytics infrastructure as part of the data mesh implementation, integrating Snowflake, AWS, and other tools in a powerful new true DataOps approach.
Results
  • 120 releases a month vs. 1 per quarter
  • 6-8 weeks average MVP time
  • $40+ million in cost savings

AWS Solution focus:

For Roche PDIL, DataOps.live was used to construct orchestration pipelines for various workload use cases in Snowflake. These pipelines enable Roche to automate environment management within Snowflake, orchestrate third-party tools like Talend for data integration, and utilize data cataloguing tools such as Collibra. DataOps.live incorporates data quality checks and extracts metadata at each step to ensure observability and monitoring, all while maintaining an agile environment that allows developers to release frequently, often daily.
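The stages described above can be sketched as a minimal orchestration loop. The step names and the `run_pipeline` helper below are hypothetical illustrations, not DataOps.live's actual pipeline API (which is configured declaratively):

```python
# Minimal sketch of the orchestration flow described above.
# Step names and this runner are illustrative assumptions,
# not the DataOps.live pipeline API.

def setup_environment(log):
    log.append("environment: Snowflake objects created/updated")

def run_talend_job(log):
    log.append("ingestion: Talend job executed")

def run_quality_checks(log):
    log.append("quality: source and target checks passed")

def extract_metadata(log):
    log.append("metadata: lineage extracted for Collibra")

def run_pipeline():
    """Run each stage in order, collecting an audit log for observability."""
    log = []
    for step in (setup_environment, run_talend_job,
                 run_quality_checks, extract_metadata):
        step(log)
    return log

if __name__ == "__main__":
    for entry in run_pipeline():
        print(entry)
```

In the real platform these stages run as pipeline jobs with metadata captured at each step; the loop above only mirrors the ordering and the per-step audit trail.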

To integrate multiple data sources, AWS S3 was chosen as the central data lake. S3 effectively organizes data in hierarchical formats, which allows DataOps.live to establish lifecycle policies for data retention management. Since the customer already had DataOps.live runners deployed in their AWS account, DataOps.live also configured the appropriate AWS IAM policies and roles to grant access to the S3 buckets used for staging.
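A lifecycle policy and runner IAM policy of the kind mentioned might look like the following sketch. The bucket name, prefix, and 90-day retention window are illustrative assumptions, not details from the Roche deployment:

```python
import json

# Illustrative S3 lifecycle and IAM policy documents of the kind described.
# Bucket name, prefix, and the 90-day retention window are assumptions.

STAGING_BUCKET = "roche-dataops-staging"  # hypothetical name

# Expire staged objects after 90 days (data retention management).
lifecycle_config = {
    "Rules": [
        {
            "ID": "expire-staging-data",
            "Filter": {"Prefix": "staging/"},
            "Status": "Enabled",
            "Expiration": {"Days": 90},
        }
    ]
}

# IAM policy granting the DataOps runners access to the staging bucket.
runner_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{STAGING_BUCKET}",
                f"arn:aws:s3:::{STAGING_BUCKET}/*",
            ],
        }
    ],
}

# With boto3 these would be applied roughly as:
#   s3.put_bucket_lifecycle_configuration(
#       Bucket=STAGING_BUCKET, LifecycleConfiguration=lifecycle_config)
#   iam.put_role_policy(RoleName=..., PolicyName=...,
#                       PolicyDocument=json.dumps(runner_policy))
print(json.dumps(lifecycle_config, indent=2))
```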

The pipeline orchestrates a Talend job that retrieves newly integrated data from S3 and ingests it into the Integration schema within Snowflake. DataOps.live then runs data quality tests on both the source and transformed data as it moves between schemas (e.g., the Integration, Processed, and Reporting schemas). Finally, the metadata generated throughout the pipeline stages is extracted, compiled into a formatted document, and published to Collibra for cataloguing.


“You can’t carry on doing the same things and expect different results. We wanted to move the needle further on the dial and become a more agile data-driven business, which led to a pioneering data mesh and true DataOps approach as our way forward.”

Omar Khawaja, Global Head of BI, Roche Diagnostics

View the Roche case study today