Jevgenijs Jelistratovs, Director of Governance Products & Partner Success at DataOps.live | Mar 12, 2024 | 5 min read

Build Fully Governed Data Products in 10 Minutes!

Working with clean and accessible data is a treat. The insights you discover, and the occasionally unexpected conclusions that counter your initial assumptions, are the fun stuff of data work!

But getting to that fun stage—that is hard work. 

Building production-grade data pipelines that ensure you and your end users are interacting with high-quality data can be a never-ending process, especially in larger organizations. It is an enormous amount of work!

The DataOps.live platform eliminates much of this technical burden and lets you focus on rapid, value-based data delivery. Our customers have used it to scale their data product factories to an almost unbelievable velocity, deploying thousands of data products per quarter.

 

DataOps.live empowers data engineers to deliver with speed, trust, and agility!

Today I want to share exciting news with you. We’re bringing some of the maker’s fun that data engineers have to data analysts and data product owners. Imagine if data product owners or data analysts could configure their very first data product through an easy chat with an LLM. Now with DataOps.live, they can do just that!


Announcing DataOps.live Create with AI copilot Assist.

About DataOps.live Create  

Create is a new module in the DataOps.live platform that delivers next-level automation focused on simplicity. While the core platform provides reliability and scalability, Create offers customers simplicity and speed, with a seamless move to a scalable mode of operation when needed.

If you are a new customer and have just started with Snowflake, you can benefit from DataOps.live immediately. To build your first “analytical data product,” start from your DDL (Data Definition Language) or the raw data layer of an existing database and end up with a production-ready pipeline. Alternatively, convert your dbt Core workloads with the dbt Quickstart.
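To make “start from your DDL” concrete, here is a hypothetical example of the kind of raw-layer table definition you might begin with (the table and column names are invented for illustration):

```sql
-- Hypothetical raw-layer DDL: a starting point like this is the
-- raw material from which an analytical data product can be built.
CREATE TABLE RAW.SALES.ORDERS (
    ORDER_ID      NUMBER        NOT NULL,
    CUSTOMER_ID   NUMBER        NOT NULL,
    ORDER_DATE    DATE,
    TOTAL_AMOUNT  NUMBER(12,2),
    STATUS        VARCHAR(20)
);
```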

[Figure: DataOps.live Create dashboard screen]

About the use cases 

Analytical Data Product  

Who’s it for? Data product owners and data analysts 

Connect your Snowflake account or bring your own SQL to create your first end-to-end data product in 10 minutes. The application guides you through every step: defining your data product, configuring your Snowflake account, confirming the auto-generated tests, and working with your personal AI Assistant to transform your data and publish an end-to-end data product.
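As an illustration of what auto-generated data quality tests might look like, here is a hypothetical sketch in dbt-style schema YAML (the tests Create actually generates may differ in form and content; the model and values below are invented):

```yaml
# Hypothetical auto-generated tests, sketched in dbt-style schema YAML.
version: 2
models:
  - name: orders
    columns:
      - name: order_id
        tests:
          - unique        # every order appears exactly once
          - not_null
      - name: status
        tests:
          - accepted_values:
              values: ['PENDING', 'SHIPPED', 'CANCELLED']
```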


dbt Quickstart
 

Who’s it for? Data engineers and DataOps developers 

Take your dbt Core project with you. The application will guide you through dbt Core project migration and get you started with modern DataOps practices, automatically converting your dbt project to an end-to-end data pipeline. Inside DataOps.live Develop, your personal browser-based development environment, you can work and experiment completely safely with your personal AI Assistant.
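For context, a dbt Core project is essentially a collection of SQL models; a minimal, hypothetical model like the one below is the kind of asset the dbt Quickstart wraps into an end-to-end pipeline (model and column names are invented):

```sql
-- models/customer_orders.sql: a minimal, hypothetical dbt Core model.
-- The {{ ref(...) }} macro resolves to the upstream staging model, so
-- dbt can build the dependency graph for the pipeline.
SELECT
    customer_id,
    COUNT(order_id)   AS order_count,
    SUM(total_amount) AS lifetime_value
FROM {{ ref('stg_orders') }}
GROUP BY customer_id
```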

Automation and ease of use are the prime directives for everything in DataOps.live Create. Create lets domain experts start from the business case and service-level agreement of their data product, captured in the definition stage.

[Figure: DataOps.live Create data product workflow]

 

Wait for it… in the design stage, we let the data analyst or product owner work data transformation magic using DataOps.live Assist.

Curious about the Data Science or Streamlit Data Product use cases? Hear about them at the launch webinar. Register now. 


About DataOps.live Assist 
 

Of course, simple and fast is not possible without the use of AI, and DataOps.live Create is no exception. Our new AI copilot, simply called DataOps.live Assist, launches today with DataOps.live Create.

Create consists of a two-stage workflow: defining your data product and designing your dataset. The definition describes the business case, picks the data, sets data trust expectations, and describes what your dataset looks like for your consumers. In the design stage, you simplify the data modelling process using generative AI. It’s a great experience for new and advanced developers alike, and it empowers data analysts wanting to get their hands on new technology.

The DataOps development environment is a lightweight, guided experience for analysts and developers working through the data product checklist. Highly qualified engineers don’t have to spend time configuring their IDEs and library dependencies; instead, they can get straight to the fun part. Nor will you scare away less technical personnel with the frustration of configuring work tools; it all just happens in seconds.

User-experience-wise, it’s the perfect moment to introduce DataOps.live Assist. What the product owner wants to achieve, the scope of the data, and the quality criteria are all fed as context to the AI, giving the chat-based interface the extra juice an analyst or developer wants when building the dataset.

With Assist, data analysts and data product owners can interact with the LLM to build all the data transformations they need. Assist lets you fix syntax mistakes; ask it to help you brainstorm possible data insights, explain what is going on in the project, or visualize a data model for you. It’s your personal buddy inside DataOps.live!

[Figure: AI-generated data model diagram]

 

Once you are satisfied with the result, you can quickly proceed to build the data product, which delivers all necessary changes to a Snowflake feature branch database that we have created for you. This is a safe environment where you can experiment and “break things,” but also share with relevant stakeholders. They can plug their BI tools or data science notebooks into it and give you feedback before anything goes to production.
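For readers curious how such a safe sandbox can work on Snowflake: conceptually it resembles a zero-copy clone, which shares storage with the source until data diverges. This is a hypothetical sketch with invented database names; the platform manages the actual mechanism for you.

```sql
-- A zero-copy clone provides an isolated sandbox of production
-- (names invented for illustration).
CREATE DATABASE ANALYTICS_FEATURE_MY_BRANCH
  CLONE ANALYTICS_PROD;

-- Stakeholders can point BI tools at the clone and experiment freely;
-- dropping it discards the sandbox without touching production.
DROP DATABASE ANALYTICS_FEATURE_MY_BRANCH;
```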

Once everyone is happy, you can publish your data product. Publishing creates a Merge Request. Sounds scary, I know, but what it means is that, as a business user, you’ll be redirected to a UI where we’ll generate a summary of the changes and activities you performed. Together with all relevant assets, you pass it over to the data engineering team to streamline the data pipeline into your production environment.

It's been a lot of reading. It’s likely you just spent more time reading this article than it would take you to build your own data product with DataOps.live Create! 

Don’t believe me? Check it out now!
