Kent Graziano - The Data Warrior + Patrick Connolly, DataOps.live Evangelist | Oct 3, 2022 | 6 min read

The Hype: Data Mesh will Become Obsolete — Really?

In the summer of 2022, Gartner® published its Hype Cycle for Data Management, and this was not without some controversy.
Check out this introduction if you’re unfamiliar with the Gartner Hype Cycle model.

In particular, it asserted that Data Mesh would be obsolete before it reaches what Gartner calls the Plateau of Productivity. It is hard to fathom how they came to that conclusion when there is such a vibrant Data Mesh Community complete with a very active Slack channel, and when DataOps.live customers like Roche and OneWeb have successfully built a Data Mesh using DataOps.live and Snowflake. But we’ll come back to that in a moment.

Data Mesh is, in essence, a conceptual approach and architectural framework, not a specific off-the-shelf tool or technology. One of its central aims is to improve productivity and time to value by eliminating the perceived bottlenecks experienced by organizations building out large-scale enterprise data warehouses and data lakes. Zhamak Dehghani, the creator of the Data Mesh concept, has referred to it as a decentralized sociotechnical approach, meaning it involves people, processes, and technology. So, it seems likely that no two implementations will be the same by their nature. How one organization defines productivity, based on its specific needs and the outputs of its Data Mesh implementation, will differ from the next. Yes, Data Mesh is (kind of) new, and yes, it is maturing, but that does not mean we should ignore or discount it this early in the game.
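To make that a little more concrete, here is a minimal, purely illustrative Python sketch of the core idea: each domain owns and publishes its own data products, while a thin federated layer applies shared governance rules across all of them. Data Mesh prescribes no particular code or API, so every name below (DataProduct, GovernancePolicy, and so on) is hypothetical and not taken from DataOps.live, Snowflake, or any specific implementation.

```python
from dataclasses import dataclass, field

# Hypothetical illustration only: Data Mesh is an organizational and
# architectural pattern, not a library, so all of these names are invented.

@dataclass
class DataProduct:
    """A data product owned end to end by a single domain team."""
    domain: str                  # owning domain, e.g. "supply_chain"
    name: str                    # product name, e.g. "inventory_daily"
    owner: str                   # accountable team or person
    schema: dict                 # published output schema: the product's contract
    pii_columns: set = field(default_factory=set)

@dataclass
class GovernancePolicy:
    """Federated governance: global rules applied to every domain's products."""
    require_owner: bool = True
    forbid_unmasked_pii: bool = True

    def violations(self, product: DataProduct) -> list[str]:
        issues = []
        if self.require_owner and not product.owner:
            issues.append(f"{product.domain}.{product.name}: no accountable owner")
        if self.forbid_unmasked_pii and product.pii_columns:
            issues.append(
                f"{product.domain}.{product.name}: unmasked PII columns "
                f"{sorted(product.pii_columns)}"
            )
        return issues

# Each domain publishes its own products (decentralized ownership) ...
products = [
    DataProduct("supply_chain", "inventory_daily", "sc-analytics",
                {"sku": "string", "on_hand": "int"}),
    DataProduct("sales", "orders_by_region", "",
                {"region": "string", "revenue": "float"},
                pii_columns={"customer_email"}),
]

# ... while one shared policy is enforced across all of them (federated governance).
policy = GovernancePolicy()
for product in products:
    for issue in policy.violations(product):
        print("GOVERNANCE:", issue)
```

Where such checks actually run (in CI/CD pipelines, a data catalog, or a platform) is exactly the kind of sociotechnical decision each organization makes for itself.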

While Data Mesh appears as a new entrant in the Innovation Trigger section of the Hype Cycle, it's already flagged as obsolete. (It should be noted that the Hype Cycle does use rather flowery language, from the Peak of Inflated Expectations through the Trough of Disillusionment to the Slope of Enlightenment.)

Of course, this is Gartner's view, but we believe it is not indicative of true industry-wide sentiment and might just be a premature categorization. To be clear, Gartner is not saying Data Mesh is obsolete now, but rather that the concept will be obsolete, and perhaps replaced by a newer concept, before it reaches critical mass and is widely adopted. In one conversation, the discussion pointed to a broad time window: beyond a decade! Data Fabric is the concept they expect it to be subsumed into.

That may or may not prove to be true. Time will tell. But that does not mean it has no current value or should not be adopted today, especially since there are already some very large-scale successes.

Perhaps not surprisingly, especially given the number of Data Mesh projects out there focused on self-service, domain-led data hubs with federated governance, the assertion sparked a lively discussion on LinkedIn. Gartner maintains that 'the Hype Cycle is a tool we apply to assess where technologies fall on the cycle as they mature. Speed varies, and some technologies become "obsolete before reaching the plateau," but it remains usefully descriptive across IT categories.'

Many comments centered on the relationship between Data Fabric and Data Mesh: that the latter is not possible without the former, and that Data Fabric adds value through end-to-end integration of data pipelines and cloud environments using intelligent, automated systems. In other words, Data Fabric is, to a degree, an architectural enabler of Data Mesh in a world of big data, the drive for unified data and data management ecosystems, and the need for governance, all in pursuit of the data-driven enterprise. Data Fabric and Data Mesh are complementary, not mutually exclusive. And while Data Fabric may still be in its relative infancy (and, according to Gartner, already in the Trough of Disillusionment), we see this approach in our professional practice; indeed, the holistic views it enables are part and parcel of #TrueDataOps and its practical expression in the DataOps.live platform.

We do need to differentiate between hype and reality. And there are real dangers here regarding the language people use and how technologies may be presented or misrepresented. As Zhamak Dehghani wrote in a recent LinkedIn thread, ‘Unfortunately the misrepresentation/misunderstanding of Data Mesh (not data mesh networks) by Gartner is further damaging the trust of engineers and SMEs in the brand and surfaces the discord of analysts’ point of view with the realities on the ground. I do really hope this can be addressed at some point in the future.’ Use cases, reporting those ‘realities on the ground’, are key.

This brings us back to the real-world experiences of Roche Diagnostics with DataOps.live. 

Part of the world's biggest biotech company, Roche Diagnostics wanted a more data-driven business, with product teams able to react and adapt more quickly, delivering higher-quality data products faster while maintaining governance and security. 'You can't carry on doing the same things and expect different results,' said Omar Khawaja, Global Head of Business Intelligence at Roche Diagnostics. 'We wanted to move the needle further on the dial. I chose to pursue a forward-looking data stack and, on the architecture side, a novel Data Mesh approach.' This domain-oriented self-service concept would, he said, 'address the critical issues around our people and our unique decentralized culture, to empower and give ownership to teams, creating and using our tech stack in a very different way while still having the federated governance on top that we need.'

Adoption was fast. Really fast. Crucially, the average time to deliver a new data product (one of the critical KPIs in Data Mesh) has now fallen from six months to only 6-8 weeks. The number of monthly releases has increased to over 120 compared to just one release every three months prior to Data Mesh. The new platform and Data Mesh framework (built on Snowflake and DataOps.live) have enabled the integration of more than 15 additional capabilities and partners into the data platform. And notably, they have enabled 40 separate data domain teams to contribute to the overall Data Mesh with 50 individual data products. And finally, over $40m in savings has been delivered so far in inventory reduction, cost avoidance, and resource optimization. 

And yet, what is the Gartner Hype Cycle benefit rating for Data Mesh? 'Low.' On the same note, Gartner describes DataOps as one of 'the most hyped/less mature' technologies in data management today. We've seen tremendous benefits with our customers, including the recent citation from Roche's domain teams highlighting $40m in savings from the project.

Metrics such as those above suggest that a number of our customers have indeed reached their own Plateau of Productivity, and the momentum continues to build. Roche is not the only one: innovative satellite communications provider OneWeb is also deploying Data Mesh to better manage and optimize its operations in an incredibly complex area.

So, from where we sit, Gartner may be a bit premature in its pronouncement about Data Mesh. Yes, it is new, not fully defined in some aspects, evolving, and truly still in the innovation stage. It seems way too early to say it will be obsolete before mass adoption. 

Success breeds success. If their competitors are paying attention, we expect to see faster and broader adoption of Data Mesh than Gartner anticipates.

If you’re interested in learning more, you can check out the detailed Roche case study or contact one of our team members. We also have a range of Solution Briefs, including one on Data Mesh and an introduction to DataOps.live.
