How to Build a Successful Cloud DataOps Program
The major cloud platform vendors offer many valuable tools that support analytics and data science. If you are looking to enable DataOps in your organization, you may be tempted to rely on these tools. However, cloud DevOps and workflow tools on their own will not fully enable DataOps; relying on them alone leaves significant gaps in your DataOps program.
In this webinar, we’ll discuss the core capabilities required for a successful DataOps program and how a DataOps platform complements the tools provided by cloud vendors, ensuring that you get the most out of your cloud investment. We’ll also specifically cover how to achieve the following DataOps functions in the cloud:
Meta-orchestration of both production and development pipelines;
Testing and monitoring across the entire analytic system;
Automated provisioning and control of self-service data environments for analytic development work;
Simple, automated deployment across complex systems;
The ability to share reusable data components for enhanced collaboration; and
Process analytics to measure process improvements.
About the Speaker
Chris Bergh is CEO and Head Chef at DataKitchen, a DataOps software and services startup. He has more than 30 years of experience in research, software engineering, data analytics, and executive management. At various points in his career, he has served as COO, CTO, VP, and Director of Engineering. Chris is a recognized expert on DataOps. He is a co-author of The DataOps Manifesto, The DataOps Cookbook, and Recipes for DataOps Success, and a frequent speaker on DataOps at industry conferences. You can follow Chris on Twitter @ChrisBergh.