Working with clean and accessible data is a treat. The insights you discover, and the occasionally unexpected conclusions that counter your initial assumptions, are the fun stuff of data work!
But getting to that fun stage—that is hard work.
Building production-grade data pipelines that ensure you and your end users interact with high-quality data can be a never-ending process, especially in larger organizations. It is an enormous amount of work!
The DataOps.live platform eliminates much of this technical burden and allows you to focus on rapid, value-based data delivery. Our customers have used it to scale their data product factories to remarkable velocity, deploying thousands of data products per quarter.
Today I want to share exciting news with you. We’re bringing some of the maker’s fun that data engineers enjoy to data analysts and data product owners. Imagine if data product owners or data analysts could configure their very first data product through a simple chat with an LLM. Now with DataOps.live, they can do just that!
Create is a new module in the DataOps.live platform that takes automation to the next level with a focus on simplicity. While the core platform delivers reliability and scalability, Create offers customers simplicity and speed, seamlessly moving them to a scalable mode of operation when needed.
If you are a new customer who has just started with Snowflake, you can immediately benefit from DataOps.live. To build your first “analytical data product,” start from your DDL (Data Definition Language) or the raw data layer of an existing database and end up with a production-ready pipeline. Alternatively, convert your dbt Core workloads with the dbt Quickstart.
Snowflake Quickstart
Who’s it for? Data product owners and data analysts
Connect your Snowflake account or bring your own SQL to create your first end-to-end data product in 10 minutes. The application guides you through every step: defining your data product, configuring your Snowflake account, confirming the auto-generated tests, and working with your personal AI Assistant to transform your data and publish an end-to-end data product.
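To make “bring your own SQL” concrete, the starting point can be as small as a single raw-layer table definition. This is an illustrative sketch only; the database, schema, table, and column names are hypothetical:

```sql
-- Hypothetical raw-layer DDL: the kind of starting point Create works from.
-- From a definition like this, the platform can derive a dataset,
-- auto-generated tests, and a production-ready pipeline.
CREATE TABLE RAW.SALES.ORDERS (
    ORDER_ID      NUMBER        NOT NULL,
    CUSTOMER_ID   NUMBER        NOT NULL,
    ORDER_DATE    DATE,
    TOTAL_AMOUNT  NUMBER(12,2),
    STATUS        VARCHAR(20)
);
```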
dbt Quickstart
Who’s it for? Data engineers and DataOps developers
Take your dbt Core project with you. The application guides you through dbt Core project migration and gets you started with modern DataOps practices, automatically converting your dbt project into an end-to-end data pipeline. Inside DataOps.live Develop, your personal browser-based development environment, you can work and experiment safely with your personal AI Assistant at your side.
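For readers less familiar with dbt Core, this is the kind of model the Quickstart converts: a SQL file using dbt’s `source()` macro to reference raw data. The model and source names here are hypothetical, for illustration only:

```sql
-- models/staging/stg_orders.sql (hypothetical dbt Core model)
-- A typical staging model: select and lightly clean raw data.
-- The dbt Quickstart folds models like this into an end-to-end pipeline.
select
    order_id,
    customer_id,
    order_date,
    total_amount
from {{ source('raw_sales', 'orders') }}
where order_date is not null
```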
Automation and ease of use are the prime directive for everything in DataOps.live Create. Create allows domain experts to start from the business case and service level agreement of their data product, capturing both in the definition stage.
Wait for it… in the design stage, the data analyst or product owner gets to do data transformation magic using DataOps.live Assist.
Of course, simple and fast would not be possible without AI, and DataOps.live Create is no exception. Our new AI copilot, simply called DataOps.live Assist, launches today with DataOps.live Create.
Create consists of a two-stage workflow: defining your data product and designing your dataset. The definition describes the business case, picks the data, sets data trust expectations, and describes what your dataset looks like for your consumers. In the design stage, you simplify the data modelling process using generative AI. It’s a great experience for new and advanced developers alike, and it empowers data analysts wanting to get their hands on new technology.
The DataOps development environment is a lightweight, guided experience for analysts and developers working through the data product checklist. Highly qualified engineers don’t have to spend time configuring their IDEs and library dependencies; they can get straight to the fun. And less technical personnel aren’t scared away by the frustration of configuring work tools; setup just happens, in seconds.
User-experience-wise, it’s the perfect moment to introduce DataOps.live Assist. What the product owner wants to achieve, the scope of the data, and the quality criteria are all fed to the AI as context, giving the chat-based interface the extra juice analysts and developers want when building their dataset!
With Assist, data analysts and data product owners can interact with the LLM to build all the data transformations they need. Assist helps you fix syntax mistakes; ask it to brainstorm possible data insights, explain what is going on in the project, or visualize a data model for you. It’s your personal buddy inside DataOps.live!
Once you are satisfied with the result, you can quickly build the data product, which delivers all necessary changes to a Snowflake feature branch database created for you. This is a safe environment where you can experiment and “break things,” but also share with relevant stakeholders. They can plug their BI tools or data science notebooks into it and give you feedback before anything goes to production.
Once everyone is happy, you can publish your data product. Publish creates a Merge Request. Sounds scary, I know, but what it means is that, as a business user, you’ll be redirected to a UI where we generate a summary of the changes and activities you made. Together with all relevant assets, you hand it over to the data engineering team to streamline the data pipeline into your production environment.
That was a lot of reading. In fact, you likely just spent more time reading this article than it would take you to build your own data product with DataOps.live Create!
Don’t believe me? Check it out now!