Make the Most of Your Cloud Data Warehouse

Agile Data Engine is a DataOps Platform used for building, deploying, and running cloud data warehouses. It operates and orchestrates data loads from your preferred cloud storage to your target database.

DataOps tools for your Cloud Data Warehouse

The SaaS DataOps platform forms the backbone of your data warehouse so that you can focus on building business value on top of it. Agile Data Engine gives you all of this out of the box:

  • MFA-based user authentication, with optional federated AD
  • A shared data product design experience for all of your data & analytics teams
  • Continuous delivery (CI/CD) with automatic pipelines to Dev, QA & Prod environments
  • Connectivity from your own processes to the overall CI/CD and orchestration, and to your data catalogs, through service-layer APIs
  • Workflow orchestration & data quality monitoring
  • Automatic schema changes and generated load/transform SQL & workflow code


Agile Data Engine Architecture

Agile Data Engine's metadata-driven approach has three core concepts:

  • Entities - metadata objects that combine the data model and data loads into one. Information such as keys and physical types is combined with permissions and load schedules.
  • Packages - collections of entities. A Package is also the unit of commit and deployment, flowing through the CI/CD pipeline.
  • Workflows - generated automatically from logical data lineage and schedules. Entity loads are logical mappings between entities (including transformations), and the schedule defines how often data is loaded.
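To make the three concepts concrete, here is a minimal sketch in Python. This is not Agile Data Engine's actual API or metadata format; every class and field name below is a hypothetical illustration of how entities roll up into packages, and how a workflow order can fall out of logical lineage:

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    # Hypothetical sketch: an entity bundles model metadata (keys, types)
    # with load metadata (upstream sources, schedule) in one object.
    name: str
    keys: list[str]
    columns: dict[str, str]                               # column name -> physical type
    loads_from: list[str] = field(default_factory=list)   # upstream entity names
    schedule: str = "daily"

@dataclass
class Package:
    # A package is a collection of entities and the unit of commit/deployment.
    name: str
    entities: list[Entity]

def workflow_order(package: Package) -> list[str]:
    """Derive a load order from logical lineage (a simple topological sort)."""
    done: set[str] = set()
    order: list[str] = []

    def visit(entity: Entity) -> None:
        if entity.name in done:
            return
        for dep in entity.loads_from:
            upstream = next((e for e in package.entities if e.name == dep), None)
            if upstream is not None:
                visit(upstream)
        done.add(entity.name)
        order.append(entity.name)

    for entity in package.entities:
        visit(entity)
    return order

pkg = Package("sales_dwh", [
    Entity("F_ORDERS", ["order_id"], {"order_id": "INT"}, loads_from=["STG_ORDERS"]),
    Entity("STG_ORDERS", ["order_id"], {"order_id": "INT"}),
])
print(workflow_order(pkg))  # STG_ORDERS is loaded before F_ORDERS
```

The point of the sketch is the dependency inversion: developers declare what each entity loads from and how often, and the execution order is derived rather than hand-written.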

Everything is designed in the same web-based user interface, an environment shared by all developers and teams working with your data warehouse.

The data models, load definitions, and other information about the data warehouse content are stored in a central metadata repository with related documentation.

The actual physical SQL code for models and loads is generated automatically from the design metadata. Load workflows are likewise generated dynamically, using the dependencies and schedule information provided by developers.
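As a rough illustration of what "generated from design metadata" means (a hypothetical sketch, not Agile Data Engine's actual code generator or templates), a load statement can be rendered from entity metadata instead of being written by hand:

```python
def generate_load_sql(target: str, source: str, columns: list[str]) -> str:
    # Hypothetical sketch of metadata-driven SQL generation: the physical
    # INSERT ... SELECT is rendered from the target/source entities and
    # column metadata that the developer designed, not hand-written.
    col_list = ", ".join(columns)
    return (
        f"INSERT INTO {target} ({col_list})\n"
        f"SELECT {col_list}\n"
        f"FROM {source};"
    )

sql = generate_load_sql(
    target="dwh.F_ORDERS",
    source="staging.STG_ORDERS",
    columns=["order_id", "order_date", "amount"],
)
print(sql)
```

Because the SQL is derived from metadata, a design change (say, a new column) regenerates the load code consistently everywhere, which is what makes automatic schema changes practical.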

More complex custom SQL transformations are stored as Load Steps within the metadata, so they benefit from the same automation and shared design experience.


Data Products Developed as Packages


Let's talk

Whether you're just getting started with data transformation, working to move from on-prem to the cloud, or curious to hear how we can help you save millions and grow your data warehouse's lifetime value, we're happy to chat!