  • Day 1: production-ready CI/CD pipeline and workflow orchestration
  • 250 deployments to production in a month
  • 3 min refresh interval for business real-time data flows

We provide DataOps automation for your cloud data warehouse

Agile Data Engine is a DataOps automation cloud for building, deploying, and running cloud data warehouses. It operates and orchestrates data loads from your preferred cloud storage to your target database.

Bring your data into a landing zone, and Agile Data Engine will take care of the data modeling and processing. Thanks to an innovative metadata-driven approach, your data never leaves your cloud environment. Agile Data Engine automatically creates and runs the SQL needed to transform and publish your data based on the metadata you define.
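As a rough sketch of the idea (not Agile Data Engine's actual generated code; every schema, table, and column name below is hypothetical), the load SQL produced from such metadata might resemble:

  -- Illustrative only: incremental publish load from a staging table into a fact table,
  -- filtered on the previous high-water mark so only new rows are processed.
  INSERT INTO publish.f_sales (sale_id, customer_key, sale_date, amount, load_time)
  SELECT
      s.sale_id,
      c.customer_key,
      s.sale_date,
      s.amount,
      CURRENT_TIMESTAMP
  FROM staging.sales s
  JOIN publish.d_customer c
      ON c.customer_id = s.customer_id
  WHERE s.load_time > (SELECT COALESCE(MAX(load_time), TIMESTAMP '1900-01-01 00:00:00')
                       FROM publish.f_sales);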

[Platform logos: Snowflake, Amazon Web Services, Microsoft Azure]

[Video: Agile Data Engine in six minutes]

Agile Data Engine Architecture

[Diagram: Agile Data Engine DataOps cloud architecture]

Our best-of-breed DataOps Cloud (SaaS) provides the backbone for your data warehouse so that you can focus on building business value on top of it. You get all this out-of-the-box with Agile Data Engine:

  1. MFA-based User Authentication with optional Federated AD
  2. Shared design experience in the same Designer for all of your data & analytics engineering teams
  3. Continuous Delivery (CI/CD) with automatic pipelines to Dev/QA/Prod environments
  4. Customer process connectivity to your overall CI/CD and orchestration through service layer APIs
  5. Workflow orchestration & data quality monitoring
  6. Automatic schema changes and load/transform SQL & workflow code generation

We combine data modeling and (E)LT processing and slice it into manageable packages

Agile Data Engine's metadata-driven approach has three core concepts:

  • Entities - metadata objects that combine the data model and data loads into one. Information such as keys and physical types is combined with permissions and load schedules. 
  • Packages - collections of entities. A Package is also the unit of commit and deployment, flowing through the CI/CD pipeline.
  • Workflows - generated automatically based on logical data lineage and schedules. Entity loads are logical mappings between entities (and transformations), and the scheduling defines how often data is loaded.

Everything is designed in the same web-based user interface, an environment shared by all developers and teams working with your data warehouse.

The data models, load definitions, and other information about the data warehouse content are stored in a central metadata repository with related documentation.

The actual physical SQL code for models and loads is generated automatically from the design metadata. Load workflows are likewise generated dynamically from the dependencies and schedule information provided by the developers.

More complex custom SQL transformations are defined as Load Steps within the metadata, so they too benefit from the overall automation and the shared design experience.
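For instance (illustrative only; the table and column names are hypothetical, and the exact Load Step format is not shown here), a Load Step could carry a hand-written transformation such as keeping only the latest row per business key:

  -- Hypothetical custom transformation kept as a Load Step:
  -- deduplicate the staged rows so only the most recent version of each customer remains.
  INSERT INTO dw.customer_latest (customer_id, customer_name, updated_at)
  SELECT customer_id, customer_name, updated_at
  FROM (
      SELECT
          customer_id,
          customer_name,
          updated_at,
          ROW_NUMBER() OVER (PARTITION BY customer_id ORDER BY updated_at DESC) AS rn
      FROM staging.customer
  ) ranked
  WHERE rn = 1;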


You can work with your favourite database

Agile Data Engine supports multiple DBMS services. With the metadata-driven approach, developers can concentrate more on the business logic: Agile Data Engine automatically generates the SQL for schema creation and changes and for the 1:1 load mappings.
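For example (a hedged sketch, not the tool's actual output; schema, table, and column names are hypothetical), the generated DDL and a 1:1 load mapping could look like this:

  -- Illustrative only: table creation derived from an entity definition...
  CREATE TABLE staging.product (
      product_id    INTEGER,
      product_name  VARCHAR(200),
      list_price    DECIMAL(12,2),
      updated_at    TIMESTAMP
  );

  -- ...and the corresponding 1:1 load from the landing zone into that table.
  INSERT INTO staging.product (product_id, product_name, list_price, updated_at)
  SELECT product_id, product_name, list_price, updated_at
  FROM landing.product_src;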

However, developers still have the flexibility to build custom solutions with powerful parameterization and custom SQL. The choice is yours.

Snowflake | Amazon Redshift | Azure Synapse | Azure SQL DB | Google BigQuery