Orchestrate.procfwk


A cross-tenant, metadata-driven processing framework for Azure Data Factory and Azure Synapse Analytics, achieved by coupling orchestration pipelines with a SQL database and a set of Azure Functions.



Execution Batches




Batches operate within the processing framework as an optional level of execution that sits above stages. This can be represented as a simple three-tier hierarchy of execution: batch, then stage, then worker pipeline.

The key feature of batches is that concurrent parent pipeline executions can be configured using only metadata updates.

For context, batches can be used within the processing framework if, for example, you wish to trigger sets of stages/worker processes that fall within, or overlap across, a given execution frequency, such as hourly, daily or monthly.
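As an illustration of the hierarchy described above, the batch-to-stage-to-worker relationships can be inspected directly in the metadata database. The query below is only a sketch: the [procfwk].[Stages] and [procfwk].[Pipelines] tables and the column names shown are assumptions made for illustration, so check them against your deployed database schema.

```sql
-- Illustrative only: table and column names are assumed and may differ in your deployment.
SELECT
    b.[BatchName],
    s.[StageName],
    p.[PipelineName]
FROM [procfwk].[Batches] AS b
INNER JOIN [procfwk].[BatchStageLink] AS l
    ON l.[BatchId] = b.[BatchId]
INNER JOIN [procfwk].[Stages] AS s
    ON s.[StageId] = l.[StageId]
INNER JOIN [procfwk].[Pipelines] AS p
    ON p.[StageId] = s.[StageId]
ORDER BY
    b.[BatchName],
    s.[StageName],
    p.[PipelineName];
```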

At deployment time it is expected that triggers will be configured separately within each orchestrator using different schedules, each hitting the framework parent pipeline, but with different ‘Batch Name’ parameter values passed according to the batch execution required.
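Whatever ‘Batch Name’ value a trigger passes should match a batch defined in the metadata. As a quick sanity check, something like the following could be used to list the candidate values; the [Enabled] flag column is an assumption, so adjust it to your schema.

```sql
-- List the batch names that scheduled triggers can pass to the parent pipeline.
-- The [Enabled] column is assumed; adjust to match your deployment.
SELECT [BatchName]
FROM   [procfwk].[Batches]
WHERE  [Enabled] = 1;
```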


Batch Pipeline Chain


A batch can further be defined with the following statements:


Using Batches

To use execution batches, the following four updates should be made to the processing framework (a T-SQL sketch of the metadata updates follows this list):

  1. Set the database property ‘UseExecutionBatches’ to 1 (true/enabled).
  2. Add batch information to the table [procfwk].[Batches].
  3. Add links between the execution batch and execution stages using the table [procfwk].[BatchStageLink].
  4. Trigger the parent pipeline providing the new batch name value as a pipeline parameter.
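The SQL below sketches what steps 1 to 3 could look like as direct metadata updates, using a hypothetical ‘Daily’ batch linked to two example stage names. The column names, properties table structure and stage names (‘Speed’, ‘Transform’) are assumptions for illustration only; where your deployment ships helper stored procedures for maintaining metadata, prefer those. Step 4 is then performed in Data Factory/Synapse by triggering the parent pipeline with the ‘Batch Name’ parameter set to the same value.

```sql
-- Step 1: enable batch executions via the framework properties metadata.
-- Assumption: a simple name/value update is sufficient; if your deployment
-- versions property records or provides a helper stored procedure, use that instead.
UPDATE [procfwk].[Properties]
SET    [PropertyValue] = '1'
WHERE  [PropertyName] = 'UseExecutionBatches';

-- Step 2: add the batch itself. 'Daily' and the column names are illustrative,
-- and this assumes [BatchId] is generated by the table (e.g. via a default).
INSERT INTO [procfwk].[Batches]
    ([BatchName], [BatchDescription], [Enabled])
VALUES
    ('Daily', 'Batch of stages to run once per day.', 1);

-- Step 3: link the new batch to the execution stages it should run.
-- The stage names 'Speed' and 'Transform' are placeholders for your own stages.
INSERT INTO [procfwk].[BatchStageLink]
    ([BatchId], [StageId])
SELECT b.[BatchId],
       s.[StageId]
FROM   [procfwk].[Batches] AS b
       CROSS JOIN [procfwk].[Stages] AS s
WHERE  b.[BatchName] = 'Daily'
  AND  s.[StageName] IN ('Speed', 'Transform');

-- Step 4 is completed in the orchestrator: trigger the parent pipeline with
-- the 'Batch Name' parameter set to 'Daily'.
```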

Please check out a demonstration of this feature on my YouTube channel:

YouTube Demo Video