Flow

```yaml
type: "io.kestra.plugin.core.trigger.Flow"
```

Trigger a flow in response to a state change in one or more other flows.

You can trigger a flow as soon as another flow ends. This allows you to add implicit dependencies between multiple flows, which are often managed by different teams.
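
A minimal downstream flow only needs a Flow trigger with a condition pointing at the upstream flow. The sketch below is illustrative: the flow IDs `upstream` and `downstream` are placeholders, while the trigger outputs and condition type come from this page.

```yaml
id: downstream
namespace: company.team

tasks:
  - id: notify
    type: io.kestra.plugin.core.log.Log
    # trigger.executionId and trigger.state are outputs of the Flow trigger
    message: "Upstream execution {{ trigger.executionId }} ended in state {{ trigger.state }}"

triggers:
  - id: on_upstream_end
    type: io.kestra.plugin.core.trigger.Flow
    conditions:
      # Only react to executions of the upstream flow
      - type: io.kestra.plugin.core.condition.ExecutionFlowCondition
        namespace: company.team
        flowId: upstream
```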

Examples

Trigger the transform flow after the extract flow finishes successfully. The extract flow generates a last_ingested_date output that is passed to the transform flow as an input. Here is the extract flow:

```yaml
id: extract
namespace: company.team

tasks:
  - id: final_date
    type: io.kestra.plugin.core.debug.Return
    format: "{{ execution.startDate | dateAdd(-2, 'DAYS') | date('yyyy-MM-dd') }}"

outputs:
  - id: last_ingested_date
    type: STRING
    value: "{{ outputs.final_date.value }}"

Below is the transform flow triggered in response to the extract flow's successful completion.

```yaml
id: transform
namespace: company.team

inputs:
  - id: last_ingested_date
    type: STRING
    defaults: "2025-01-01"

variables:
  result: |
    Ingestion done in {{ trigger.executionId }}.
    Now transforming data up to {{ inputs.last_ingested_date }}

tasks:
  - id: run_transform
    type: io.kestra.plugin.core.debug.Return
    format: "{{ render(vars.result) }}"

  - id: log
    type: io.kestra.plugin.core.log.Log
    message: "{{ render(vars.result) }}"

triggers:
  - id: run_after_extract
    type: io.kestra.plugin.core.trigger.Flow
    inputs:
      last_ingested_date: "{{ trigger.outputs.last_ingested_date }}"
    conditions:
      - type: io.kestra.plugin.core.condition.ExecutionFlowCondition
        namespace: company.team
        flowId: extract
      - type: io.kestra.plugin.core.condition.ExecutionStatusCondition
        in:
          - SUCCESS
```

Properties

conditions

  • Type: array
  • SubType: Condition
  • Dynamic:
  • Required:

List of conditions used to limit which upstream executions fire the flow trigger.

inputs

  • Type: object
  • Dynamic:
  • Required:

Pass the upstream flow's outputs to the inputs of the current flow.

The inputs allow you to pass a data object or a file to the downstream flow, as long as those outputs are defined at the flow level in the upstream flow.
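
As a sketch, assuming the upstream extract flow also declared a flow-level output named dataset of type FILE (a hypothetical output, not part of the example above), the downstream flow could map it to a FILE input:

```yaml
inputs:
  - id: dataset
    type: FILE

triggers:
  - id: on_extract_success
    type: io.kestra.plugin.core.trigger.Flow
    inputs:
      # dataset is a hypothetical flow-level output of the upstream extract flow
      dataset: "{{ trigger.outputs.dataset }}"
    conditions:
      - type: io.kestra.plugin.core.condition.ExecutionFlowCondition
        namespace: company.team
        flowId: extract
      - type: io.kestra.plugin.core.condition.ExecutionStatusCondition
        in:
          - SUCCESS
```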

states

  • Type: array
  • SubType: string
  • Dynamic:
  • Required:
  • Default: [SUCCESS, WARNING, FAILED, KILLED, CANCELLED, RETRIED, SKIPPED]

List of execution states that will be evaluated by the trigger.

By default, only executions in a terminal state will be evaluated. Any ExecutionStatusCondition-type condition will be evaluated after the list of states.
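
For example, to evaluate the trigger only for failed or warning executions of the extract flow, you could narrow the states list. This is a sketch based on the example above:

```yaml
triggers:
  - id: alert_on_extract_failure
    type: io.kestra.plugin.core.trigger.Flow
    # Only evaluate upstream executions that ended in FAILED or WARNING
    states:
      - FAILED
      - WARNING
    conditions:
      - type: io.kestra.plugin.core.condition.ExecutionFlowCondition
        namespace: company.team
        flowId: extract
```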

stopAfter

  • Type: array
  • SubType: string
  • Dynamic:
  • Required:

List of execution states after which a trigger should be stopped (a.k.a. disabled).
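
A sketch of the syntax, reusing the trigger from the example above and stopping it after a FAILED state:

```yaml
triggers:
  - id: run_after_extract
    type: io.kestra.plugin.core.trigger.Flow
    # Disable this trigger after an execution ends in FAILED
    stopAfter:
      - FAILED
    conditions:
      - type: io.kestra.plugin.core.condition.ExecutionFlowCondition
        namespace: company.team
        flowId: extract
```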

Outputs

executionId

  • Type: string
  • Required: ✔️

The execution ID that triggered the current flow.

flowId

  • Type: string
  • Required: ✔️

The flow ID whose execution triggered the current flow.

flowRevision

  • Type: integer
  • Required: ✔️

The flow revision that triggered the current flow.

namespace

  • Type: string
  • Required: ✔️

The namespace of the flow that triggered the current flow.

state

  • Type: string
  • Required: ✔️
  • Possible Values:
    • CREATED
    • RUNNING
    • PAUSED
    • RESTARTED
    • KILLING
    • SUCCESS
    • WARNING
    • FAILED
    • KILLED
    • CANCELLED
    • QUEUED
    • RETRYING
    • RETRIED
    • SKIPPED

The execution state.

