
workflow


rpauli commented Jan 24, 2022

Description

I am using Airflow to move data periodically to our data lake and noticed that the MySQLToS3Operator has templated fields while the DynamoDBToS3Operator doesn't. I found a somewhat awkward workaround, but templated fields would be nicer.
I suppose an implementation could be as simple as adding

template_fields = (
    's3_bucket',
    's3_key',
)

to the operator.
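
A minimal sketch of a subclass-based stopgap along those lines, assuming the import path from the current Amazon provider package and that the attribute names above match the operator's constructor arguments (they may differ between provider versions):

```python
from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import DynamoDBToS3Operator


class TemplatedDynamoDBToS3Operator(DynamoDBToS3Operator):
    # Fields listed here are rendered with Jinja (e.g. "raw/{{ ds }}/dump.json")
    # before execute() runs; names must match attributes set in __init__.
    template_fields = (
        "s3_bucket",
        "s3_key",
    )
```

A DAG could then pass templated values to those arguments exactly as it already does for MySQLToS3Operator.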

uday-mandava commented Jan 23, 2022

Both successful workflows and stopped workflows generate the same kind of event, with 'reason' set to 'WorkflowSucceeded'. There is no way to differentiate successful from stopped events in order to trigger different downstream actions based on the status.

What change needs making?
We would like to see the reason reported as 'WorkflowStopped' for stopped workflows.

Use Cases

We are using [Argo events](https://argoproj
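
Purely as an illustration of the use case (not Argo's own API), a hedged sketch of a downstream trigger that branches on the event reason using the Kubernetes Python client; the namespace and the 'WorkflowStopped' reason are assumptions, and the second branch only becomes possible once stopped workflows stop reporting 'WorkflowSucceeded':

```python
from kubernetes import client, config, watch

config.load_kube_config()
core = client.CoreV1Api()

# Watch workflow events in the "argo" namespace (namespace is an assumption)
# and branch on the reason field; today stopped workflows also report
# "WorkflowSucceeded", so the second branch can never fire.
for item in watch.Watch().stream(core.list_namespaced_event, namespace="argo"):
    reason = item["object"].reason
    if reason == "WorkflowSucceeded":
        print("trigger the success pipeline")
    elif reason == "WorkflowStopped":  # hypothetical reason requested above
        print("trigger stopped-workflow handling")
```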

kvnkho commented Dec 15, 2021

Current behavior

You get an error if you try to upload a file whose name already exists as a blob:

azure.core.exceptions.ResourceExistsError: The specified blob already exists.
RequestId:5bef0cf1-b01e-002e-6

Proposed behavior

The task should take in an overwrite argument and pass it to [this line](https://github.com/PrefectHQ/prefect/blob/6cd24b023411980842fa77e6c0ca2ced47eeb83e/src/prefect/
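
For reference, the upstream azure-storage-blob SDK already exposes an overwrite flag on upload_blob, so the task would mostly be passing it through. A minimal sketch of the underlying call (connection string, container, and blob names are placeholders):

```python
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<connection-string>")
blob = service.get_blob_client(container="my-container", blob="data.csv")

with open("data.csv", "rb") as f:
    # overwrite=True replaces an existing blob instead of raising
    # ResourceExistsError (the SDK default is overwrite=False).
    blob.upload_blob(f, overwrite=True)
```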
