Dataset Consumer Triggering Producers #28229
Replies: 1 comment 5 replies
-
I am not sure I understand your use case. I have a feeling that you want to mix the producer/consumer approach with the "traditional" behaviour of Airflow from before datasets, where you would trigger a number of tasks, they'd produce the outputs, and then their success would trigger the final DAG. That is just a "classic" DAG in Airflow, and datasets should not be involved. But I might be completely misunderstanding you; your description was a bit incomplete, or maybe I am missing some assumptions you have in your head. Maybe try to visualise your approach and explain it better (though I would encourage you to first check whether the "classic" approach of Airflow fits your needs better).
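A minimal sketch of the "classic" approach suggested above: the aggregation DAG triggers each producer DAG itself and waits for it to finish, so no Dataset objects are involved. The DAG ids (`file_1_dag`, `file_2_dag`) and the aggregation logic are hypothetical placeholders, not anything from the original post.

```python
import pendulum
from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.operators.trigger_dagrun import TriggerDagRunOperator

with DAG(
    dag_id="aggregation_dag",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    schedule=None,  # started manually; it pulls in its own dependencies
    catchup=False,
):
    # Trigger each producer DAG and block until it completes.
    trigger_file_1 = TriggerDagRunOperator(
        task_id="trigger_file_1_dag",
        trigger_dag_id="file_1_dag",  # hypothetical producer DAG id
        wait_for_completion=True,
        poke_interval=30,
    )
    trigger_file_2 = TriggerDagRunOperator(
        task_id="trigger_file_2_dag",
        trigger_dag_id="file_2_dag",  # hypothetical producer DAG id
        wait_for_completion=True,
        poke_interval=30,
    )

    def aggregate():
        ...  # hypothetical: combine the files the producer DAGs wrote

    report = PythonOperator(task_id="build_report", python_callable=aggregate)

    # Both producers must finish before the report runs.
    [trigger_file_1, trigger_file_2] >> report
```

With `wait_for_completion=True` the trigger tasks occupy worker slots while they poll, so on a busy deployment a deferrable trigger or `ExternalTaskSensor` may be a better fit.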
-
Good Afternoon Everybody,
Is it possible to have a consumer of a dataset block and trigger the producers of that dataset?
Aggregation Dag:
File 1 Dag produces File 1
File 2 Dag produces File 2
Starting the aggregation DAG would trigger the File 1 DAG when it reached node 1, then the File 2 DAG when it reached node 2. I guess this would be similar to lazy loading classes.
We have tons of files created by mathematical models, and our workflow often aggregates all the different models into a final report. We could generate each model on a clock and have the aggregation run when all of the producers are done, but it would be great if I could start the final report and have it resolve and trigger its dependencies.
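The pull-based flow described above can be sketched in plain Python (this is not an Airflow API, just the control flow: the consumer triggers each producer on demand, like lazy loading; all names here are hypothetical):

```python
from typing import Callable

# Registry of producers keyed by the "dataset" name they create,
# plus a cache so each producer runs at most once.
registry: dict[str, Callable[[], str]] = {}
cache: dict[str, str] = {}

def producer(name: str):
    """Register a function as the producer of a named output."""
    def wrap(fn: Callable[[], str]) -> Callable[[], str]:
        registry[name] = fn
        return fn
    return wrap

def resolve(name: str) -> str:
    """Trigger the producer only when a consumer asks for its output."""
    if name not in cache:
        cache[name] = registry[name]()
    return cache[name]

@producer("file_1")
def make_file_1() -> str:
    return "model output 1"

@producer("file_2")
def make_file_2() -> str:
    return "model output 2"

def final_report() -> str:
    # The aggregation step pulls each dependency, running it on demand.
    return " + ".join(resolve(n) for n in ("file_1", "file_2"))

print(final_report())  # -> model output 1 + model output 2
```

In Airflow terms, this is roughly what `TriggerDagRunOperator` with `wait_for_completion=True` gives you today, since dataset scheduling only pushes from producer to consumer, not the reverse.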
Thank you,
Ben
P.S. If someone could point me to the right place in the code, I might be able to write the solution myself.