## Massive Scheduler performance improvements

As part of AIP-15 (Scheduler HA + performance) and other work Kamil did, we significantly improved the performance of the Airflow Scheduler. Over at Astronomer.io we've benchmarked the scheduler: it's fast (we had to triple-check the numbers, as we didn't quite believe them at first!).

## Scheduler is now HA compatible (AIP-15)

It's now possible and supported to run more than a single scheduler instance. This is super useful for both resiliency (in case a scheduler goes down) and scheduling performance.

To fully use this feature you need Postgres 9.6+ or MySQL 8+ (MySQL 5 and MariaDB won't work with more than one scheduler, I'm afraid).

There's no config or other setup required to run more than one scheduler: just start up a scheduler somewhere else (ensuring it has access to the DAG files) and it will cooperate with your existing schedulers through the database. For more information, read the Scheduler HA documentation.

## The TaskFlow API

```python
import json

from airflow.decorators import dag, task
from airflow.utils.dates import days_ago

default_args = {"owner": "airflow"}

@dag(default_args=default_args, schedule_interval=None, start_date=days_ago(2), tags=["example"])
def tutorial_taskflow_api_etl():

    @task()
    def extract() -> dict:
        data_string = '{"1001": 301.27, "1002": 433.21, "1003": 502.22}'
        return json.loads(data_string)

    @task()
    def transform(order_data: dict) -> float:
        return sum(order_data.values())

    @task()
    def load(total_order_value: float):
        print("Total order value is: %.2f" % total_order_value)

    order_data = extract()
    order_summary = transform(order_data)
    load(order_summary)

tutorial_etl_dag = tutorial_taskflow_api_etl()
```

## Fully specified REST API (AIP-32)

We now have a fully supported, no-longer-experimental API with a comprehensive OpenAPI specification.
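As a quick sketch of what calling the new stable API looks like, the snippet below builds (but does not send) a request against the `GET /api/v1/dags` endpoint. The `localhost:8080` base URL and the `limit` default are assumptions for a default local install; adjust them for your deployment, and add whatever auth your webserver is configured with.

```python
# Sketch: constructing a request for Airflow 2.0's stable REST API.
# "localhost:8080" is an assumed local webserver address, not part of
# the spec; the /api/v1/dags path comes from the published OpenAPI spec.
from urllib.request import Request

BASE_URL = "http://localhost:8080/api/v1"  # assumption: default local install

def list_dags_request(limit: int = 100) -> Request:
    """Build (without sending) a GET /dags request listing DAGs."""
    req = Request(f"{BASE_URL}/dags?limit={limit}", method="GET")
    req.add_header("Accept", "application/json")
    return req

req = list_dags_request()
print(req.full_url)      # http://localhost:8080/api/v1/dags?limit=100
print(req.get_method())  # GET
```

To actually issue the call you would pass `req` to `urllib.request.urlopen` (or use any HTTP client) against a running webserver.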
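On the database versions required for multiple schedulers: the minimums exist because the HA scheduler leans on row-level locking with `SELECT ... FOR UPDATE SKIP LOCKED`, which Postgres gained in 9.6 and MySQL in 8.0. The stdlib sketch below is only an analogy of that pattern, not Airflow's code: each scheduler claims the first unclaimed row and skips, rather than blocks on, rows another scheduler currently holds.

```python
# Illustration only: the "skip locked" claiming pattern that lets
# several schedulers share one queue with no extra coordination.
# Airflow's real implementation does this in SQL; here threading
# locks stand in for database row locks.
import threading

# Each "row" (think: a schedulable task instance) carries its own lock.
rows = [{"id": i, "lock": threading.Lock(), "claimed_by": None} for i in range(5)]

def claim_next(scheduler_name: str):
    """Claim the first unclaimed row, skipping rows held by others.

    acquire(blocking=False) is the SKIP LOCKED analogue: a held lock
    is skipped immediately instead of being waited on.
    """
    for row in rows:
        if row["lock"].acquire(blocking=False):
            if row["claimed_by"] is None:
                row["claimed_by"] = scheduler_name
                return row["id"]  # lock is kept until the "row" is processed
            row["lock"].release()
    return None  # nothing left to claim right now

print(claim_next("scheduler-1"))  # 0
print(claim_next("scheduler-2"))  # 1 (skips the row scheduler-1 holds)
```

The key property is that neither claimer ever waits on the other, which is exactly why adding a second scheduler improves throughput instead of serializing on locks.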