airflow dags command line<\/strong> provides:<\/p>\n\n\n\nairflow dags -h\nusage: airflow dags [-h] COMMAND ...\n\nManage DAGs\n\npositional arguments:\n COMMAND\n backfill Run subsections of a DAG for a specified date range\n delete Delete all DB records related to the specified DAG\n list List all the DAGs\n list-jobs List the jobs\n list-runs List DAG runs given a DAG id\n next-execution\n Get the next execution datetimes of a DAG\n pause Pause a DAG\n report Show DagBag loading report\n show Displays DAG's tasks with their dependencies\n state Get the status of a dag run\n test Execute one single DagRun\n trigger Trigger a DAG run\n unpause Resume a paused DAG\n\noptional arguments:\n -h, --help show this help message and exit<\/pre>\n\n\n\nWe can list all DAGs, delete a DAG, list the jobs inside a DAG, and more. Let’s try some of the most common DAG commands.<\/p>\n\n\n\n
Airflow list all DAGs<\/h3>\n\n\n\n The dags list command<\/strong> lists all the DAGs in airflow. It shows each DAG's file path, owner, and whether the DAG is paused or active.<\/p>\n\n\n\nairflow dags list\ndag_id | filepath | owner | paused\n========================================+==================================================================================================================+=========+=======\nairflow_slack_notification_tutorial | test_slack_alert.py | airflow | False\nemail_tutorial | testemail.py | airflow | True\nexample_bash_operator | \/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/example_bash_operator.py | airflow | True\nexample_branch_datetime_operator_2 | \/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/example_branch_datetime_operator.py | airflow | True\nexample_branch_dop_operator_v3 | \/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/example_branch_python_dop_operator_3.py | airflow | True\nexample_branch_labels | \/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/example_branch_labels.py | airflow | True\n.......<suppressed o\/p>...........\n.......<suppressed o\/p>...........\ntutorial_taskflow_api_etl_virtualenv | \/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/tutorial_taskflow_api_etl_virtualenv.py | airflow | True<\/pre>\n\n\n\nAirflow list-jobs command<\/h3>\n\n\n\n The dags list-jobs command<\/strong> lists the jobs inside a DAG. 
Let’s run this command and verify the output.<\/p>\n\n\n\nairflow dags list-jobs -d airflow_slack_notification_tutorial\ndag_id | state | job_type | start_date | end_date\n====================================+=========+==============+==================================+=================================\nairflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:22:22.697377+00:00 | 2021-05-28 14:22:23.924578+00:00\nairflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:22:19.270604+00:00 | 2021-05-28 14:22:22.514622+00:00\nairflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:17:15.783816+00:00 | 2021-05-28 14:17:17.621517+00:00\nairflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:17:12.587375+00:00 | 2021-05-28 14:17:15.417111+00:00\nairflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:17:11.581316+00:00 | 2021-05-28 14:17:14.024990+00:00\nairflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:17:06.681462+00:00 | 2021-05-28 14:17:10.110889+00:00<\/pre>\n\n\n\nAirflow list-runs command<\/h3>\n\n\n\n The dags list-runs command<\/strong> takes a DAG id as input and lists the DAG runs for that DAG. 
If you pass the --state option to the dags list-runs command, it lists only the DAG runs in the given state.<\/p>\n\n\n\nairflow dags list-runs -d airflow_slack_notification_tutorial\ndag_id | run_id | state | execution_date | start_date | end_date\n====================================+=============================================+========+==================================+==================================+=================================\nairflow_slack_notification_tutorial | manual__2021-05-28T14:14:19.651888+00:00 | pass | 2021-05-28T14:14:19.651888+00:00 | 2021-05-28T14:14:19.697786+00:00 | 2021-05-28T14:22:24.931743+00:00\nairflow_slack_notification_tutorial | scheduled__2021-05-27T14:16:56.676522+00:00 | pass | 2021-05-27T14:16:56.676522+00:00 | 2021-05-28T14:16:58.075571+00:00 | 2021-05-28T14:22:23.482804+00:00<\/pre>\n\n\n\nAirflow next DAG execution command<\/h3>\n\n\n\n The dags next-execution command<\/strong> displays the next execution datetimes of a DAG. If you wish to get more than one execution datetime, pass the -n parameter.<\/p>\n\n\n\nairflow dags next-execution email_tutorial -n 2\n2021-05-29 13:57:56.965112+00:00\n2021-05-30 13:57:56.965112+00:00<\/pre>\n\n\n\nNow let’s run next-execution on a paused DAG and verify the output.<\/p>\n\n\n\n
airflow dags next-execution email_tutorial -n 1\n[INFO] Please be reminded this DAG is PAUSED now.\n2021-05-24 10:21:15.464155+00:00<\/pre>\n\n\n\nAs you can see, we still get the next execution datetime, but with an INFO message reminding us that the DAG is paused.<\/p>\n\n\n\n
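For a DAG on a simple daily schedule, the datetimes that next-execution prints can be sketched with plain datetime arithmetic. This is a toy illustration of the idea only, not Airflow's scheduler code, which derives these datetimes from the DAG's schedule_interval and the metadata database:

```python
from datetime import datetime, timedelta

def next_executions(last_run: datetime, interval: timedelta, n: int) -> list:
    """Return the next n execution datetimes after last_run.

    A toy stand-in for `dags next-execution -n`; the function name and
    signature are my own, not part of Airflow.
    """
    return [last_run + interval * i for i in range(1, n + 1)]

# A daily schedule, mirroring the email_tutorial output above
runs = next_executions(datetime(2021, 5, 28, 13, 57), timedelta(days=1), 2)
# -> 2021-05-29 13:57 and 2021-05-30 13:57
```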
Airflow DAG report command<\/h3>\n\n\n\n The dags report command<\/strong> shows the DagBag loading report. It lists the file each DAG was loaded from, how long the file took to parse, and the number of DAGs and tasks it contains.<\/p>\n\n\n\nairflow dags report\nfile | duration | dag_num | task_num | dags\n==================================================================================+================+=========+==========+===================================================================================\n\/test_slack_alert.py | 0:00:00.041452 | 1 | 2 | airflow_slack_notification_tutorial\n\/testemail.py | 0:00:00.024659 | 1 | 1 | email_tutorial\n\/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/example_sub | 0:00:00.017983 | 3 | 15 | example_subdag_operator,example_subdag_operator.section-1,example_subdag_operator.\n.......<suppressed o\/p>...........\n.......<suppressed o\/p>...........\nnch_labels.py | | | |\n\/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/test_utils. 
| 0:00:00.002076 | 1 | 1 | test_utils\npy | | | |\n\/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/tutorial_et | 0:00:00.001613 | 1 | 3 | tutorial_etl_dag\nl_dag.py | | | |\n\/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/example_dags\/subdags\/sub | 0:00:00.001112 | 0 | 0 |\ndag.py | | | |\n<\/pre>\n\n\n\nAirflow DAG show command<\/h3>\n\n\n\n The dags show command<\/strong> displays the DAG's tasks and their dependencies as a Graphviz graph.<\/p>\n\n\n\nairflow dags show airflow_slack_notification_tutorial\ndigraph airflow_slack_notification_tutorial {\n\tgraph [label=airflow_slack_notification_tutorial labelloc=t rankdir=LR]\n\tsimple_bash_task [color=\"#000000\" fillcolor=\"#f0ede4\" label=simple_bash_task shape=rectangle style=\"filled,rounded\"]\n\tslack_notification [color=\"#000000\" fillcolor=\"#f4a460\" label=slack_notification shape=rectangle style=\"filled,rounded\"]\n\tsimple_bash_task -> slack_notification\n}<\/pre>\n\n\n\nTest run DAG in airflow<\/h3>\n\n\n\n Testing a DAG before running it is a quick way to determine whether the DAG is working as expected. We can test an airflow DAG by running the dags test<\/strong> command.<\/p>\n\n\n\nLet’s test the below .py file with the command line.<\/p>\n\n\n\n
from datetime import timedelta\n\nfrom airflow import DAG\nfrom airflow.operators.bash import BashOperator\nfrom airflow.operators.dummy import DummyOperator\nfrom airflow.utils.dates import days_ago\n\nargs = {\n 'owner': 'airflow',\n}\n\nwith DAG(\n dag_id='example_bash_operator',\n default_args=args,\n schedule_interval='0 0 * * *',\n start_date=days_ago(2),\n dagrun_timeout=timedelta(minutes=60),\n tags=['example', 'example2'],\n params={\"example_key\": \"example_value\"},\n) as dag:\n\n run_this_last = DummyOperator(\n task_id='run_this_last',\n )\n\n # [START howto_operator_bash]\n run_this = BashOperator(\n task_id='run_after_loop',\n bash_command='echo 1',\n )\n # [END howto_operator_bash]\n\n run_this >> run_this_last\n\n for i in range(3):\n task = BashOperator(\n task_id='runme_' + str(i),\n bash_command='echo \"{{ task_instance_key_str }}\" && sleep 1',\n )\n task >> run_this\n\n # [START howto_operator_bash_template]\n also_run_this = BashOperator(\n task_id='also_run_this',\n bash_command='echo \"run_id={{ run_id }} | dag_run={{ dag_run }}\"',\n )\n # [END howto_operator_bash_template]\n also_run_this >> run_this_last\n\n# [START howto_operator_bash_skip]\nthis_will_skip = BashOperator(\n task_id='this_will_skip',\n bash_command='echo \"hello world\"; exit 99;',\n dag=dag,\n)\n# [END howto_operator_bash_skip]\nthis_will_skip >> run_this_last\n\nif __name__ == \"__main__\":\n dag.cli()<\/pre>\n\n\n\nCopy-paste the above code into a .py file, then run the dags test <\/strong>command.<\/p>\n\n\n\nairflow dags test example_bash_operator 2021-01-01\n[2021-05-29 17:48:53,823] {dagbag.py:487} INFO - Filling up the DagBag from \/opt\/airflow\/dags\n[2021-05-29 17:48:54,494] {base_executor.py:82} INFO - Adding to queue: ['<TaskInstance: example_bash_operator.runme_0 2021-01-01 00:00:00+00:00 [queued]>']\n[2021-05-29 17:48:54,555] {base_executor.py:82} INFO - Adding to queue: ['<TaskInstance: example_bash_operator.runme_1 2021-01-01 
00:00:00+00:00 [queued]>']\n.......<suppressed o\/p>...........\n.......<suppressed o\/p>...........\n20210529T174904\n[2021-05-29 17:49:04,574] {taskinstance.py:1245} INFO - 0 downstream tasks scheduled from follow-on schedule check\n[2021-05-29 17:49:04,617] {dagrun.py:444} INFO - Marking run <DagRun example_bash_operator @ 2021-01-01 00:00:00+00:00: backfill__2021-01-01T00:00:00+00:00, externally triggered: False> successful\n[2021-05-29 17:49:04,631] {backfill_job.py:388} INFO - [backfill progress] | finished run 1 of 1 | tasks waiting: 1 | succeeded: 5 | running: 0 | failed: 0 | skipped: 1 | deadlocked: 0 | not ready: 1\n[2021-05-29 17:49:09,414] {backfill_job.py:388} INFO - [backfill progress] | finished run 1 of 1 | tasks waiting: 0 | succeeded: 5 | running: 0 | failed: 0 | skipped: 2 | deadlocked: 0 | not ready: 0\n[2021-05-29 17:49:09,427] {backfill_job.py:831} INFO - Backfill done. Exiting.<\/pre>\n\n\n\nHere the DAG run passed without any issues.<\/p>\n\n\n\n
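The `>>` statements in the example_bash_operator file above define a dependency graph, and `dags test` executes the tasks in dependency order. That ordering can be sketched in pure Python with the standard library's graphlib; this is my own illustration of the ordering, not Airflow's scheduler code:

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Each key's value set lists its upstream tasks, mirroring the `>>`
# statements in the example_bash_operator DAG file above.
deps = {
    "run_this_last": {"run_after_loop", "also_run_this", "this_will_skip"},
    "run_after_loop": {"runme_0", "runme_1", "runme_2"},
}

# static_order() yields every task only after all of its upstream tasks
order = list(TopologicalSorter(deps).static_order())
```

This reproduces the behavior visible in the log above: the runme_* tasks are queued first, and run_this_last finishes last.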
Airflow delete a DAG<\/h3>\n\n\n\n Sometimes we have to delete a DAG that is no longer required. We can delete a DAG with the dags delete<\/strong> command. Let’s try to delete a DAG.<\/p>\n\n\n\nairflow dags delete test_utils -y\n[2021-05-29 13:46:32,100] {__init__.py:38} INFO - Loaded API auth backend: <module 'airflow.api.auth.backend.basic_auth' from '\/home\/airflow\/.local\/lib\/python3.6\/site-packages\/airflow\/api\/auth\/backend\/basic_auth.py'>\n[2021-05-29 13:46:32,131] {delete_dag.py:42} INFO - Deleting DAG: test_utils\nRemoved 2 record(s)<\/pre>\n\n\n\nThe -y flag skips the confirmation prompt; the command then drops all existing records related to the specified DAG.<\/p>\n\n\n\n
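Conceptually, dags delete removes every metadata-database record tied to the given DAG id. A toy pure-Python sketch of that behavior follows; the record layout and table names here are my own illustration, not Airflow's real schema:

```python
# Illustrative metadata records keyed by dag_id (not Airflow's schema)
records = [
    {"table": "dag_run", "dag_id": "test_utils"},
    {"table": "task_instance", "dag_id": "test_utils"},
    {"table": "dag_run", "dag_id": "email_tutorial"},
]

def delete_dag(dag_id, records):
    """Return the surviving records and the number removed (toy helper)."""
    kept = [r for r in records if r["dag_id"] != dag_id]
    return kept, len(records) - len(kept)

remaining, removed = delete_dag("test_utils", records)
# removed == 2, matching the "Removed 2 record(s)" output above
```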
Airflow Tasks command<\/h2>\n\n\n\n tasks command<\/strong> helps us manage tasks. With the tasks command, we can run a task, test a task, check the task’s status, and perform many more operations.<\/p>\n\n\n\ndefault@a4bd0ae3c9a0:\/opt\/airflow$ airflow tasks -h\nusage: airflow tasks [-h] COMMAND ...\n\nManage tasks\n\npositional arguments:\n COMMAND\n clear Clear a set of task instance, as if they never ran\n failed-deps Returns the unmet dependencies for a task instance\n list List the tasks within a DAG\n render Render a task instance's template(s)\n run Run a single task instance\n state Get the status of a task instance\n states-for-dag-run\n Get the status of all task instances in a dag run\n test Test a task instance\n\noptional arguments:\n -h, --help show this help message and exit<\/pre>\n\n\n\nLet’s try a few of the task commands.<\/p>\n\n\n\n
Airflow list all tasks within a DAG<\/h3>\n\n\n\n The tasks list command<\/strong> takes the DAG name as a parameter and lists all the tasks present in the DAG.<\/p>\n\n\n\nairflow tasks list example_bash_operator\nalso_run_this\nrun_after_loop\nrun_this_last\nrunme_0\nrunme_1\nrunme_2\nthis_will_skip<\/pre>\n\n\n\nAirflow Task run<\/h3>\n\n\n\n The tasks run command<\/strong> helps us run any task present in a DAG. The run command takes the below arguments.<\/p>\n\n\n\npositional arguments:\n dag_id The id of the dag\n task_id The id of the task\n execution_date The execution date of the DAG<\/code><\/pre>\n\n\n\nLet’s run a simple task using the command line<\/p>\n\n\n\n
airflow tasks run example_bash_operator runme_0 2021-01-01\n[2021-05-29 15:59:22,239] {dagbag.py:487} INFO - Filling up the DagBag from \/opt\/***\/dags\n.......<suppressed o\/p>...........\n.......<suppressed o\/p>...........\nRunning <TaskInstance: example_bash_operator.runme_0 2021-01-01T00:00:00+00:00 [success]> on host a4bd0ae3c9a0<\/pre>\n\n\n\nAirflow check the status of a task<\/h3>\n\n\n\n With the help of the tasks state command<\/strong>, we can check the status of a particular task.<\/p>\n\n\n\nairflow tasks state example_bash_operator runme_0 2021-01-01\nsuccess<\/pre>\n\n\n\nAirflow database check command<\/h2>\n\n\n\n Airflow depends on a database to store its metadata. We can quickly check whether the database is reachable with the db check command.<\/p>\n\n\n\n
airflow db check\n[2021-05-29 15:43:02,284] {db.py:776} INFO - Connection successful.<\/pre>\n\n\n\nAirflow Jobs command<\/h2>\n\n\n\n With the jobs command<\/strong>, we can easily manage jobs in airflow. Let’s run this command and verify whether there are any active jobs.<\/p>\n\n\n\nairflow jobs check\nFound one alive job.<\/pre>\n\n\n\nRoles command in airflow<\/h2>\n\n\n\n With the help of the roles command<\/strong>, we can easily create and list roles in airflow. Let’s see the roles command in action.<\/p>\n\n\n\nAirflow list roles<\/h3>\n\n\n\n The roles list command<\/strong> lists all the roles available in the airflow instance.<\/p>\n\n\n\nairflow roles list\nname\n======\nAdmin\nOp\nPublic\nUser\nViewer<\/pre>\n\n\n\nUsers command in airflow<\/h2>\n\n\n\n The users command<\/strong> is another handy command that enables us to manage users. Let’s see a few users commands in action.<\/p>\n\n\n\n
Airflow users list command <\/h3>\n\n\n\n The users list command lists all the users.<\/p>\n\n\n\n
airflow users list\nid | username | email | first_name | last_name | roles\n===+==========+==========================+============+===========+======\n1 | airflow | airflowadmin@example.com | Airflow | Admin | Admin<\/pre>\n\n\n\nAirflow create a user<\/h3>\n\n\n\n We can also create a user using the users create command.<\/p>\n\n\n\n
Let’s see what the required parameters are:<\/p>\n\n\n\n
airflow users create -h\nusage: airflow users create [-h] -e EMAIL -f FIRSTNAME -l LASTNAME\n [-p PASSWORD] -r ROLE [--use-random-password] -u\n USERNAME\n\nCreate a user\n\noptional arguments:\n -h, --help show this help message and exit\n -e EMAIL, --email EMAIL\n Email of the user\n -f FIRSTNAME, --firstname FIRSTNAME\n First name of the user\n -l LASTNAME, --lastname LASTNAME\n Last name of the user\n -p PASSWORD, --password PASSWORD\n Password of the user, required to create a user without --use-random-password\n -r ROLE, --role ROLE Role of the user. Existing roles include Admin, User, Op, Viewer, and Public\n --use-random-password\n Do not prompt for password. Use random string instead. Required to create a user without --password\n -u USERNAME, --username USERNAME\n Username of the user<\/pre>\n\n\n\nNow create a test user using the users create command. If the password is not specified, the user will get prompted for a user password.<\/p>\n\n\n\n
airflow users create -r User -u test -e test@test.com -f test_first_name -l test_last_name -p test\nUser user test created\n\nairflow users list\nid | username | email | first_name | last_name | roles\n===+==========+==========================+=================+================+======\n1 | airflow | airflowadmin@example.com | Airflow | Admin | Admin\n2 | test | test@test.com | test_first_name | test_last_name | User\n<\/pre>\n\n\n\nAirflow Variables command<\/h2>\n\n\n\n With the variables command<\/strong>, we can easily manage Airflow variables. Let’s use a few variables commands.<\/p>\n\n\n\nList variables in airflow<\/h3>\n\n\n\n The variables list command lists all the variable keys in airflow.<\/p>\n\n\n\n
airflow variables list\nkey\n====\ntest<\/pre>\n\n\n\nIf you wish to get a variable’s value, use the variables get command:<\/p>\n\n\n\n
airflow variables get test\ntest_value<\/pre>\n\n\n\nYou can check the variables from the airflow user interface as well.<\/p>\n\n\n\n <\/figure>\n\n\n\nAirflow create a variable<\/h3>\n\n\n\n With the variables set command<\/strong>, we can easily create variables in airflow. It takes the variable key and variable value as parameters.<\/p>\n\n\n\nairflow variables set variable_key variable_value\n[2021-05-29 16:15:48,093] {crypto.py:82} WARNING - empty cryptography key - values will not be stored encrypted.\nVariable variable_key created\n\nairflow variables list\nkey\n============\ntest\nvariable_key<\/pre>\n\n\n\nConclusion<\/h2>\n\n\n\n Finally, we have come to the end of this tutorial. We started with the airflow setup command and learned how to manage DAGs and tasks in airflow. Lastly, we learned how to manage users, roles, and variables in airflow. I hope you have found this article useful. Please do let me know in the comment box if you face any issues with the above commands. Happy learning.<\/p>\n\n\n\n
More to Read?<\/h2>\n\n\n\n How to send email from airflow<\/a><\/p>\n\n\n\nHow to integrate airflow with slack<\/a><\/p>\n\n\n\nInstall airflow using the docker<\/a><\/p>\n\n\n\n<\/p>\n