{"id":25940,"date":"2021-05-30T08:35:31","date_gmt":"2021-05-30T08:35:31","guid":{"rendered":"https:\/\/naiveskill.com\/?p=25940"},"modified":"2023-02-04T08:34:20","modified_gmt":"2023-02-04T08:34:20","slug":"airflow-command-line","status":"publish","type":"post","link":"https:\/\/naiveskill.com\/airflow-command-line\/","title":{"rendered":"Airflow command line | Complete Airflow command in 2023"},"content":{"rendered":"\n

Apache Airflow is an excellent open-source tool that lets you manage and run complex data pipelines. Airflow also ships with a rich user interface through which you can manage DAGs, add users, and change configuration. In addition, many admin tasks can be performed with the Airflow command line interface (CLI).

In this tutorial, I will explain how to use the Airflow command-line interface. Please follow this link to set up Airflow.

## Check Airflow version

To check the Airflow version, connect to the server where Airflow is installed and type the command below. To get the latest features, it is recommended to use the latest Airflow version.

```
airflow version
2.1.0
```

## Airflow config

With the help of the `airflow config list` command, you get complete information about the Airflow configuration.

The config command gives information about the DAGs folder, logging, metrics, API, and so on. It is a handy command if you wish to verify or check your Airflow configuration.

```
airflow config list
[core]
dags_folder = /opt/airflow/dags
hostname_callable = socket.getfqdn
default_timezone = utc
executor = CeleryExecutor
sql_alchemy_conn = postgresql+psycopg2://airflow:airflow@postgres/airflow
sql_engine_encoding = utf-8
.......<suppressed o/p>...........
.......<suppressed o/p>...........
[smart_sensor]
use_smart_sensor = False
shard_code_upper_limit = 10000
shards = 5
sensors_enabled = NamedHivePartitionSensor
```
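If you only need a single setting, the `config get-value` subcommand prints one option; a minimal sketch, assuming an Airflow 2.x installation:

```
# Print a single configuration option: section first, then key
airflow config get-value core dags_folder
```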

## Airflow initdb command

The initdb command initializes the Airflow database. We generally use this command while setting up Airflow for the first time.

I am pasting the output of the initdb command just for reference.<\/p>\n\n\n\n

```
airflow initdb
[2020-01-01 21:49:21,603] {settings.py:252} INFO - settings.configure_orm(): Using pool settings. pool_size=5, max_overflow=10, pool_recycle=1800, pid=20917
DB: postgresql+psycopg2://airflow@localhost:5432/airflow_mdb
[2020-01-04 20:19:22,257] {db.py:368} INFO - Creating tables
INFO  [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO  [alembic.runtime.migration] Will assume transactional DDL.
Done.
```
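Note that in Airflow 2.x, initdb has been replaced by the db command group; a minimal sketch of the equivalent call:

```
# Airflow 2.x replacement for the deprecated `airflow initdb`
airflow db init
```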

## Airflow resetdb command

The resetdb command deletes all records from the metadata database, including all DAG runs, variables, and connections. Do not run this command after your Airflow instance has been set up successfully; otherwise, you will lose the entire Airflow metadata.
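In Airflow 2.x the equivalent is `db reset`; a minimal sketch (the -y flag skips the confirmation prompt, so use it with care):

```
# Destructive: wipe and re-initialize the metadata database
airflow db reset -y
```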

## Airflow connections command

Airflow encrypts the passwords stored in connections and ensures that they cannot be read or manipulated without the encryption key. Connections can be managed in Airflow through the user interface (Menu -> Admin -> Connections) or through the command line.

\"Airflow<\/figure>\n\n\n\n

We can add, delete, and export connections with the connections command. Let's run the help command and see what the connections command provides:

```
airflow connections -h
usage: airflow connections [-h] COMMAND ...

Manage connections

positional arguments:
  COMMAND
    add       Add a connection
    delete    Delete a connection
    export    Export all connections
    get       Get a connection
    import    Import connections from a file
    list      List connections

optional arguments:
  -h, --help  show this help message and exit
```

### Airflow list connections

Use the connections list command to check all the connections present in Airflow.

```
airflow connections list
id | conn_id    | conn_type | description | host | schema | login | password                              | port | is_encrypted | is_extra_encrypted | extra_dejson | get_uri
===+============+===========+=============+======+========+=======+=======================================+======+==============+====================+==============+=======================================
1  | slack_conn | http      |             |      |        |       | https://hooks.slack.com/services/T023 | None | False        | False              | {}           | http://:https%3A%2F%2Fhooks.slack.com%
```

As you can see, we have a slack_conn connection present in Airflow, which I created from the UI. Let's try to create another connection using the command line.

### Create an Airflow connection

Use the connections add command to create a new connection. Provide the connection type, login, and password.

```
airflow connections add test_connection --conn-type=http --conn-login=test --conn-password=test
[2021-05-29 13:34:18,319] {crypto.py:82} WARNING - empty cryptography key - values will not be stored encrypted.
Successfully added `conn_id`=test_connection : http://test:******@:
```
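Alternatively, the whole connection can be supplied as a single URI; a minimal sketch, where the connection name `my_postgres` and its credentials are hypothetical:

```
# Define a connection in one go from a URI
airflow connections add my_postgres \
    --conn-uri 'postgresql://user:password@dbhost:5432/mydb'
```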

Now let's verify that the new connection got created.

```
airflow connections list
id | conn_id         | conn_type | description | host | schema | login | password                            | port | is_encrypted | is_extra_encrypted | extra_dejson | get_uri
===+=================+===========+=============+======+========+=======+=====================================+======+==============+====================+==============+====================================
1  | slack_conn      | http      |             |      |        |       | https://hooks.slack.com/services/T0 | None | False        | False              | {}           | http://:https%3A%2F%2Fhooks.slack.c
2  | test_connection | http      | None        | None | None   | test  | test                                | None | False        | False              | {}           | http://test:test@
```

Awesome! Now let's try to delete the connection from the command line.

### Delete a connection in Airflow

The connections delete command lets you delete a connection from Airflow. Before running this command, ensure that the connection is not used anywhere in your DAGs; otherwise, the affected jobs will fail.

```
airflow connections delete test_connection
Successfully deleted connection with `conn_id`=test_connection
```
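Before cleaning up connections, it can be handy to take a backup with the export subcommand; a minimal sketch (the output format is inferred from the file extension):

```
# Export all connections to a JSON file
airflow connections export connections.json
```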

Let's proceed further and check another remarkable Airflow command, which is dags.

## Airflow DAGs command

As per Airflow's official documentation, a DAG (Directed Acyclic Graph) is a collection of all the tasks you want to run, organized in a way that reflects their relationships and dependencies.

You can find all the DAGs under the DAGs tab.

\"Airflow<\/figure>\n\n\n\n

The DAGs tab displays all the DAGs present in Airflow, both active and paused. We can get the same details from the command line using the dags command.

Let's run the dags help command and check which options the dags command provides:

```
airflow dags -h
usage: airflow dags [-h] COMMAND ...

Manage DAGs

positional arguments:
  COMMAND
    backfill      Run subsections of a DAG for a specified date range
    delete        Delete all DB records related to the specified DAG
    list          List all the DAGs
    list-jobs     List the jobs
    list-runs     List DAG runs given a DAG id
    next-execution
                  Get the next execution datetimes of a DAG
    pause         Pause a DAG
    report        Show DagBag loading report
    show          Displays DAG's tasks with their dependencies
    state         Get the status of a dag run
    test          Execute one single DagRun
    trigger       Trigger a DAG run
    unpause       Resume a paused DAG

optional arguments:
  -h, --help      show this help message and exit
```
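Pausing and resuming a DAG from the shell, for instance, is a one-liner; a minimal sketch using one of the example DAGs that ship with Airflow:

```
# Pause scheduling for a DAG, then resume it
airflow dags pause example_bash_operator
airflow dags unpause example_bash_operator
```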

We can list all DAGs, delete a DAG, list the jobs inside a DAG, and so on. Let's try some of the most common dags subcommands.

### Airflow list all DAGs

The dags list command lists all the DAGs in Airflow. It shows you the DAG owner and whether each DAG is paused or active.

```
airflow dags list
dag_id                                  | filepath                                                                                                         | owner   | paused
========================================+==================================================================================================================+=========+=======
airflow_slack_notification_tutorial     | test_slack_alert.py                                                                                              | airflow | False
email_tutorial                          | testemail.py                                                                                                     | airflow | True
example_bash_operator                   | /home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/example_bash_operator.py                   | airflow | True
example_branch_datetime_operator_2      | /home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/example_branch_datetime_operator.py        | airflow | True
example_branch_dop_operator_v3          | /home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/example_branch_python_dop_operator_3.py    | airflow | True
example_branch_labels                   | /home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/example_branch_labels.py                   | airflow | True
.......<suppressed o/p>...........
.......<suppressed o/p>...........
tutorial_taskflow_api_etl_virtualenv    | /home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/tutorial_taskflow_api_etl_virtualenv.py    | airflow | True
```

### Airflow list-jobs command

The dags list-jobs command lists the jobs of a DAG. Let's run this command and verify the output.

```
airflow dags list-jobs -d airflow_slack_notification_tutorial
dag_id                              | state   | job_type     | start_date                       | end_date
====================================+=========+==============+==================================+=================================
airflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:22:22.697377+00:00 | 2021-05-28 14:22:23.924578+00:00
airflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:22:19.270604+00:00 | 2021-05-28 14:22:22.514622+00:00
airflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:17:15.783816+00:00 | 2021-05-28 14:17:17.621517+00:00
airflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:17:12.587375+00:00 | 2021-05-28 14:17:15.417111+00:00
airflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:17:11.581316+00:00 | 2021-05-28 14:17:14.024990+00:00
airflow_slack_notification_tutorial | success | LocalTaskJob | 2021-05-28 14:17:06.681462+00:00 | 2021-05-28 14:17:10.110889+00:00
```

### Airflow list-runs command

The dags list-runs command takes a DAG id as input and lists the DAG runs of that DAG. If you also provide the state option, the command only returns the DAG runs in the given state (see the sketch after the output below).

```
airflow dags list-runs -d airflow_slack_notification_tutorial
dag_id                              | run_id                                      | state   | execution_date                   | start_date                       | end_date
====================================+=============================================+=========+==================================+==================================+=================================
airflow_slack_notification_tutorial | manual__2021-05-28T14:14:19.651888+00:00    | success | 2021-05-28T14:14:19.651888+00:00 | 2021-05-28T14:14:19.697786+00:00 | 2021-05-28T14:22:24.931743+00:00
airflow_slack_notification_tutorial | scheduled__2021-05-27T14:16:56.676522+00:00 | success | 2021-05-27T14:16:56.676522+00:00 | 2021-05-28T14:16:58.075571+00:00 | 2021-05-28T14:22:23.482804+00:00
```
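A minimal sketch of filtering by state; the --state flag accepts DAG run states such as success, running, and failed:

```
# List only the successful runs of the DAG
airflow dags list-runs -d airflow_slack_notification_tutorial --state success
```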

### Airflow next DAG execution command

The dags next-execution command displays the next execution datetimes of a DAG. If you wish to get more than one execution datetime, pass the -n parameter.

```
airflow dags next-execution email_tutorial -n 2
2021-05-29 13:57:56.965112+00:00
2021-05-30 13:57:56.965112+00:00
```

Now let's run next-execution on a paused DAG and verify the output.

```
airflow dags next-execution email_tutorial -n 1
[INFO] Please be reminded this DAG is PAUSED now.
2021-05-24 10:21:15.464155+00:00
```

As you can see, we still got the next execution datetime, along with an INFO message reminding us that the DAG is paused.
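Another everyday subcommand from the help output above is dags trigger, which starts a manual run; a minimal sketch (the -e flag sets an explicit execution date):

```
# Trigger a manual run of a DAG
airflow dags trigger example_bash_operator -e 2021-01-01
```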

### Airflow DAG report command

The dags report command shows the DagBag loading report. It displays helpful information such as each file's location, its parse duration, and the DAGs it contains.

```
airflow dags report
file                                                                              | duration       | dag_num | task_num | dags
==================================================================================+================+=========+==========+===================================================================================
/test_slack_alert.py                                                              | 0:00:00.041452 | 1       | 2        | airflow_slack_notification_tutorial
/testemail.py                                                                     | 0:00:00.024659 | 1       | 1        | email_tutorial
/home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/example_sub | 0:00:00.017983 | 3       | 15       | example_subdag_operator,example_subdag_operator.section-1,example_subdag_operator.
.......<suppressed o/p>...........
.......<suppressed o/p>...........
nch_labels.py                                                                     |                |         |          |
/home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/test_utils. | 0:00:00.002076 | 1       | 1        | test_utils
py                                                                                |                |         |          |
/home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/tutorial_et | 0:00:00.001613 | 1       | 3        | tutorial_etl_dag
l_dag.py                                                                          |                |         |          |
/home/airflow/.local/lib/python3.6/site-packages/airflow/example_dags/subdags/sub | 0:00:00.001112 | 0       | 0        |
dag.py                                                                            |                |         |          |
```

### Airflow DAG show command

The dags show command displays a DAG's tasks and their dependencies as a Graphviz digraph.

```
airflow dags show airflow_slack_notification_tutorial
digraph airflow_slack_notification_tutorial {
    graph [label=airflow_slack_notification_tutorial labelloc=t rankdir=LR]
    simple_bash_task [color="#000000" fillcolor="#f0ede4" label=simple_bash_task shape=rectangle style="filled,rounded"]
    slack_notification [color="#000000" fillcolor="#f4a460" label=slack_notification shape=rectangle style="filled,rounded"]
    simple_bash_task -> slack_notification
}
```
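If Graphviz is installed, the graph can also be written to an image file instead of the console; a minimal sketch with a hypothetical output filename:

```
# Render the DAG graph to a PNG file (requires Graphviz)
airflow dags show airflow_slack_notification_tutorial --save dag_graph.png
```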

### Test run a DAG in Airflow

Testing a DAG before deploying it is a quick way to determine whether the DAG works as expected. We can test an Airflow DAG by running the dags test command.

Let's test the below .py file from the command line.

```
from datetime import timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago

args = {
    'owner': 'airflow',
}

with DAG(
    dag_id='example_bash_operator',
    default_args=args,
    schedule_interval='0 0 * * *',
    start_date=days_ago(2),
    dagrun_timeout=timedelta(minutes=60),
    tags=['example', 'example2'],
    params={"example_key": "example_value"},
) as dag:

    run_this_last = DummyOperator(
        task_id='run_this_last',
    )

    # [START howto_operator_bash]
    run_this = BashOperator(
        task_id='run_after_loop',
        bash_command='echo 1',
    )
    # [END howto_operator_bash]

    run_this >> run_this_last

    for i in range(3):
        task = BashOperator(
            task_id='runme_' + str(i),
            bash_command='echo "{{ task_instance_key_str }}" && sleep 1',
        )
        task >> run_this

    # [START howto_operator_bash_template]
    also_run_this = BashOperator(
        task_id='also_run_this',
        bash_command='echo "run_id={{ run_id }} | dag_run={{ dag_run }}"',
    )
    # [END howto_operator_bash_template]
    also_run_this >> run_this_last

# [START howto_operator_bash_skip]
this_will_skip = BashOperator(
    task_id='this_will_skip',
    bash_command='echo "hello world"; exit 99;',
    dag=dag,
)
# [END howto_operator_bash_skip]
this_will_skip >> run_this_last

if __name__ == "__main__":
    dag.cli()
```

Create a DAG by copy-pasting the above code into a .py file and run the dags test command:

```
airflow dags test example_bash_operator 2021-01-01
[2021-05-29 17:48:53,823] {dagbag.py:487} INFO - Filling up the DagBag from /opt/airflow/dags
[2021-05-29 17:48:54,494] {base_executor.py:82} INFO - Adding to queue: ['<TaskInstance: example_bash_operator.runme_0 2021-01-01 00:00:00+00:00 [queued]>']
[2021-05-29 17:48:54,555] {base_executor.py:82} INFO - Adding to queue: ['<TaskInstance: example_bash_operator.runme_1 2021-01-01 00:00:00+00:00 [queued]>']
.......<suppressed o/p>...........
.......<suppressed o/p>...........
20210529T174904
[2021-05-29 17:49:04,574] {taskinstance.py:1245} INFO - 0 downstream tasks scheduled from follow-on schedule check
[2021-05-29 17:49:04,617] {dagrun.py:444} INFO - Marking run <DagRun example_bash_operator @ 2021-01-01 00:00:00+00:00: backfill__2021-01-01T00:00:00+00:00, externally triggered: False> successful
[2021-05-29 17:49:04,631] {backfill_job.py:388} INFO - [backfill progress] | finished run 1 of 1 | tasks waiting: 1 | succeeded: 5 | running: 0 | failed: 0 | skipped: 1 | deadlocked: 0 | not ready: 1
[2021-05-29 17:49:09,414] {backfill_job.py:388} INFO - [backfill progress] | finished run 1 of 1 | tasks waiting: 0 | succeeded: 5 | running: 0 | failed: 0 | skipped: 2 | deadlocked: 0 | not ready: 0
[2021-05-29 17:49:09,427] {backfill_job.py:831} INFO - Backfill done. Exiting.
```

Here the DAG run passed without any issues.<\/p>\n\n\n\n
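The closely related dags backfill subcommand from the help output re-runs a DAG over a date range; a minimal sketch:

```
# Re-run the DAG for every schedule interval between the two dates
airflow dags backfill example_bash_operator -s 2021-01-01 -e 2021-01-03
```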

### Airflow delete a DAG

Sometimes we have to delete a DAG that is no longer required. We can delete a DAG with the dags delete command. Let's try to delete a DAG.

```
airflow dags delete test_utils -y
[2021-05-29 13:46:32,100] {__init__.py:38} INFO - Loaded API auth backend: <module 'airflow.api.auth.backend.basic_auth' from '/home/airflow/.local/lib/python3.6/site-packages/airflow/api/auth/backend/basic_auth.py'>
[2021-05-29 13:46:32,131] {delete_dag.py:42} INFO - Deleting DAG: test_utils
Removed 2 record(s)
```

Pass -y to skip the confirmation prompt; the command drops all existing records related to the specified DAG.

## Airflow tasks command

The tasks command helps us manage tasks. With it, we can run a task, test a task, check a task's status, and perform many more operations.

```
airflow tasks -h
usage: airflow tasks [-h] COMMAND ...

Manage tasks

positional arguments:
  COMMAND
    clear             Clear a set of task instance, as if they never ran
    failed-deps       Returns the unmet dependencies for a task instance
    list              List the tasks within a DAG
    render            Render a task instance's template(s)
    run               Run a single task instance
    state             Get the status of a task instance
    states-for-dag-run
                      Get the status of all task instances in a dag run
    test              Test a task instance

optional arguments:
  -h, --help          show this help message and exit
```

Let’s try a few of the task commands.<\/p>\n\n\n\n

### Airflow list all tasks within a DAG

The tasks list command takes the DAG name as a parameter and lists all the tasks present in that DAG.

```
airflow tasks list example_bash_operator
also_run_this
run_after_loop
run_this_last
runme_0
runme_1
runme_2
this_will_skip
```
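To see how those tasks relate to each other, the same subcommand can print a dependency tree; a minimal sketch using the --tree flag:

```
# Show the tasks of the DAG as a dependency tree
airflow tasks list example_bash_operator --tree
```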

### Airflow task run

The tasks run command runs a single task instance of a DAG. The run command takes the arguments below.

```
positional arguments:
  dag_id                The id of the dag
  task_id               The id of the task
  execution_date        The execution date of the DAG
```

Let's run a simple task using the command line:

```
airflow tasks run example_bash_operator runme_0 2021-01-01
[2021-05-29 15:59:22,239] {dagbag.py:487} INFO - Filling up the DagBag from /opt/***/dags
.......<suppressed o/p>...........
.......<suppressed o/p>...........
Running <TaskInstance: example_bash_operator.runme_0 2021-01-01T00:00:00+00:00 [success]> on host a4bd0ae3c9a0
```
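Unlike run, the tasks test subcommand executes a task without recording any state in the metadata database, which makes it handy during development; a minimal sketch:

```
# Run a task in isolation, without writing state to the metadata DB
airflow tasks test example_bash_operator runme_0 2021-01-01
```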

### Airflow check the status of a task

With the help of the tasks state command, we can check the status of a particular task instance.

```
airflow tasks state example_bash_operator runme_0 2021-01-01
success
```
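To get the status of every task in a run at once, there is also states-for-dag-run; a minimal sketch (the execution date must match an existing run):

```
# List the state of all task instances for one DAG run
airflow tasks states-for-dag-run example_bash_operator 2021-01-01
```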

## Airflow database check command

Airflow depends on a database to store its metadata. We can quickly check whether the database is reachable with the db check command.

```
airflow db check
[2021-05-29 15:43:02,284] {db.py:776} INFO - Connection successful.
```
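The db command group also handles schema migrations; a minimal sketch of upgrading the metadata database after bumping the Airflow version:

```
# Apply any pending schema migrations to the metadata database
airflow db upgrade
```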

## Airflow jobs command

With the jobs command, we can easily manage jobs in Airflow. Let's run the check subcommand and verify whether there are any alive jobs.

```
airflow jobs check
Found one alive job.
```

## Roles command in Airflow

With the help of the roles command, we can easily create and list roles in Airflow. Let's see the roles command in action.

### Airflow list roles

The roles list command lists all the roles available in the Airflow instance.

```
airflow roles list
name
======
Admin
Op
Public
User
Viewer
```
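Creating a role is just as simple; a minimal sketch with a hypothetical role name:

```
# Create a new (empty) role named Analyst
airflow roles create Analyst
```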

## Users command in Airflow

users is another handy command that enables us to manage users. Let's see a few users subcommands in action.

### Airflow users list command

The users list command lists all the users.

```
airflow users list
id | username | email                    | first_name | last_name | roles
===+==========+==========================+============+===========+======
1  | airflow  | airflowadmin@example.com | Airflow    | Admin     | Admin
```

### Airflow create a user

We can also create a user using the users create command.


Let’s see what the required parameters are:<\/p>\n\n\n\n

```
airflow users create -h
usage: airflow users create [-h] -e EMAIL -f FIRSTNAME -l LASTNAME
                            [-p PASSWORD] -r ROLE [--use-random-password] -u
                            USERNAME

Create a user

optional arguments:
  -h, --help            show this help message and exit
  -e EMAIL, --email EMAIL
                        Email of the user
  -f FIRSTNAME, --firstname FIRSTNAME
                        First name of the user
  -l LASTNAME, --lastname LASTNAME
                        Last name of the user
  -p PASSWORD, --password PASSWORD
                        Password of the user, required to create a user without --use-random-password
  -r ROLE, --role ROLE  Role of the user. Existing roles include Admin, User, Op, Viewer, and Public
  --use-random-password
                        Do not prompt for password. Use random string instead. Required to create a user without --password
  -u USERNAME, --username USERNAME
                        Username of the user
```

Now create a test user using the users create command. If the password is not specified, you will be prompted for one.

```
airflow users create -r User -u test -e test@test.com -f test_first_name -l test_last_name -p test
User user test created

airflow users list
id | username | email                    | first_name      | last_name      | roles
===+==========+==========================+=================+================+======
1  | airflow  | airflowadmin@example.com | Airflow         | Admin          | Admin
2  | test     | test@test.com            | test_first_name | test_last_name | User
```
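Removing a user again is symmetric; a minimal sketch using the users delete subcommand:

```
# Delete a user by username
airflow users delete -u test
```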

## Airflow variables command

With the variables command, we can easily manage Airflow variables. Let's use a few variables subcommands.

### List variables in Airflow

The variables list command lists all the variable keys in Airflow.

```
airflow variables list
key
====
test
```

If you wish to get a variable's value, use the command below:

```
airflow variables get test
test_value
```

You can check the variables from the Airflow user interface as well.

\"List<\/figure>\n\n\n\n

### Airflow create a variable

With the variables set command, we can easily create variables in Airflow. It takes the variable key and value as parameters.

```
airflow variables set variable_key variable_value
[2021-05-29 16:15:48,093] {crypto.py:82} WARNING - empty cryptography key - values will not be stored encrypted.
Variable variable_key created

airflow variables list
key
============
test
variable_key
```
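To clean up, a variable can be removed again with variables delete; a minimal sketch:

```
# Delete a variable by key
airflow variables delete variable_key
```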

## Conclusion

Finally, we have come to the end of this tutorial. We started with the Airflow setup commands, then learned how to manage DAGs and tasks in Airflow. At last, we learned how to manage users, roles, and variables. I hope you found this article useful. Please let me know in the comment box if you face any issues with the above commands. Happy learning.

## More to Read

How to send email from Airflow

How to integrate Airflow with Slack

Install Airflow using Docker

<\/p>\n","protected":false},"excerpt":{"rendered":"

Apache airflow is an excellent open-source tool that lets you manage and run a complex data pipeline. Airflow has a straightforward user interface as well, using which we can easily …<\/p>\n

Airflow command line | Complete Airflow command in 2023<\/span> Read More \u00bb<\/a><\/p>\n","protected":false},"author":1,"featured_media":32594,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_uag_custom_page_level_css":"","site-sidebar-layout":"default","site-content-layout":"default","ast-site-content-layout":"","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"","ast-breadcrumbs-content":"","ast-featured-img":"","footer-sml-layout":"","theme-transparent-header-meta":"default","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":"","header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"","footnotes":""},"categories":[41,68],"tags":[42,72],"uagb_featured_image_src":{"full":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command.webp",1200,628,false],"thumbnail":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command-150x150.webp",150,150,true],"medium":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command-300x157.webp",300,157,true],"medium_large":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command-768x402.webp",768,402,true],"large":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command-1024x536.webp",1024,536,true],"1536x1536":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command.webp",1200,628,false],"2048x2048":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command.webp",1200,628,false],"web-stories-poster-portrait":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command.webp",640,335,false],"web-stories-publisher-logo":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command.webp",96,50,false],"web-stories-thumbnail":["https:\/\/naiveskill.com\/wp-content\/uploads\/2021\/05\/airflow_command.webp",150,79,false]},"uagb_author_info":{"display_name":"naivetechblog@gmail.com","author_link":"https:\/\/naiveskill.com\/author\/naivetechbloggmail-com\/"},"uagb_comment_info":2,"uagb_excerpt":"Apache airflow is an excellent open-source tool that lets you manage and run a complex data pipeline. 
Airflow has a straightforward user interface as well, using which we can easily … Airflow command line | Complete Airflow command in 2023 Read More \u00bb","_links":{"self":[{"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/posts\/25940"}],"collection":[{"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/comments?post=25940"}],"version-history":[{"count":1,"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/posts\/25940\/revisions"}],"predecessor-version":[{"id":36791,"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/posts\/25940\/revisions\/36791"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/media\/32594"}],"wp:attachment":[{"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/media?parent=25940"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/categories?post=25940"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/naiveskill.com\/wp-json\/wp\/v2\/tags?post=25940"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}