Uploading and Downloading a DAG on an Airflow Cluster

Uploading and Downloading a DAG

You can now upload and download Airflow Python DAG files to the account’s default storage location, edit them in place, and sync them with Airflow clusters periodically (in the background) from the Airflow cluster page. New clusters sync the files immediately and automatically. Existing clusters require a restart to pick up the files right away; otherwise, the files sync with those clusters within 5 minutes.

Perform the following steps to upload a DAG:

  1. Navigate to the Clusters page and click the Airflow cluster that you want to work with.
  2. Click Dag Explorer in the left pane.

The dag_logs, dags, plugins, and process_logs folders appear.

Note

To upload files to your S3 bucket using QDS, you must configure a CORS policy on the bucket. For more information on CORS policy configuration, see Uploading a File to Amazon S3 Buckets.
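As a rough illustration, an S3 CORS configuration for this purpose might look like the following. The allowed origin shown is an assumption; replace it with the URL of your QDS environment:

```json
[
  {
    "AllowedOrigins": ["https://api.qubole.com"],
    "AllowedMethods": ["GET", "PUT", "POST"],
    "AllowedHeaders": ["*"],
    "MaxAgeSeconds": 3000
  }
]
```

This is a sketch only; consult the linked topic for the exact policy your account requires.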

  3. Click the uploaddag link next to the dags folder and select the file you want to upload. Once the upload is complete, you can view the file under the dags folder.

  4. Verify the File Path and the DAG contents in the right pane and click Save.

To download a file from the dags folder, click the downloaddag link next to the corresponding file.
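Files placed in the dags folder are ordinary Airflow DAG definitions. A minimal sketch of such a file follows, assuming an Airflow 1.x cluster; the DAG id, schedule, and task shown are illustrative, not requirements:

```python
# Illustrative DAG file; dag_id, schedule, and task are examples only.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator  # Airflow 1.x import path

default_args = {
    "owner": "airflow",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

dag = DAG(
    dag_id="example_uploaded_dag",
    default_args=default_args,
    start_date=datetime(2019, 1, 1),
    schedule_interval="@daily",
)

hello = BashOperator(
    task_id="say_hello",
    bash_command="echo hello",
    dag=dag,
)
```

After the file syncs to the cluster, the DAG appears in the Airflow web UI under its dag_id.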

Managing Dag Explorer Permissions

To use the Dag Explorer on an Airflow cluster, you need the following permissions:

  • View Files: Provide Read access on the Airflow Cluster.
  • Download Files: Provide Download access on the Object Storage and Read access on the Airflow cluster.
  • Upload Files: Provide Upload access on the Object Storage and Read access on the Airflow cluster. For Airflow version 1.10.2.QDS, you must also have Cluster Admin access, which can be granted by updating permissions on the cluster.

For more information about granting permissions on the Airflow cluster and Object Storage, see Managing Cluster Permissions through the UI and Managing Access Permissions and Roles, respectively.