Managing DAGs on an Airflow Cluster
Managing DAG Explorer Permissions
To use the DAG Explorer on the Airflow cluster, you need the following permissions:
- View Files: Read access on the Object Storage and on the Airflow cluster.
- Download Files: Download access on the Object Storage and Read access on the Airflow cluster.
- Upload Files: Upload access on the Object Storage and Read access on the Airflow cluster. On Airflow 1.10.2 (QDS), you also need Cluster Admin access, which is granted by updating the permissions on the cluster.
- Delete Files: Delete access on the Object Storage and Update permission on the Airflow cluster.
Uploading and downloading DAGs are not available on Oracle Cloud.
Uploading and Downloading a DAG
You can upload and download Airflow Python DAG files to the account's default storage location, edit them in place, and sync them with Airflow clusters periodically (in the background) from the Airflow cluster page. New clusters pick up the files immediately and automatically. Existing clusters require a restart to sync immediately; otherwise, the files sync with the clusters within 5 minutes.
Perform the following steps to upload a DAG:
- Navigate to the Clusters page and click the Airflow cluster that you want to work with.
- Click Dag Explorer from the left pane.
The dag_logs, dags, plugins, and process_logs folders appear.
To upload files to your S3 bucket through QDS, you must configure a CORS policy on the bucket. For more information on CORS policy configuration, see Upload a File to Amazon S3 Bucket.
- Click the link against the dags folder and select the file that you want to upload. Once the upload is complete, the file appears under the dags folder.
- Verify the File Path and the DAG contents in the right pane and click Save.
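A CORS policy that permits uploads from the QDS UI might look like the following S3 CORS configuration. This is an illustrative sketch: the allowed origin is a placeholder for your QDS environment's domain, and your security requirements may call for tighter settings.

```json
[
    {
        "AllowedHeaders": ["*"],
        "AllowedMethods": ["GET", "PUT", "POST"],
        "AllowedOrigins": ["https://*.qubole.com"],
        "ExposeHeaders": ["ETag"]
    }
]
```

You can apply this configuration on the bucket's Permissions tab in the AWS S3 console, or with the `put-bucket-cors` AWS CLI command.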
To download a file from the dags folder, click the link of the corresponding file.
Configuring DAG Explorer Sync Location
You can configure the Remote Sync Location for the DAG Explorer (AWS) by following the instructions below:
- Log in to QDS and navigate to the Clusters page.
- Select an Airflow cluster from the list of clusters.
- Select the Dag Explorer tab from the left pane.
- Click the settings icon. The Sync Settings window appears, showing the default S3 location in the S3 Location field.
- Enter the new sync location in the S3 Location field and click Update and Push. The Remote Sync Location is updated.
The S3 location that you enter must contain two directories named DAGs and plugins so that the Airflow cluster can sync DAGs and plugins from the respective folders.
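Under the requirement above, the sync location would be laid out roughly as follows; the bucket name and prefix are placeholders:

```
s3://<bucket>/<prefix>/
├── DAGs/      # DAG .py files synced to the cluster
└── plugins/   # Airflow plugin files
```

The Airflow cluster pulls the contents of these two folders into its local dags and plugins directories on each sync.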