1. Do I need to provide access to Qubole while registering an Airflow datastore in QDS?
No, QDS does not need access to the Airflow datastore.
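For context, a data store can also be registered programmatically rather than through the UI. The snippet below is a minimal, unofficial sketch that assumes the DbTap endpoint (`/api/v1.2/db_taps`) and the field names shown; the host name, credentials, and payload keys are illustrative placeholders, so verify them against the REST API Reference before use.

```python
# Hypothetical sketch: registering a data store via the QDS REST API.
# The endpoint path and payload field names are assumptions; consult the
# QDS REST API Reference for the authoritative parameters.
import requests

QDS_API = "https://api.qubole.com/api/v1.2"
AUTH_TOKEN = "<your QDS account API token>"  # placeholder, not a real token

payload = {
    "db_name": "airflowdb",        # database the Airflow cluster will use
    "db_host": "mydb.example.com", # illustrative host name
    "db_user": "airflow",
    "db_passwd": "<password>",
    "port": 3306,
    "db_type": "mysql",
}

resp = requests.post(
    f"{QDS_API}/db_taps",
    headers={"X-AUTH-TOKEN": AUTH_TOKEN, "Content-Type": "application/json"},
    json=payload,
)
resp.raise_for_status()
print(resp.json())
```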