Performing Read and Write Operations on Snowflake Data Stores
After adding a Snowflake data store to QDS, you can read data from and write data to it by using the Qubole Dataframe API for Apache Spark. You can also run DDL commands on the Snowflake data warehouse by using the Qubole Spark Scala API.
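For orientation, the following is a minimal sketch of a read, a write, and a DDL statement from Spark. It uses the open-source Snowflake Spark connector (net.snowflake.spark.snowflake) rather than the Qubole-specific wrappers documented in the topics below; the connection options, table names, and the SnowflakeReadWriteSketch object are illustrative placeholders. With a configured QDS Snowflake data store, connection details are supplied by the data store instead of being hard-coded.

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}
import net.snowflake.spark.snowflake.Utils

object SnowflakeReadWriteSketch {
  // Placeholder connection options; with a QDS Snowflake data store these
  // values are managed by the data store, so this map is illustrative only.
  val sfOptions: Map[String, String] = Map(
    "sfURL"       -> "<account>.snowflakecomputing.com",
    "sfUser"      -> "<user>",
    "sfPassword"  -> "<password>",
    "sfDatabase"  -> "<database>",
    "sfSchema"    -> "PUBLIC",
    "sfWarehouse" -> "<warehouse>"
  )

  val SNOWFLAKE_SOURCE = "net.snowflake.spark.snowflake"

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("snowflake-example").getOrCreate()

    // Read a Snowflake table into a Spark DataFrame.
    val df = spark.read
      .format(SNOWFLAKE_SOURCE)
      .options(sfOptions)
      .option("dbtable", "SOURCE_TABLE")
      .load()

    // Write the DataFrame to another Snowflake table.
    df.write
      .format(SNOWFLAKE_SOURCE)
      .options(sfOptions)
      .option("dbtable", "TARGET_TABLE")
      .mode(SaveMode.Overwrite)
      .save()

    // Run a DDL statement directly against the Snowflake warehouse.
    Utils.runQuery(sfOptions, "CREATE TABLE IF NOT EXISTS TARGET_TABLE_COPY LIKE TARGET_TABLE")

    spark.stop()
  }
}
```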
Alternatively, you can perform the following tasks on the Snowflake data store:
Query the Snowflake data store by using the query composer on the Analyze page. See Composing a DB Query.
Import data from the Snowflake data store either by using the command composer on the Analyze page or by using the DB Import command. See Composing a Data Import Command through the UI and Submit a DB Import Command.
Export data to the Snowflake data store either by using the command composer on the Analyze page or by using the DB Export command. See Composing a Data Export Command through the UI and Submit a DB Export Command.
Read and write operations using the Qubole Dataframe API for Apache Spark, as well as running DDL commands, are explained in the following topics: