Known Issues and Limitations
This topic describes the known issues and limitations related to the Qubole-Snowflake integration.
Creating a Snowflake data store might fail with an error.
When you create a Snowflake data store in QDS, it might fail with the following error:
`Cannot perform SELECT. This session does not have a current database. Call 'USE DATABASE', or use a qualified name.`
This error occurs when the user's default role does not have access to the database specified in the configuration. If the Snowflake user does not have a `default_role` assigned, and no role is specified in the connection string, the PUBLIC role is used, which might not have access to the database. Currently, you cannot specify a user role when creating a data store in QDS. Therefore, you must ensure that the default role associated with the credentials has access privileges to the database.
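Although the QDS data store configuration cannot carry a role, Spark commands that use the Snowflake connector directly can pass one through the standard `sfRole` connector option. The sketch below uses placeholder account, user, and warehouse values; only the option keys are standard connector options.

```python
# Sketch of Snowflake Spark connector options that name a role explicitly,
# so the session does not fall back to the PUBLIC role.
# All values are placeholders; the keys are standard connector options.
sfOptions = {
    "sfURL": "example_account.snowflakecomputing.com",  # placeholder account URL
    "sfUser": "example_user",
    "sfPassword": "example_password",
    "sfDatabase": "EXAMPLE_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "EXAMPLE_WH",
    "sfRole": "ANALYST",  # a role that has access to EXAMPLE_DB
}

# With a SparkSession `spark` available (for example, in a QDS notebook):
# df = (spark.read
#       .format("net.snowflake.spark.snowflake")
#       .options(**sfOptions)
#       .option("dbtable", "example_table")  # hypothetical table
#       .load())
```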
Canceling a PySpark paragraph for a Snowflake query in Notebooks does not cancel the corresponding Snowflake query in the Snowflake UI.
You must manually cancel the corresponding Snowflake query in the Snowflake UI.
Canceling a Scala paragraph for a Snowflake query in Notebooks does not cancel the query.
When you click the Cancel button to cancel a Scala paragraph for a Snowflake query in Notebooks, execution of the query is not canceled. You must click the Cancel button twice to cancel the Scala paragraph.
Queries with hyphens in database names fail.
If the database name contains a hyphen (-), you must use additional escape characters (`"\"database-name\""`) for the database name when composing the Spark command or Spark application.
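In source code, the extra escapes produce a double-quoted identifier, which Snowflake requires for names containing hyphens. A minimal sketch, assuming a hypothetical database named `database-name`:

```python
# Build the escaped database name for a Spark command.
# The string literal "\"database-name\"" evaluates to "database-name"
# including the double quotes, which Snowflake treats as a quoted identifier.
database_name = "database-name"        # hypothetical database with a hyphen
escaped = "\"" + database_name + "\""  # -> the 15-character string "database-name" in quotes

# With a SparkSession `spark` available, pass the escaped name, for example:
# spark.read.format("net.snowflake.spark.snowflake") \
#     .option("sfDatabase", escaped) \
#     ...
```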
Queries with column names containing whitespace or other non-standard characters fail.
You must use additional escape characters (`"\"column name\""`) for the column name in the query.
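The same double-quoting applies inside the query text itself. A minimal sketch, assuming a hypothetical table `example_table` with a column named `column name`:

```python
# Quote a column name containing whitespace so Snowflake parses it
# as a single quoted identifier rather than two tokens.
column = "column name"  # hypothetical column with whitespace
query = 'SELECT "%s" FROM example_table' % column

# The query string can then be passed to the connector, for example:
# spark.read.format("net.snowflake.spark.snowflake") \
#     .options(**sfOptions) \
#     .option("query", query) \
#     .load()
```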
DDL and DML queries are not supported by the Qubole Dataframe API for Apache Spark.
You can use the `runQuery()` method of the `Utils` object instead. See Snowflake documentation: Executing DDL/DML SQL Statements.
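`Utils.runQuery()` belongs to the connector's JVM-side `net.snowflake.spark.snowflake.Utils` object; from PySpark it is reachable through the JVM gateway. A hedged sketch follows, with placeholder connection values and the live call shown as a comment because it requires a running Spark cluster with Snowflake access:

```python
# Sketch: executing a DDL statement via the connector's Utils.runQuery,
# which runs SQL directly on Snowflake without producing a DataFrame.
sfOptions = {
    "sfURL": "example_account.snowflakecomputing.com",  # placeholder account URL
    "sfUser": "example_user",
    "sfPassword": "example_password",
    "sfDatabase": "EXAMPLE_DB",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "EXAMPLE_WH",
}
ddl = "CREATE TABLE IF NOT EXISTS example_table (id INTEGER)"  # hypothetical table

# With a SparkContext `sc` available (for example, in a QDS notebook):
# sc._jvm.net.snowflake.spark.snowflake.Utils.runQuery(sfOptions, ddl)
```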
The parallelism functionality is not supported with Snowflake data stores.
Snowflake does not support the parallelism functionality. As a result, parallelism does not work when importing data from a Snowflake data store, whether you use the command composer on the Analyze page or the DB Import command.
For more information about using the Snowflake Spark connector, see Snowflake documentation: Using the Spark Connector