Composing a Data Export Command through the UI
Use the command composer on the Analyze page to compose a data export command. You can export Hive tables and Cloud storage directories to a data store. See Data Export for more information.
Prerequisites
You must have an existing data store to which the data is to be exported. Create a data store in Explore if it does not exist.
Note
Hadoop 2, Presto, and Spark clusters support database commands; these commands can also be run without bringing up a cluster. See Mapping of Cluster and Command Types for more information.
Composing a Hive Table Export Command
Note
See Using the Supported Keyboard Shortcuts in Analyze for the list of supported keyboard shortcuts.
Perform the following steps to compose a command to export a Hive table:
Navigate to the Analyze page and choose Compose.
Choose Data Export from the Command Type drop-down list. The Mode defaults to Hive Table Export; leave it unchanged.
Specify the Hive table in the HiveTable text field.
In the Hive Table Partition Spec text field, specify the Hive table partition specification to export data from a specific partition. Make sure that you provide the name and value of each partition in the following form:
dt=20130101/country=EU
Choose a data store from the Data Store drop-down list.
Choose a table from the DbTable drop-down list.
Choose a DB Update Mode. Append Mode is the default. The other two options are Update Only Mode and Insert and Update Mode (supported only for an Oracle MySQL database).
If you want to run the command on a Hadoop2 or Spark cluster, select the Use Hadoop2/Spark Cluster check box and choose the cluster label from the drop-down list.
Click Run to execute the command. Click Save if you want to re-run the same command later (see Workspace for more information on saving commands and queries).
You can see the result under the Results tab and the logs under the Logs tab. The Logs tab has the Errors and Warnings filter. For more information on how to download command results and logs, see Downloading Results and Logs.
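For reference, a comparable Hive table export can also be submitted programmatically through the commands REST API. The following Python sketch is a minimal illustration only: the endpoint path, authentication header, and payload field names (command_type, mode, hive_table, partition_spec, dbtap_id, db_table, db_update_mode), as well as the table and data store values, are assumptions based on common Qubole API conventions and should be verified against the API documentation for your account.

import requests  # third-party HTTP client

API_TOKEN = "<your-api-token>"                 # account API token (placeholder)
BASE_URL = "https://api.qubole.com/api/v1.2"   # assumed API endpoint

# Assumed payload field names; values shown are hypothetical.
payload = {
    "command_type": "DbExportCommand",         # assumed name of the data export command type
    "mode": 1,                                 # assumed: 1 = Hive table export
    "hive_table": "default.page_views",        # hypothetical Hive table to export
    "partition_spec": "dt=20130101/country=EU",
    "dbtap_id": 1234,                          # hypothetical ID of the target data store
    "db_table": "page_views_export",           # hypothetical destination table
    "db_update_mode": "allowinsert",           # assumed value corresponding to Append Mode
}

response = requests.post(
    BASE_URL + "/commands",
    headers={"X-AUTH-TOKEN": API_TOKEN, "Content-Type": "application/json"},
    json=payload,
)
response.raise_for_status()
print(response.json())                         # the response includes the command ID and status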
Composing a Directory Export Command
Perform the following steps to export a directory from a Cloud storage bucket to a data store:
Navigate to the Analyze page and choose Compose.
Choose Data Export from the Command Type drop-down list. Change the Mode to Directory Export.
In the Export Directory text field, specify the path of the Cloud storage directory.
If necessary, change the default value of the field separator in the Fields Terminated by text field.
Choose a data store from the Data Store drop-down list.
Choose a table from the DbTable drop-down list.
Choose a DB Update Mode. Append Mode is the default. The other two options are Update Only Mode and Insert and Update Mode (supported only for an Oracle MySQL database).
If you want to run the command on a Hadoop2 or Spark cluster, select the Use Hadoop2/Spark Cluster check box and choose the cluster label from the drop-down list.
Click Run to execute the command. Click Save if you want to re-run the same command later (see Workspace for more information on saving commands and queries).
You can see the command result under the Results tab and the command logs under the Logs tab. The Logs tab has the Errors and Warnings filter. For more information on how to download command results and logs, see Downloading Results and Logs.
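A directory export can be submitted programmatically in the same way. In the Python sketch below, mode, export_dir, and fields_terminated_by are again assumed field names, and the bucket path, field separator, and data store values are hypothetical; confirm the exact parameters against the API documentation for your account before relying on them.

import requests  # third-party HTTP client

API_TOKEN = "<your-api-token>"                 # account API token (placeholder)
BASE_URL = "https://api.qubole.com/api/v1.2"   # assumed API endpoint

# Assumed payload field names; values shown are hypothetical.
payload = {
    "command_type": "DbExportCommand",         # assumed name of the data export command type
    "mode": 2,                                 # assumed: 2 = directory export
    "export_dir": "s3://example-bucket/exports/2013-01-01/",  # hypothetical Cloud storage directory
    "fields_terminated_by": ",",               # field separator used in the exported files
    "dbtap_id": 1234,                          # hypothetical ID of the target data store
    "db_table": "page_views_export",           # hypothetical destination table
}

response = requests.post(
    BASE_URL + "/commands",
    headers={"X-AUTH-TOKEN": API_TOKEN, "Content-Type": "application/json"},
    json=payload,
)
response.raise_for_status()
print(response.json())                         # the response includes the command ID and status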