Submit a Shell Command

POST /api/v1.2/commands/

Use this API to submit a shell command. Only Bash commands are supported.

Required Role

The following users can make this API call:

  • Users who belong to the system-user or system-admin group.

  • Users who belong to a group associated with a role that allows submitting a command. See Managing Groups and Managing Roles for more information.

Parameters

Note

command_type and either inline or script_location are mandatory. All other parameters are optional and have default values.

  • inline: Inline script to run as a Shell command. Either inline or script_location is required.

  • script_location: An S3 path where the shell script to run is stored. Either inline or script_location is required. AWS storage credentials stored in the account are used to access the script.

  • command_type: The command type. Set it to ShellCommand.

  • files: List of files in an S3 bucket, in the format file1,file2. These files are copied to the working directory where the command is executed.

  • archives: List of archives in an S3 bucket, in the format archive1,archive2. These are unarchived in the working directory where the command is executed, into a folder named after the archive file, including its extension. For example, if the archive file is s3://<bucket>/abc.tar, it is uncompressed into <working directory>/abc.tar. To refer to a file src/a.py inside the archive, use abc.tar/src/a.py.

  • macros: Expressions to evaluate the macros used in the shell command. Macros are valid assignment statements containing a variable and its expression, in the form macros: [{"<variable>":<variable-expression>}, {..}]. You can add more than one variable. For more information, see Macros and Macros in Scheduler.

  • label: Specify the cluster label on which this command is to be run.

  • retry: Denotes the number of retries for a job. Valid values are 1, 2, and 3.

  • retry_delay: Denotes the time interval between retries when a job fails. The unit of measurement is minutes.

  • can_notify: Sends an email on command completion.

  • name: Adds a name to the command, which is useful when filtering commands in the command history. It cannot contain & (ampersand), < (less than), > (greater than), " (double quotes), or ' (single quote) characters, or HTML tags, and it can contain a maximum of 255 characters.

  • pool: Use this parameter to specify the Fairscheduler pool name for the command to use.

  • tags: Adds one or more tags to a command so that it is easily identifiable and searchable in the Commands History; a tag can also be used as a filter value when searching commands. A tag can contain a maximum of 255 characters, and a comma-separated list of tags can be associated with a single command. Enclose tag values in square brackets, for example {"tags":["<tag-value>"]}.

  • timeout: Timeout for command execution, in seconds. The default value is 129600 seconds (36 hours). QDS checks a command's timeout every 60 seconds, so a command whose timeout is set to 80 seconds is killed at the next check, that is, after 120 seconds. Set this parameter to keep a command from running for the full 36 hours.

For an example that sets several of these optional parameters, see the last example in the Examples section below.

Examples

Goal: Inline script

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{
        "inline":"hadoop dfs -lsr s3://paid-qubole/;", "command_type":"ShellCommand"
        }' \
"https://api.qubole.com/api/v1.2/commands"

Note

These examples use https://api.qubole.com as the endpoint. Qubole provides other endpoints to access QDS, which are described in Supported Qubole Endpoints on Different Cloud Providers.

Response:

HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
{
    "qlog":null,
    "created_at":"2015-01-12T11:50:21Z",
    "status":"waiting",
    "meta_data":{
        "results_resource":"commands/36/results",
        "logs_resource":"commands/36/logs"
    },
    "account_id":"1",
    "user_id":1,
    "pool":null,
    "submit_time":1421063421,
    "progress":0,
    "template":"generic",
    "pid":null,
    "resolved_macros":null,
    "label":"default",
    "timeout":null,
    "can_notify":false,
    "qbol_session_id":7,
    "command_source":"API",
    "name":null,
    "num_result_dir":-1,
    "end_time":null,
    "start_time":null,
    "path":"/tmp/2015-01-12/1/36",
    "id":36,
    "command_type":"ShellCommand",
    "command":{
        "files":null,
        "parameters":null,
        "script_location":null,
        "inline":"hadoop dfs -lsr s3://paid-qubole/;",
        "archives":null
    }
}
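
The command runs asynchronously, which is why status is returned as waiting. The meta_data block in the response points at per-command resources; once the command completes, its logs and results can be fetched from those paths (a sketch built from the logs_resource and results_resource values above):

curl -X GET -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Accept: application/json" \
"https://api.qubole.com/api/v1.2/commands/36/logs"

curl -X GET -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Accept: application/json" \
"https://api.qubole.com/api/v1.2/commands/36/results"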

Goal: Script_location

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{
        "script_location":"s3://paid-qubole/ShellDemo/data/excite-small.sh;", "command_type":"ShellCommand"
        }' \
"https://api.qubole.com/api/v1.2/commands"

Goal: Running shell commands using Files

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{
        "inline":"hadoop dfs -lsr s3://paid-qubole/;", "files":"s3://paid-qubole/ShellDemo/data/excite-small.sh,s3://paid-qubole/ShellDemo/data/excite-big.sh;", "command_type":"ShellCommand"
        }' \
"https://api.qubole.com/api/v1.2/commands"

Goal: Running shell commands using Archives

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{
        "inline":"hadoop dfs -lsr s3://paid-qubole/;", "archives":"s3://paid-qubole/ShellDemo/data/excite-small.gz,s3://paid-qubole/ShellDemo/data/excite-big.gz;", "command_type":"ShellCommand"
        }' \
"https://api.qubole.com/api/v1.2/commands"

Goal: Using Macros in a shell command

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{
        "inline" : "hadoop dfs -lsr s3://$location$/;", "command_type" : "ShellCommand",
        "macros" : [{"location" : "\"paid-qubole\""}]}' \
"https://api.qubole.com/api/v1.2/commands"

Note how the double quotes are escaped in the macro value: the expression for location must evaluate to the string paid-qubole, so the value is wrapped in escaped double quotes.

Goal: Submit a Shell Script

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"parameters" : "5454 5454", "command_type" : "ShellCommand"}' \
 "https://api.qubole.com/api/v1.2/commands"

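Goal: Setting optional parameters

The following sketch combines several of the optional parameters described above: a cluster label, retries with a retry delay, a timeout, a name, tags, and email notification. The label and tag values are placeholders.

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{
        "inline":"hadoop dfs -lsr s3://paid-qubole/;", "command_type":"ShellCommand",
        "label":"<cluster-label>", "retry":2, "retry_delay":5, "timeout":3600,
        "can_notify":true, "name":"daily-listing", "tags":["<tag-value>"]
        }' \
"https://api.qubole.com/api/v1.2/commands"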