Import a Notebook on Microsoft Azure

POST /api/v1.2/notebooks/import

Use this API to import a notebook from a specified location and add it to the list of notebooks in the QDS account.

Required Role

The following roles can make this API call:

  • A user who is part of the system-user or system-admin group.

  • A user invoking this API must be part of a group associated with a role that allows submitting a command. See Managing Groups and Managing Roles for more information.

Parameters

Note

Parameters marked in bold below are mandatory. Others are optional and have default values. Presto is not currently supported on all Cloud platforms; see QDS Components: Supported Versions and Cloud Platforms.

Parameter

Description

name

The name of the notebook. It is a string that accepts alphanumeric characters.

location

The folder that contains the notebook. The accepted locations are: Users/current_user_email_id, Common, and Users. The default location is Users/current_user_email_id; it is equivalent to My Home on the Notebooks UI. You need privileges to create/edit notebooks in Common and Users. For more information, see Using Folders in Notebooks and Managing Folder-level Permissions.

note_type

The type of the notebook. The valid values are spark and presto.

file

Specify this parameter only to import a notebook from a location on the local hard disk. The complete location path must start with @/. For example, @/home/spark....

nbaddmode

Use this parameter only with the file parameter. Its value is import-from-computer.

url

The BLOB storage location, a valid JSON URL, or an ipynb URL of the notebook that you want to import.

Request API Syntax

Syntax to use for importing a notebook that is on Azure cloud storage.

curl -X "POST" -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"name":"<Name>", "location": "<Location>", "note_type": "<Note Type>", "url":"<BLOB storage location in the URL format/valid-JSON-URL/ipynb-URL>" }' \
"https://azure.qubole.com/api/v1.2/notebooks/import"

Syntax to use for importing a notebook that is on a local hard disk.

Note

When specifying the file location on the local hard disk, you must start it with @/ so that curl uploads the file from that absolute path. For example, -F "file=@/home/spark....

curl -X "POST" -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Accept: application/json" \
-F "name=<Name>" -F "location=<Location>" -F "note_type=<Note Type>" -F "file=@<local hard disk location>" -F "nbaddmode=import-from-computer" \
"https://azure.qubole.com/api/v1.2/notebooks/import"
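For clients that cannot shell out to curl, the multipart/form-data body that curl builds from the -F flags can be sketched by hand. This is a minimal illustration using only the standard library; the field values and filename below are placeholders, and a real client would typically use an HTTP library's own multipart support.

```python
# Sketch: hand-encode the multipart form that curl builds from -F flags.
# Field values and the file contents are placeholders.
import uuid

def build_multipart_body(fields, file_field, filename, file_bytes):
    """Encode plain form fields plus one file part as multipart/form-data."""
    boundary = uuid.uuid4().hex
    lines = []
    for key, value in fields.items():
        lines += [f"--{boundary}",
                  f'Content-Disposition: form-data; name="{key}"',
                  "",
                  value]
    lines += [f"--{boundary}",
              f'Content-Disposition: form-data; name="{file_field}"; filename="{filename}"',
              "Content-Type: application/octet-stream",
              "",
              file_bytes.decode("latin-1")]
    lines += [f"--{boundary}--", ""]
    body = "\r\n".join(lines).encode("latin-1")
    content_type = f"multipart/form-data; boundary={boundary}"
    return body, content_type

body, content_type = build_multipart_body(
    {"name": "SparkNote1",                     # placeholder values
     "location": "Users/[email protected]",
     "note_type": "spark",
     "nbaddmode": "import-from-computer"},
    "file", "SparkNoteb.ipynb", b'{"cells": []}')
# POST `body` to https://azure.qubole.com/api/v1.2/notebooks/import with the
# X-AUTH-TOKEN header and the returned Content-Type (which carries the boundary).
```

Note that the Content-Type header must carry the multipart boundary; setting it to application/json, as in a URL-based import, would break the file upload.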

Sample API Request

Here is an example to import a notebook from GitHub.

curl -X "POST" -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"name":"SparkNote", "location":"Users/[email protected]", "note_type":"spark",
     "url":"https://raw.githubusercontent.com/lightning-viz/lightning-example-notebooks/master/plots/scatter.ipynb"}' \
"https://azure.qubole.com/api/v1.2/notebooks/import"

Here is an example to import a Spark notebook from the local hard disk.

curl -X "POST" -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Accept: application/json" \
-F "name=SparkNote1" -F "location=Users/[email protected]" -F "note_type=spark" -F "file=@/home/spark/SparkNoteb.ipynb" \
-F "nbaddmode=import-from-computer" \
"https://azure.qubole.com/api/v1.2/notebooks/import"