Create a Partition Sensor API

POST /api/v1.2/partition_sensor

Use this API to create a partition sensor. Airflow uses partition sensors to programmatically monitor workflows.

Required Role

The following users can make this API call:

  • Users who belong to the system-user or system-admin group.
  • Users who belong to a group associated with a role that allows creating a sensor. See Managing Groups and Managing Roles for more information.

All of the parameters listed below are mandatory.

Parameter    Description
schema       The name of the database that contains the Hive table partition to be monitored.
table        The name of the Hive table that contains the partition to be monitored.
columns      An array of Hive table column names and their corresponding values.
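As a sketch of how the payload is assembled from these parameters, the following Python snippet builds the JSON body and the request object without sending it. The schema, table, and column values are placeholder examples, and https://api.qubole.com is assumed as the endpoint; substitute your own values and endpoint.

```python
import json
import os
import urllib.request

# Placeholder values; replace with your own schema, table, and columns.
payload = {
    "schema": "default",
    "table": "hivetable",
    "columns": [{"column": "dt", "values": ["2017-05-19"]}],
}

# https://api.qubole.com is assumed as the default endpoint; see
# Supported Qubole Endpoints on Different Cloud Providers for alternatives.
req = urllib.request.Request(
    "https://api.qubole.com/api/v1.2/partition_sensor",
    data=json.dumps(payload).encode(),
    headers={
        "X-AUTH-TOKEN": os.environ.get("AUTH_TOKEN", ""),
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would submit the request; it is not
# executed here.
```

Note that "values" is a JSON array even when a single partition value is supplied.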

Request API Syntax

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"schema": "<Database Name>", "table":"<Hive table name>",
     "columns":[{"column":"<column name>", "values":["<value>"]}]}' \
"https://api.qubole.com/api/v1.2/partition_sensor"

The above syntax uses https://api.qubole.com as the endpoint. Qubole provides other endpoints to access QDS that are described in Supported Qubole Endpoints on Different Cloud Providers.

Sample API Request

Here is an example of creating a Hive table partition sensor.

curl -X POST -H "X-AUTH-TOKEN: $AUTH_TOKEN" -H "Content-Type: application/json" -H "Accept: application/json" \
-d '{"schema": "default", "table":"hivetable", "columns":[{"column":"dt", "values":["2017-05-19"]}]}' \
"https://api.qubole.com/api/v1.2/partition_sensor"

A request that returns a 200 response code contains only a status field, which is either true or false. A request that returns a 422 response code also contains an error message.

{"status": "true"}
{"error": {"error_code": 422, "error_message": "Table can't be found in metastore"}}
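The two response shapes above can be handled with a small branch on the response code. The following Python sketch parses both sample bodies; the helper name parse_sensor_response is a hypothetical one introduced here for illustration.

```python
import json

def parse_sensor_response(status_code, body):
    """Interpret a partition-sensor API response (sketch).

    A 200 response carries only a "status" field; a 422 response
    carries an "error" object with a code and a message.
    """
    data = json.loads(body)
    if status_code == 200:
        return {"ok": True, "status": data["status"]}
    if status_code == 422:
        err = data["error"]
        return {"ok": False, "code": err["error_code"],
                "message": err["error_message"]}
    raise ValueError(f"unexpected response code: {status_code}")

# The two sample responses from the documentation:
print(parse_sensor_response(200, '{"status": "true"}'))
print(parse_sensor_response(
    422,
    '{"error": {"error_code": 422, "error_message": '
    '"Table can\'t be found in metastore"}}'))
```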