Getting started with REST APIs for Azure Synapse Analytics - Apache Spark Pool


Author: Abid Nazir Guroo is a Program Manager in the Azure Synapse Customer Success Engineering (CSE) team.

 

Introduction

Azure Synapse Analytics Representational State Transfer (REST) APIs are secure HTTP service endpoints that support creating and managing Azure Synapse resources using Azure Resource Manager and Azure Synapse web endpoints. This article provides instructions on how to set up and use Synapse REST endpoints and describes the Apache Spark Pool operations supported by REST APIs.

 

Authentication

To perform any operation using Azure REST APIs, you need to authenticate the request with an Azure Active Directory (Azure AD) authentication token. This token can be generated by various interactive and non-interactive methods. In this tutorial we will use an Azure AD service principal to generate a token, using the following steps:

  1. Register an application with Azure AD and create a service principal. You can create a new service principal using the Azure Portal by following the steps outlined in Create an Azure AD app and service principal.
  2. Create a secret for the registered service principal and save it securely for later use. Instructions for creating a secret are documented in Create a secret for an application.
  3. Assign an appropriate role to the registered service principal on the Synapse workspace, based on the operations you want to perform using the REST APIs. You can find detailed information on all available Azure Synapse Analytics roles and corresponding permissions in Azure Synapse RBAC roles.

 

REST API Client

You can use various command-line utilities like cURL or PowerShell, programming languages, or GUI clients to interact with REST API endpoints. In this article we will use Postman, a widely used GUI application, for its simplicity and ease of setup. Postman can be downloaded from this link.

 

Example: How to get an Azure AD Access Token

The following parameters are needed to successfully get an Azure AD access token:

 

Tenant ID: The Azure Active Directory (tenant) ID where the application is registered.

Client ID: The Application (client) ID for the application registered in Azure AD.

Client secret: The value of the client secret for the application registered in Azure AD.

 

On the Postman UI

  1. Create a new HTTP request (File > New > HTTP Request or using the new tab (+) icon).
  2. In the HTTP verb drop-down list, select POST.
  3. For Enter request URL, enter https://login.microsoftonline.com/<tenant_id>/oauth2/token, where <tenant_id> is your Active Directory (tenant) ID.
  4. On the Headers tab, enter a new key Content-Type with value application/x-www-form-urlencoded.
  5. On the Body tab, select x-www-form-urlencoded and enter the following key-value pairs:
     grant_type: client_credentials
     client_id: <client_id>
     client_secret: <client_secret>
     resource: https://management.azure.com/

     


     

  6. Click Send. You will receive the bearer token in the JSON response to the request. Below is an example of the response:

 

 

{ "token_type": "Bearer", "expires_in": "", "ext_expires_in": "", "expires_on": "", "not_before": "", "resource": "https://management.azure.com/", "access_token": "<access token to be used for invoking REST APIs>" }

 

 

 

Note: The "resource" parameter value should be "https://management.azure.com/" or ""https://management.core.windows.net/" for control plane operations and "https://{workspace name}.dev.azuresynapse.net" for data plane operations.

 

Azure Synapse Analytics Apache Spark REST APIs

Apache Spark pools in Azure Synapse Analytics support the following REST API operations, which can be invoked using an HTTP endpoint:

 

Create Or Update (PUT): Create a new Apache Spark pool or modify the properties of an existing pool.

Delete (DELETE): Delete an Apache Spark pool.

Get (GET): Get the properties of an Apache Spark pool.

List By Workspace (GET): List all provisioned Apache Spark pools in a workspace.

Update (PATCH): Update the properties of an existing Apache Spark pool.

 

Detailed documentation of all the REST API operations supported by Azure Synapse Analytics can be found at this link.
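As a quick illustration of one of these operations, the sketch below calls List By Workspace from Python with the requests library (an assumption; the same call works from any HTTP client). The {placeholder} segments in the URL are assumed to be replaced with your own subscription, resource group and workspace names:

# Minimal sketch: list all Apache Spark pools in a workspace (List By Workspace).
import requests

access_token = "<access token from the authentication step>"
url = (
    "https://management.azure.com/subscriptions/{subscriptionId}"
    "/resourceGroups/{resourceGroupName}/providers/Microsoft.Synapse"
    "/workspaces/{workspaceName}/bigDataPools?api-version=2021-06-01-preview"
)

response = requests.get(url, headers={"Authorization": f"Bearer {access_token}"})
response.raise_for_status()
# ARM list responses wrap the results in a "value" array.
for pool in response.json().get("value", []):
    print(pool["name"], pool["properties"]["sparkVersion"])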

 

Example: How to invoke an Apache Spark REST endpoint to create or update a spark pool

The Create Or Update REST API can be used to create a new Apache Spark pool or change the configuration of an existing pool, including upgrading or downgrading the Spark runtime version of a pool. For example, an existing Apache Spark pool with Spark runtime version 3.1 can be upgraded to Spark version 3.2 without deleting the existing pool.

 

On the Postman UI

  1. Create a new HTTP request (File > New > HTTP Request or using the new tab(+) icon).
  2. In the HTTP verb drop-down list, select PUT.
  3. For Enter request URL, enter https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Synapse/workspaces/{workspaceName}/bigDataPools/{bigDataPoolName}?api-version=2021-06-01-preview, where {subscriptionId} is the ID of the subscription where the Azure Synapse workspace has been provisioned, {resourceGroupName} is the resource group of the workspace, {workspaceName} is the name of the Synapse workspace, and {bigDataPoolName} is the name of the Apache Spark pool.
  4. On the Headers tab, enter a new key Content-Type with value application/json.
  5. Click on the Authorization tab, select Bearer Token in the Type drop-down menu, and enter the Azure AD token generated in the previous example.
  6. On the Body tab, select raw, choose JSON in the drop-down menu, and enter the target Spark pool JSON definition in the body.


 

Example: 

 

 

{ "location": "West US 2", "properties": { "sparkVersion": "3.2", "nodeCount": 6, "nodeSize": "Medium", "nodeSizeFamily": "MemoryOptimized", "autoScale": { "enabled": false, "minNodeCount": 6, "maxNodeCount": 6 }, "autoPause": { "enabled": true, "delayInMinutes": 30 } } }

 

 

 

A detailed list of configuration properties supported by Apache Spark pools is documented at this link.

7. Click the Send button. A successful create or update operation will return a detailed provisioning JSON response like the sample below:

 

 

{ "properties": { "creationDate": "2022-10-28T21:03:07.2066667Z", "sparkVersion": "3.2", "nodeCount": 6, "nodeSize": "Medium", "nodeSizeFamily": "MemoryOptimized", "autoScale": { "enabled": false, "minNodeCount": 6, "maxNodeCount": 6 }, "autoPause": { "enabled": true, "delayInMinutes": 30 }, "isComputeIsolationEnabled": false, "sessionLevelPackagesEnabled": false, "cacheSize": 50, "dynamicExecutorAllocation": { "enabled": false }, "lastSucceededTimestamp": "2022-10-28T22:42:19.23Z", "isAutotuneEnabled": false, "provisioningState": "Provisioning" }, "id": "/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Synapse/workspaces/{workspaceName}/bigDataPools/{bigDataPoolName}", "name": "{bigDataPoolName}", "type": "Microsoft.Synapse/workspaces/bigDataPools", "location": "westus2" }

 

 

 

Note: You can include an optional Boolean (true/false) force parameter in the request URL to stop all currently running sessions/jobs on the target Spark pool.

 

The new request URL with forced termination of sessions: https://management.azure.com/subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/Microsoft.Synapse/workspaces/{workspaceName}/bigDataPools/{bigDataPoolName}?api-version=2021-06-01-preview&force=true
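For reference, here is a minimal Python sketch of the same Create Or Update call using the requests library (an assumption; Postman, cURL or PowerShell work equally well), including the optional force parameter:

# Minimal sketch: create or update an Apache Spark pool via the REST API.
# Replace the placeholder values with your own subscription, resource group,
# workspace, pool name and access token.
import requests

subscription_id = "<subscriptionId>"
resource_group = "<resourceGroupName>"
workspace_name = "<workspaceName>"
pool_name = "<bigDataPoolName>"
access_token = "<access token from the authentication step>"

url = (
    f"https://management.azure.com/subscriptions/{subscription_id}"
    f"/resourceGroups/{resource_group}/providers/Microsoft.Synapse"
    f"/workspaces/{workspace_name}/bigDataPools/{pool_name}"
    "?api-version=2021-06-01-preview&force=true"  # force=true stops running sessions/jobs
)

body = {
    "location": "West US 2",
    "properties": {
        "sparkVersion": "3.2",
        "nodeCount": 6,
        "nodeSize": "Medium",
        "nodeSizeFamily": "MemoryOptimized",
        "autoScale": {"enabled": False, "minNodeCount": 6, "maxNodeCount": 6},
        "autoPause": {"enabled": True, "delayInMinutes": 30},
    },
}

headers = {
    "Authorization": f"Bearer {access_token}",
    "Content-Type": "application/json",
}

response = requests.put(url, headers=headers, json=body)
response.raise_for_status()
print(response.json()["properties"]["provisioningState"])

Provisioning is asynchronous: the initial response typically reports a provisioningState of Provisioning, and the Get operation can be polled until the state changes.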

 

Summary

Azure Synapse Analytics REST APIs can be used to operate various Synapse services. For Apache Spark pools, the REST APIs can be used to upgrade or downgrade the Spark runtime version, change auto-scale configuration, manage libraries, and more.

 

Our team publishes blogs regularly and you can find all of them here: https://aka.ms/synapsecseblog

 

For a deeper understanding of Synapse implementation best practices, please refer to our Success By Design (SBD) site: https://aka.ms/Synapse-Success-By-Design

 
