Retrieve Azure Synapse role-based access control (RBAC) information using PowerShell
Azure Synapse Analytics is a limitless analytics service that brings together data integration, enterprise data warehousing and big data analytics. It gives you the freedom to query data on your terms, using either serverless or dedicated resources—at scale. Azure Synapse brings these worlds together with a unified experience to ingest, explore, prepare, manage and serve data for immediate BI and machine learning needs.
Synapse RBAC extends the capabilities of Azure RBAC for Synapse workspaces and their content.
Synapse RBAC is used to manage who can:
- Publish code artifacts and list or access published code artifacts
- Execute code on Apache Spark pools and Integration runtimes
- Access linked (data) services protected by credentials
- Monitor or cancel job execution, and review job output and execution logs
Azure Synapse RBAC has built-in roles and scopes that help you manage permissions in Azure Synapse Analytics:
| Role | Permissions | Scopes |
|---|---|---|
| Synapse Administrator | Full Synapse access to SQL pools, Data Explorer pools, Apache Spark pools, and Integration runtimes. Includes create, read, update, and delete access to all published code artifacts. Includes Compute Operator, Linked Data Manager, and Credential User permissions on the workspace system identity credential. Includes assigning Synapse RBAC roles. In addition to Synapse Administrators, Azure Owners can also assign Synapse RBAC roles. Azure permissions are required to create, delete, and manage compute resources. Can read and write artifacts; can do all actions on Spark activities; can view Spark pool logs; can view saved notebook and pipeline output; can use the secrets stored by linked services or credentials; can assign and revoke Synapse RBAC roles at current scope. | Workspace, Spark pool, Integration runtime, Linked service, Credential |
| Synapse Apache Spark Administrator | Full Synapse access to Apache Spark pools. Create, read, update, and delete access to published Spark job definitions, notebooks and their outputs, and to libraries, linked services, and credentials. Includes read access to all other published code artifacts. Doesn't include permission to use credentials and run pipelines. Doesn't include granting access. Can do all actions on Spark artifacts; can do all actions on Spark activities. | Workspace, Spark pool |
| Synapse SQL Administrator | Full Synapse access to serverless SQL pools. Create, read, update, and delete access to published SQL scripts, credentials, and linked services. Includes read access to all other published code artifacts. Doesn't include permission to use credentials and run pipelines. Doesn't include granting access. Can do all actions on SQL scripts; can connect to SQL serverless endpoints with SQL db_datareader, db_datawriter, connect, and grant permissions. | Workspace |
| Synapse Contributor | Full Synapse access to Apache Spark pools and Integration runtimes. Includes create, read, update, and delete access to all published code artifacts and their outputs, including credentials and linked services. Includes compute operator permissions. Doesn't include permission to use credentials and run pipelines. Doesn't include granting access. Can read and write artifacts; can view saved notebook and pipeline output; can do all actions on Spark activities; can view Spark pool logs. | Workspace, Spark pool, Integration runtime |
| Synapse Artifact Publisher | Create, read, update, and delete access to published code artifacts and their outputs. Doesn't include permission to run code or pipelines, or to grant access. Can read published artifacts and publish artifacts; can view saved notebook, Spark job, and pipeline output. | Workspace |
| Synapse Artifact User | Read access to published code artifacts and their outputs. Can create new artifacts but can't publish changes or run code without additional permissions. | Workspace |
| Synapse Compute Operator | Submit Spark jobs and notebooks and view logs. Includes canceling Spark jobs submitted by any user. Requires additional use credential permissions on the workspace system identity to run pipelines and view pipeline runs and outputs. Can submit and cancel jobs, including jobs submitted by others; can view Spark pool logs. | Workspace, Spark pool, Integration runtime |
| Synapse Monitoring Operator | Read published code artifacts, including logs and outputs for notebooks and pipeline runs. Includes the ability to list and view details of serverless SQL pools, Apache Spark pools, Data Explorer pools, and Integration runtimes. Requires additional permissions to run/cancel pipelines, Spark notebooks, and Spark jobs. | Workspace |
| Synapse Credential User | Runtime and configuration-time use of secrets within credentials and linked services in activities like pipeline runs. To run pipelines, this role is required, scoped to the workspace system identity. Scoped to a credential, permits access to data via a linked service that is protected by the credential (also requires compute use permission); allows execution of pipelines protected by the workspace system identity credential (with additional compute use permission). | Workspace, Linked service, Credential |
| Synapse Linked Data Manager | Creation and management of managed private endpoints, linked services, and credentials. Can create managed private endpoints that use linked services protected by credentials. | Workspace |
| Synapse User | List and view details of SQL pools, Apache Spark pools, Integration runtimes, and published linked services and credentials. Doesn't include other published code artifacts. Can create new artifacts but can't run or publish without additional permissions. Can list and read Spark pools and Integration runtimes. | Workspace, Spark pool, Linked service, Credential |
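These built-in roles and their definition IDs can also be listed directly from PowerShell. The snippet below is a minimal sketch using the Get-AzSynapseRoleDefinition cmdlet from the Az.Synapse module; the workspace name is a placeholder, and the exact output properties may vary slightly between module versions.

```powershell
# List the built-in Synapse RBAC role definitions for a workspace
# (replace "contoso-synapse" with your own workspace name)
Get-AzSynapseRoleDefinition -WorkspaceName "contoso-synapse" |
    Select-Object Name, Id |
    Sort-Object Name
```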
There are multiple ways that the Synapse RBAC roles can be configured.
The easiest and most user-friendly way is to perform this action through Synapse Studio in the Azure Synapse workspace.
In PowerShell, there are a number of cmdlets that help you manage or retrieve Synapse RBAC information.
Example:
- Get-AzSynapseRoleAssignment - Gets a Synapse Analytics role assignment.
- New-AzSynapseRoleAssignment - Creates a Synapse Analytics role assignment.
- Remove-AzSynapseRoleAssignment - Deletes a Synapse Analytics role assignment.
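As a quick illustration, here is a minimal sketch of how these cmdlets are typically called. The workspace name, role name, and GUIDs below are placeholders, and the exact parameter sets may vary by Az.Synapse module version.

```powershell
# Get all Synapse RBAC role assignments for a workspace
Get-AzSynapseRoleAssignment -WorkspaceName "contoso-synapse"

# Assign the Synapse Monitoring Operator role to a user, group, or
# service principal identified by its Azure AD object ID (placeholder GUID)
New-AzSynapseRoleAssignment -WorkspaceName "contoso-synapse" `
    -RoleDefinitionName "Synapse Monitoring Operator" `
    -ObjectId "00000000-0000-0000-0000-000000000000"

# Remove a role assignment by its assignment ID (placeholder GUID)
Remove-AzSynapseRoleAssignment -WorkspaceName "contoso-synapse" `
    -RoleAssignmentId "00000000-0000-0000-0000-000000000000"
```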
One of the key difficulties customers face while retrieving or assigning role-based access control in Azure Synapse Analytics is that they cannot find the corresponding usernames, group names, or service principal names using the PowerShell cmdlet "Get-AzSynapseRoleAssignment". The cmdlet provides only limited information, and its output is difficult to interpret because it contains object IDs rather than names.
Example: (the following output was captured from "Get-AzSynapseRoleAssignment")
To retrieve the additional information, users can use the following example scripts, which resolve the object IDs to usernames and other relevant information for Azure Synapse RBAC.
Get all the Synapse RBAC information:
The PowerShell script below helps map the RBAC object IDs to usernames, groups, and service principals.
The script output provides all the RBAC information in an Azure Synapse Analytics workspace.
Note: This is only an example for retrieving the information and is not intended for production use.
Script Name: GetSynapseRBACInfo.ps1
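The full script body is not reproduced here, but a minimal sketch along these lines might look like the following. It assumes the Az.Synapse and Az.Resources modules are installed and an active Connect-AzAccount session, and it resolves each assignment's ObjectId against Azure AD users, groups, and service principals; property names may differ slightly between module versions.

```powershell
# GetSynapseRBACInfo.ps1 (illustrative sketch, not the original script)
# Maps Synapse RBAC role assignments to user, group, or service principal names.
param(
    [Parameter(Mandatory = $true)]
    [string]$WorkspaceName
)

# Helper: resolve an Azure AD object ID to a display name and object type
function Resolve-AadObject {
    param([string]$ObjectId)

    $user = Get-AzADUser -ObjectId $ObjectId -ErrorAction SilentlyContinue
    if ($user) { return [pscustomobject]@{ Name = $user.DisplayName; Type = 'User' } }

    $group = Get-AzADGroup -ObjectId $ObjectId -ErrorAction SilentlyContinue
    if ($group) { return [pscustomobject]@{ Name = $group.DisplayName; Type = 'Group' } }

    $sp = Get-AzADServicePrincipal -ObjectId $ObjectId -ErrorAction SilentlyContinue
    if ($sp) { return [pscustomobject]@{ Name = $sp.DisplayName; Type = 'ServicePrincipal' } }

    return [pscustomobject]@{ Name = '<unknown>'; Type = 'Unknown' }
}

# Cache role definitions so each role ID is looked up only once
$roleDefinitions = @{}
Get-AzSynapseRoleDefinition -WorkspaceName $WorkspaceName |
    ForEach-Object { $roleDefinitions[$_.Id.ToString()] = $_.Name }

# Resolve every role assignment in the workspace
Get-AzSynapseRoleAssignment -WorkspaceName $WorkspaceName | ForEach-Object {
    $principal = Resolve-AadObject -ObjectId $_.ObjectId
    [pscustomobject]@{
        PrincipalName = $principal.Name
        PrincipalType = $principal.Type
        Role          = $roleDefinitions[$_.RoleDefinitionId.ToString()]
        ObjectId      = $_.ObjectId
        Scope         = $_.Scope
    }
} | Format-Table -AutoSize
```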
Example: (the following is the output of the above script)
Get specific user RBAC information:
The PowerShell script below helps map a specific user to their Synapse RBAC assignments.
The script provides the information for a specific username, group, or service principal.
Note: This is only an example for retrieving the information and is not intended for production use.
Script Name: GetSynapseRBACUser.ps1
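As with the previous script, the exact body is not included here; a minimal sketch under the same assumptions (Az.Synapse and Az.Resources installed, an active Connect-AzAccount session) might look like this. The parameter names, the name-to-object-ID resolution, and the -ObjectId filter on Get-AzSynapseRoleAssignment are illustrative choices, not the author's original implementation.

```powershell
# GetSynapseRBACUser.ps1 (illustrative sketch, not the original script)
# Shows the Synapse RBAC assignments for a specific user, group, or service principal.
param(
    [Parameter(Mandatory = $true)]
    [string]$WorkspaceName,

    [Parameter(Mandatory = $true)]
    [string]$PrincipalName   # AAD UPN, group display name, or service principal name
)

# Resolve the supplied name to an Azure AD object
$principal = Get-AzADUser -UserPrincipalName $PrincipalName -ErrorAction SilentlyContinue
if (-not $principal) { $principal = Get-AzADGroup -DisplayName $PrincipalName -ErrorAction SilentlyContinue }
if (-not $principal) { $principal = Get-AzADServicePrincipal -DisplayName $PrincipalName -ErrorAction SilentlyContinue }
if (-not $principal) { throw "Could not find '$PrincipalName' as a user, group, or service principal." }

# Map role definition IDs to role names
$roleDefinitions = @{}
Get-AzSynapseRoleDefinition -WorkspaceName $WorkspaceName |
    ForEach-Object { $roleDefinitions[$_.Id.ToString()] = $_.Name }

# List only the assignments that belong to the resolved principal
Get-AzSynapseRoleAssignment -WorkspaceName $WorkspaceName -ObjectId $principal.Id |
    ForEach-Object {
        [pscustomobject]@{
            PrincipalName = $PrincipalName
            Role          = $roleDefinitions[$_.RoleDefinitionId.ToString()]
            Scope         = $_.Scope
        }
    } | Format-Table -AutoSize
```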
Example: (the following is the output of the above script)
How to execute the scripts?
There are multiple ways you can execute the scripts.
- From the local host:
  - Copy the commands into PowerShell scripts.
  - Save the PowerShell scripts as "GetSynapseRBACInfo.ps1" and "GetSynapseRBACUser.ps1".
  - Execute the PowerShell scripts.
  - Note: The Az.Synapse and Az.Resources modules need to be installed.
- From Azure Cloud Shell:
  - Upload both scripts to Azure Cloud Shell.
  - Execute the scripts as in option 1.
The workspace name and the AAD username, AAD user group, or service principal name are required as parameters.
Example: (the following output was captured while the script requested the necessary parameters)
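For reference, under the assumptions of the sketches above (the parameter names are illustrative), installing the modules and running the scripts from a local PowerShell session could look like this:

```powershell
# Install the required modules (one-time setup)
Install-Module Az.Synapse -Scope CurrentUser
Install-Module Az.Resources -Scope CurrentUser

# Sign in to Azure
Connect-AzAccount

# Run the scripts (workspace and principal names are placeholders)
.\GetSynapseRBACInfo.ps1 -WorkspaceName "contoso-synapse"
.\GetSynapseRBACUser.ps1 -WorkspaceName "contoso-synapse" -PrincipalName "user@contoso.com"
```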