Capturing Azure Storage Explorer REST API calls
Azure Storage Explorer is a powerful tool that simplifies working with Azure Storage services such as Blob, Queue, Table, and File storage. While it provides a user-friendly interface, there may be times when you need to inspect the requests it sends and the responses it receives from Azure. In such cases, Fiddler, a widely used web debugging proxy, can come to your rescue.

 

This step-by-step guide will walk you through the process of capturing a trace that you can use to evaluate such requests and responses using Fiddler.

 

Prerequisites:

  1. Install Azure Storage Explorer: Download and install the Azure Storage Explorer tool from the official Microsoft website.
  2. Install Fiddler: Download and install Fiddler Classic from the Telerik website.

 

Step 1: Launch Fiddler:

  1. After installing Fiddler, launch the application.
  2. You will see the window shown below. Note that at this point Fiddler is already capturing HTTP/HTTPS requests; you can confirm this through the "Capturing" indicator in the status bar at the bottom left corner:
    fiddler.png
  3. Since you won't need to do any capturing at this point, go ahead and click on the "Capturing" text. The text should then disappear, confirming that the tool is no longer capturing traffic:
    fiddler-not capturing.png

 

Step 2: Configure Fiddler for HTTPS requests capture

To ensure Fiddler captures traffic from Azure Storage Explorer, follow these steps:

  1. Click on the "Tools" menu in Fiddler.
  2. Select "Options"
    options.png
  3. In the Options window, go to the "HTTPS" tab
    https.png
  4. Check the "Capture HTTPS CONNECTs" and "Decrypt HTTPS traffic" options
    https2.png
  5. As soon as you check the "Decrypt HTTPS traffic" option, you will see the following warning. To be able to intercept the traffic from the Azure Storage Explorer tool, you will need to trust this root certificate, so click on the "Yes" button:

    warning.png

    certificate.png
  6. After doing that, the OS will show you two additional warnings. You'll again need to click on the "Yes" button:
    certificate2.png
    certificate3.png
  7. After doing that, you should see the following confirmation:
    certificate4.png
  8. Click the "OK" button to close the confirmation message
  9. Then click on the "Actions" button, and select the "Export Root Certificate to Desktop" option:
    certificate5.png
  10. Click the "OK" button to close the confirmation message
    certificate6.png
  11. Click the "OK" button twice to close the confirmation message and the "Options" window where you enabled the HTTPS traffic capture
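Before moving on, you can sanity-check that Fiddler's proxy endpoint is actually reachable. The sketch below is a minimal check, assuming Fiddler Classic's default listening address of 127.0.0.1:8888 (the port is configurable under "Tools"->"Options"->"Connections", so adjust it if you changed that setting):

```python
import socket

# Fiddler Classic's default proxy endpoint -- an assumption about your
# setup; change these values if you reconfigured Fiddler.
FIDDLER_HOST, FIDDLER_PORT = "127.0.0.1", 8888

def is_proxy_listening(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if is_proxy_listening(FIDDLER_HOST, FIDDLER_PORT):
    print("Fiddler's proxy port is reachable -- capture should work.")
else:
    print("Nothing is listening on the proxy port -- is Fiddler running?")
```

If the second message shows up even though Fiddler is open, double-check that capturing is enabled and that the port matches your Fiddler configuration.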

 

Step 3: Import the Fiddler certificate to the Azure Storage Explorer tool

  1. Open the Azure Storage Explorer tool
    ase.png
  2. Click on the "Edit" menu.
  3. Select the "SSL Certificates"->"Import Certificates" option
    ase2.png
  4. Select the "FiddlerRoot.cer" file and click on the "Open" button
    ase3.png
  5. At that point you should see the following confirmation message at the top: "Successfully imported 1 of 1 certificates. Storage Explorer must restart for the changes to take effect."
    ase4.png
  6. Go ahead and click on the "Restart Now" button and wait for the tool to come back

 

Step 4: Configure the proxy settings so that requests sent by the Azure Storage Explorer tool are routed through Fiddler

  1. Now that the Azure Storage Explorer tool is back, click on the "Edit" menu
  2. Select the "Configure proxy" option
    ase5.png
  3. Once there, make sure that the "Source" option selected is "Use system proxy":
    ase6.png
  4. Click on the "OK" button
  5. At this point you should have everything you need for Fiddler to capture and decrypt the HTTPS traffic coming from the Azure Storage Explorer tool
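Under the hood, "Use system proxy" simply means Storage Explorer resolves the OS-level proxy settings, which Fiddler Classic registers itself into (at its default 127.0.0.1:8888) while it is capturing. As a rough illustration only, not what Storage Explorer literally executes, here is how a process can resolve proxy settings; the proxy address is an assumption based on Fiddler's default:

```python
import os
import urllib.request

# Simulate a machine whose proxy points at Fiddler's default endpoint.
# (Using the lowercase env var, which urllib prefers when both casings
# are present.) On a real Windows box, Fiddler sets the system proxy
# itself while capturing.
os.environ["https_proxy"] = "http://127.0.0.1:8888"

# getproxies() merges environment variables with the OS proxy settings,
# which is essentially what "Use system proxy" resolves to.
proxies = urllib.request.getproxies()
print(proxies.get("https"))  # -> http://127.0.0.1:8888
```

Any HTTPS client that honors these settings, Storage Explorer included, will then send its traffic through Fiddler.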

 

Step 5: Start capturing traffic

Now, let's start capturing traffic from Azure Storage Explorer.

  1. If you haven't yet, sign in and connect to your Azure Subscription or Storage Account

  2. Expand the containers within the Storage Account so that we can focus on that operation. In my case I can see the "$logs", "$web", "blobinventory", "containerinventory", and a few other containers:
    ase7.png
  3. Since Fiddler was already set up to capture the traffic, go back to Fiddler

 

Step 6: Inspect Captured Traffic in Fiddler

  1. Switch back to Fiddler and start capturing traffic again by clicking on the place where you saw the "Capturing" text before, at the bottom left corner of the application:
    fiddler-capturing.png
  2. After re-enabling the traffic capture, you should see the text "Capturing":
    fiddler.png
  3. You'll then see how the app starts showing the list of captured requests in the left-hand panel. These correspond to any HTTP/HTTPS requests being sent from your working machine.
  4. At that point, go ahead and execute an operation from the Azure Storage Explorer tool and against your Storage Account
  5. Fiddler should show those requests as part of the list that was already showing after re-enabling the capture:
    ase8.png
  6. Now, a little trick that I always use is to identify any request coming from the "storageexplorer" process, so that I can apply a filter to see only the requests coming from it.
  7. To apply the filter, right-click on the request you are interested in, and then select the "Filter Now"->"Show Only Process={PID}" option, 37504 in my case:
    ase9.png
  8. After doing that, you will see only the requests coming from that process, which will make things a bit easier. Notice the filter being shown at the bottom left corner:
    ase10.png
  9. At that point, you can double-click on any of the requests and Fiddler will show the request and response details. In my case, I selected the request that gets the list of containers within my Storage Account. Here you can see that the Azure Storage Explorer tool sent a GET request and got an HTTP 200 response, which includes, among other things, the x-ms-request-id header. That header's value is typically very useful to share with the Microsoft Support team when asking for technical support.
    ase11.png
  10. Since the response that I got for this request was of type XML, I'm able to see the actual response body after clicking on the XML tab:
    ase12.png
  11. And that's it: that's how you capture requests coming from the Azure Storage Explorer tool.
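If you later want to post-process a capture, the same details you inspect in Fiddler can be pulled out programmatically. Below is a small sketch that extracts the x-ms-request-id header and the container names from a List Containers response; the body and header values here are made-up samples modeled on the capture described above:

```python
import xml.etree.ElementTree as ET

# Abbreviated, illustrative List Containers response body, modeled on
# what Fiddler's XML tab shows. The account and container names are
# sample values for this example.
sample_body = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ServiceEndpoint="https://mystorageaccount.blob.core.windows.net/">
  <Containers>
    <Container><Name>$logs</Name></Container>
    <Container><Name>$web</Name></Container>
    <Container><Name>blobinventory</Name></Container>
  </Containers>
  <NextMarker />
</EnumerationResults>"""

# Response headers as captured by Fiddler; x-ms-request-id is the value
# worth noting down before contacting Microsoft Support. The GUID below
# is a placeholder.
sample_headers = {
    "Content-Type": "application/xml",
    "x-ms-request-id": "00000000-0000-0000-0000-000000000000",
}

def container_names(body: str) -> list[str]:
    """Pull the container names out of a List Containers XML payload."""
    root = ET.fromstring(body)
    return [el.text for el in root.iter("Name")]

print("x-ms-request-id:", sample_headers["x-ms-request-id"])
print("containers:", container_names(sample_body))
# -> containers: ['$logs', '$web', 'blobinventory']
```

The same idea works for any XML response you export from Fiddler: save the body from the XML tab and feed it to a small parser like this one.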

 

Bonus tip #1!

You can also enable verbose logging within the Azure Storage Explorer tool to see some of this information. The output goes to log files, and the entries may not be as friendly as what you can see through Fiddler, but the information is there nevertheless. To enable it, go to the settings, look for the "Log Level" section, and set it to either "Debug" or "Trace":

ase13.png

Once that's done and after executing an operation against a Storage Account, you can just click on the "Help" menu, and select the "Open Logs Directory" option:

ase14.png

In my case, this is what I see regarding the REST API call to get the information on my Storage Account:

Name of the file in my case: 2023-10-12_115134_storage-account-extension_30988.log
ase15.png

 

I hope this helps you capture these traces when you need to troubleshoot the HTTPS requests that the Azure Storage Explorer tool sends to Azure.

 

References

Azure Storage Explorer troubleshooting guide
https://learn.microsoft.com/en-us/troubleshoot/azure/azure-storage/storage-explorer-troubleshooting

Capture web requests with Fiddler
https://learn.microsoft.com/en-us/power-query/web-connection-fiddler

Delete a Certificate
https://learn.microsoft.com/en-us/previous-versions/windows/it-pro/windows-server-2008-R2-and-2008/cc772354(v=ws.11)
