Azure HPC OnDemand Platform: Cloud HPC made easy.

As many customers look at running their HPC workloads in the cloud, onboarding effort and cost are key considerations. As an HPC administrator, you want to provide a unified user experience with minimal disruption, in which end users and cluster administrators can keep most of their on-premises environment while leveraging the power of running in the cloud.

 

The Specialized Workloads for Industry and Mission team, which works on some of the most complex HPC customer and partner scenarios, has built a solution accelerator, Azure HPC OnDemand Platform (aka az-hop), available in the Azure/az-hop public GitHub repository to help our HPC customers onboard faster. az-hop delivers a complete HPC cluster solution, ready for users to run applications, that is easy to deploy and manage for HPC administrators. az-hop leverages various Azure building blocks and can be used as-is, or easily customized and extended to meet any uncovered requirements.

 

Based on our experience from years of customer engagements, we have identified some common principles that are important to our customers and designed az-hop with these in mind:

  • A pre-packaged HPC cluster that is easy to deploy in an existing subscription and contains all the key building blocks and best practices to run a production HPC environment in Azure,
  • Unified and secure access for end users and administrators, so each can reuse their on-premises tools and scripts,
  • A solution to integrate applications under the same unified cloud experience,
  • Built on standards, common tools, and open building blocks, so it can be easily extended and customized to accommodate the unique requirements of each customer.

 


 

The HPC end-user workflow typically comprises three steps:

  • Prepare model: the user stages the data to be used by the application. Key features needed: fast data transfer and a home directory where they can upload their data, scripts, etc.
  • Run job: from a shell session or the web UI, the user submits the job, specifying the slot type and the number of nodes needed to run it. Key features needed: auto-scaling compute, a scheduler, and scratch storage.
  • Analyze results: once the job is finished, the user can visualize the results. Key feature needed: an interactive desktop.

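The "Run job" step above can be sketched as a scheduler job script. The snippet below is only an illustration: the slot type name, node counts, and application binary are assumptions for this sketch, not az-hop defaults.

```shell
#!/bin/bash
# Illustrative PBS job script; the slot type "hb120v3" and the
# application binary are assumptions, not az-hop defaults.
#PBS -N my_first_job
#PBS -l select=2:slot_type=hb120v3   # request 2 nodes of the chosen slot type
#PBS -l walltime=01:00:00            # maximum run time
#PBS -j oe                           # merge stdout and stderr

cd "$PBS_O_WORKDIR"                  # start from the submission directory

# Run the MPI application across the allocated nodes
mpirun ./my_app input.dat
```

Submitted with `qsub my_job.pbs`, after which the autoscaler provisions matching nodes and scales them back down once they go idle.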
 

The diagram below depicts the components needed in a typical on-premises environment to support this workflow.

 

[Diagram: components of a typical on-premises HPC environment]

 

The default az-hop environment supports this workflow with the following architecture, all accessed from the OnDemand portal for unified access, over HTTPS only for end users and SSH/HTTPS for administrators.

 

[Diagram: default az-hop architecture]

 

 

The unified experience is provided by the Open OnDemand web portal from the Ohio Supercomputer Center. Listed below are some of the features the current az-hop environment supports; check the releases as we add more:

  • Authentication managed by Active Directory,
  • Job submission from the CLI or the web UI through OpenPBS,
  • Dynamic resource provisioning and autoscaling by Azure CycleCloud, with pre-configured job queues and integrated health checks to quickly take non-optimal nodes out of rotation,
  • A common shared file system for home directories and applications, delivered by Azure NetApp Files,
  • A Lustre parallel filesystem using local NVMe for high performance, automatically archived to Azure Blob Storage using the Robinhood policy engine and the Azure Storage data mover,
  • Monitoring dashboards exposed in Grafana,
  • Remote visualization with noVNC and GPU acceleration with VirtualGL.

The whole solution is defined in a single configuration file and deployed with Terraform. Ansible playbooks apply the configuration settings and install the application packages. Packer builds the two main custom images, one for compute nodes and one for remote visualization, which are published into an Azure Shared Image Gallery.
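To give a feel for the single-configuration-file approach, here is a minimal sketch of what such a YAML definition can look like. The field names are simplified assumptions for illustration, not the exact az-hop schema; refer to the repository's documented configuration reference for the real one.

```yaml
# Illustrative sketch of a single-file environment definition.
# Field names are simplified assumptions, not the exact az-hop schema.
location: westeurope
resource_group: azhop-demo

queues:
  - name: hpc                 # job queue mapped to a compute node array
    vm_size: Standard_HB120rs_v3
    max_count: 16             # upper bound for autoscaling
  - name: viz                 # remote visualization nodes
    vm_size: Standard_NV12s_v3
    max_count: 2
```

Keeping the whole environment in one declarative file means the same definition drives Terraform for provisioning and Ansible for configuration, so an environment can be reviewed, versioned, and reproduced.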

 

The instructions to deploy your az-hop environment are available from this page. The az-hop GitHub repository comes with example tutorials that demonstrate how to integrate and run applications in the az-hop environment; you can follow them here to give it a test drive, or simply run your own.
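As a rough orientation, a deployment from a client machine might look like the following. The script names and flags here are assumptions for this sketch; follow the repository's deployment guide for the authoritative steps.

```shell
# Illustrative deployment flow; script names and flags are assumptions,
# not verified az-hop commands - follow the official deployment guide.
git clone --recursive https://github.com/Azure/az-hop.git
cd az-hop

# Edit the single configuration file to describe your environment
vi config.yml

# Provision the infrastructure with Terraform, then configure it with Ansible
./build.sh -a apply
./install.sh
```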

 

 

 
