Hello and welcome to the first post in a three-part series that will help you get up and running with the Azure REST API. More specifically, I’ll show you how to use the Azure API to interact with your Azure NetApp Files resources.

Getting started with REST APIs can be a little tricky. There are several components to a REST API call; combine that with the various types of authentication and things can get pretty overwhelming. Microsoft has chosen to use what is called a ‘bearer token’ as the authentication method for their Azure Management API.

I’ll call this first part ‘bearer of good tokens’.

This ‘bearer token’ is unique to you and your Azure subscription. It needs to be passed as part of your REST API call in order to prove to Azure that you are authorized to interact with your Azure resources. This token should be treated as a very sensitive bit of information. Keep it in a secure place and don’t accidentally commit it to a public code repository. (been there, done that!)

“This sounds exciting, how do I get my bear token!?”

Erm… that’s ‘bearer token’, and great question! Microsoft has made this quite easy, and I have broken it down into three simple steps:

  1. Create an Azure ‘service principal’
  2. Install and Configure Insomnia Core (REST API client)
  3. Issue a POST request to https://login.microsoftonline.com

Let’s dive into each of these a little bit deeper…

1. Create an Azure ‘service principal’

I think the easiest way to do this is to use the Azure CLI (az cli):

az ad sp create-for-rbac --name "AzureNetAppFilesSP"

If you are not familiar with the Azure CLI, go check out this getting started guide.

The output (in JSON) should look something like this:

{
 "appId": "11111111-1111-1111-1111-111111111111",
 "displayName": "AzureNetAppFilesSP",
 "name": "http://AzureNetAppFilesSP",
 "password": "22222222-2222-2222-2222-222222222222",
 "tenant": "33333333-3333-3333-3333-333333333333"
}

We’ll need some of this information a bit later.
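
Just to give you a preview of step 3, the token request itself is a single POST to login.microsoftonline.com. I’ll walk through it with Insomnia in the next post, but purely as a sketch, here is roughly what that same request looks like from PowerShell, using the placeholder values from the output above:

$tenantId = "33333333-3333-3333-3333-333333333333"   # 'tenant' from the output above
$appId    = "11111111-1111-1111-1111-111111111111"   # 'appId' from the output above
$secret   = "22222222-2222-2222-2222-222222222222"   # 'password' from the output above

$body = @{
    grant_type    = "client_credentials"
    client_id     = $appId
    client_secret = $secret
    resource      = "https://management.azure.com/"
}

# POST to the Azure AD token endpoint; the bearer token comes back as 'access_token'
$token = (Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Body $body).access_token

Don’t worry if that looks like a lot right now; we’ll break it all down, piece by piece, in the next post.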

Continue reading

Snapshot policies are here! Until today, if you wanted to schedule or automate the creation (and retention) of snapshots in the Azure NetApp Files service, you needed to BYOA (bring your own automation). Like any first-party Azure service, ANF supports all of the standard Azure APIs and automation tools, so this wasn’t terribly difficult and there was even a Logic App that was simple to deploy and took care of the heavy lifting.

But of course, Microsoft and NetApp continue to bring new features and more value to this great service. There is one tiny caveat: at this time, the snapshot policy feature is in preview. But don’t worry, the registration process is painless. We’ll have you creating snapshot policies in just a few minutes.

As a prerequisite, you’ll need Azure PowerShell installed and connected to your Azure account:

Install-Module -Name Az -AllowClobber -Scope CurrentUser
Connect-AzAccount

Once you have successfully connected to your Azure account, continue with registering the ‘ANFSnapshotPolicy’ feature:

Register-AzProviderFeature -ProviderNamespace Microsoft.NetApp -FeatureName ANFSnapshotPolicy

Lastly, verify the feature is ‘Registered’:

Get-AzProviderFeature -ProviderNamespace Microsoft.NetApp -FeatureName ANFSnapshotPolicy
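
Registration isn’t instant, so if you’d rather not keep re-running that command by hand, a quick polling loop like this works too (purely an optional convenience; it just checks the RegistrationState property every 30 seconds):

# Poll until the ANFSnapshotPolicy feature reports 'Registered'
while ((Get-AzProviderFeature -ProviderNamespace Microsoft.NetApp -FeatureName ANFSnapshotPolicy).RegistrationState -ne 'Registered') {
    Write-Host "Still registering... checking again in 30 seconds"
    Start-Sleep -Seconds 30
}
Write-Host "ANFSnapshotPolicy is registered!"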


The registration process should take about ten minutes, but your experience may vary slightly. At this point, you should see the ‘Data protection’ sub-heading and ‘Snapshot policy’ menu item within the Azure portal (take a look at the screenshot below). You are now ready to create your first snapshot policy. Head on over to the official Azure NetApp Files documentation to get started.

Continue reading

Controlling cloud spend is a massive challenge for cloud consumers of all sizes. Having good systems in place to monitor resources and provide proper alerting mechanisms when that consumption goes beyond expected levels is critical to any successful cloud deployment.

While Azure NetApp Files is a great enterprise-grade file service, the Azure metrics we have to work with today can be a bit limiting when it comes to capacity consumption. One of the current shortcomings is the absence of a metric that lets you alert on a percentage of space consumed at the capacity pool or volume level. Currently, the alert threshold needs to be specified in bytes… bytes?! Yes, BYTES. Furthermore, when a capacity pool or volume is resized, the corresponding alert threshold needs to be manually increased or decreased accordingly.
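
To put the bytes problem in perspective, here is a quick bit of arithmetic (the 4 TiB volume and 90% threshold below are just made-up numbers for illustration) showing the kind of value you actually have to plug into an alert rule today:

# Hypothetical example: alert at 90% consumed on a 4 TiB volume
$volumeSizeBytes = 4TB                                        # PowerShell's 'TB' suffix means 1024^4 bytes
$alertThresholdBytes = [math]::Floor($volumeSizeBytes * 0.9)
Write-Host "Alert threshold: $alertThresholdBytes bytes"      # a 13-digit number... not exactly friendly

And remember, if that volume gets resized, this number has to be recalculated and pasted back into the alert rule by hand.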

After a few conversations with customers and colleagues, I decided to see if I could automate the creation and updating of Azure Monitor alert rules for Azure NetApp Files resources. Inspired by my good friend Kirk Ryan’s anfScheduler Logic App, I thought that might be a good place to start.

And that is how ANFAutoAlerts was born…

ANFAutoAlerts is an Azure Logic App that automates the creation, updating, and deletion of capacity-based alerts for Azure NetApp Files.

Continue reading

Hello and welcome! I guess this is seanluce.com 2.0. 3? 4? I’ve lost count. I felt like it was time for a fresh start. So this is it. A clean slate. With a new start comes a new content platform. I kicked the tires on a few static site generators like Jekyll, Gatsby, and Pelican (you can see a pretty exhaustive list at staticgen.com if you are curious). I eventually landed on Hugo for static site generation and Netlify to build, test, and deploy. I am still getting used to the workflow, but it goes something like this:

Continue reading

Sean Luce

Cloud Solutions Architect @NetApp

Michigan