I brute forced #adventofcode day 8, part 1 in 3 hours. Part 2 is not looking good. I need to fix part 1 and make it more dynamic/smarter and then tackle part 2, I think. Part 1: Stand outside the forest and look inward. Part 2: Stand inside the forest and look outward. Can be consolidated to: Stand on a tree and look in each direction. The difficulty is real now. 🌲
I just completed "No Space Left On Device" - Day 7 - Advent of Code 2022 #AdventOfCode https://adventofcode.com/2022/day/7 https://github.com/seanluce/adventofcode2022 #PowerShell Thanks to @etb for the assist!
#dogsOfHachyderm Let’s see ‘em!
Stuck on Day 7 #adventofcode. Test input works as expected. Real input answer is too low. My hash table key is the full path of each directory so duplicate directory names shouldn’t be a problem. I must be doing something wrong with my recursion. I am newb.
Me taking a peek at the federated timeline…
Azure NetApp Files + Trident = Dynamic and Persistent Storage for Kubernetes
My eleven-year-old daughter recently asked me ’the question’. You know the one…
“Hey dadda, what’s Kubernetes?” 🔗
How do you explain Kubernetes to an eleven-year-old?! Check out this video to learn how. Brilliant.
I am not a Kubernetes expert. I am learning, probably just like you are. Creating resources, tearing down resources, building them again, kubectl-ing, scratching my head, scratching my head some more, rinse and repeat. I can understand how powerful it is. I can also understand how complex it is.
As an engineer, I am constantly trying to distill complex things down to their simplest form. I figure out how to do something by reading documentation, blogs, dissecting code, and a humorous amount of trial and error. If I don’t distill this information and document it, I’ll have to do it all over again in a few months as my memory is just awful. I do enjoy this process, and I hope these types of posts can help you get to where you are going a tiny bit faster.
If you want to deploy or demonstrate dynamically provisioned, persistent storage without being a Kubernetes expert, this is the post for you. 🔗
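To give a feel for what dynamic provisioning looks like in practice, here is a minimal sketch. The names (`anf-standard`, `demo-pvc`) and the 100Gi size are hypothetical, and it assumes Trident is already installed with an Azure NetApp Files backend configured. The StorageClass tells Trident how to provision; creating the PVC is what triggers the volume to be provisioned dynamically.

```shell
# Sketch only: names are hypothetical; assumes a working Trident install
# with an Azure NetApp Files backend already configured.
kubectl apply -f - <<'EOF'
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: anf-standard
provisioner: csi.trident.netapp.io
parameters:
  backendType: azure-netapp-files
---
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: demo-pvc
spec:
  accessModes: ["ReadWriteMany"]
  storageClassName: anf-standard
  resources:
    requests:
      storage: 100Gi
EOF
```

Once the PVC is bound, any pod that references `demo-pvc` gets a persistent volume backed by Azure NetApp Files, with no manual volume creation in between.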
Insomnia? REST with Azure NetApp Files: Part 2 of 3
I know you have all been waiting patiently for part two of this series. I’d like less time to go by between these multi-part posts, but sometimes life just gets in the way. In part one of this series, I showed you how to get started with Insomnia and how to get your ‘bearer token’. Now that we have these two boxes checked, we can begin to query the Azure Management API to get information about our Azure NetApp Files resources.
First, I’ll show you how to create a basic query to get a list of your NetApp Storage Accounts. For more information about the Azure NetApp Files storage hierarchy, head on over to docs.microsoft.com. Once we have a successful response to our basic query, we’ll create a more specific request and get some useful information about one of our Azure NetApp Files Volumes. Lastly, I’ll show you how to use cURL to query the API from the Linux command line. This can be really useful if you would like to integrate API calls into scripts or other automation tools.
Let’s get started.
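As a preview of where we’re headed, the cURL version of the basic query looks roughly like this. This is a sketch only: `<subscriptionId>` and `<token>` are placeholders you must supply yourself (the token comes from part one), and the `api-version` value may need adjusting to a version current for your environment.

```shell
# Sketch only: replace <token> and <subscriptionId> with your own values.
# Lists the NetApp accounts in a subscription via the Azure Management API.
curl -s -X GET \
  -H "Authorization: Bearer <token>" \
  "https://management.azure.com/subscriptions/<subscriptionId>/providers/Microsoft.NetApp/netAppAccounts?api-version=2022-05-01"
```

The response is JSON, so piping it through a tool like `jq` makes it much easier to read in a terminal.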
Insomnia? REST with Azure NetApp Files: Part 1 of 3
Hello and welcome to the first post in a three-part series that will help you get up and running with the Azure REST API. More specifically, I’ll show you how to use the Azure API to interact with your Azure NetApp Files resources.
Getting started with REST APIs can be a little tricky. There are several components to a REST API call. Combine that with the various types of authentication and things can get pretty overwhelming. Microsoft has chosen to use what is called a ‘bearer token’ as the authentication method for their Azure Management API.
I’ll call this first part ‘bearer of good tokens’.
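For readers who want a command-line preview, here is a rough sketch of requesting a bearer token with a service principal (the OAuth2 client credentials flow). The tenant ID, client ID, and client secret are placeholders for your own service principal’s values; the full walkthrough in this post uses Insomnia instead.

```shell
# Sketch only: <tenantId>, <clientId>, and <clientSecret> are placeholders
# for your own service principal. The JSON response contains "access_token".
curl -s -X POST \
  -d "grant_type=client_credentials" \
  -d "client_id=<clientId>" \
  -d "client_secret=<clientSecret>" \
  -d "resource=https://management.azure.com/" \
  "https://login.microsoftonline.com/<tenantId>/oauth2/token"
```

The `access_token` value in the response is what you pass as the `Authorization: Bearer` header on subsequent Azure Management API calls.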
How to: Enable Snapshot Policies for Azure NetApp Files
But of course, Microsoft and NetApp continue to bring new features and more value to this great service. There is one tiny caveat… the snapshot policy feature is currently in preview. But don’t worry, the registration process is painless. We’ll have you creating snapshot policies in just a few minutes.
As a prerequisite, you’ll need Azure PowerShell installed and connected to your Azure account:
Install-Module -Name Az -AllowClobber -Scope CurrentUser
Connect-AzAccount
Once you have successfully connected to your Azure account, continue with registering the ‘ANFSnapshotPolicy’ feature:
Register-AzProviderFeature -ProviderNamespace Microsoft.NetApp -FeatureName ANFSnapshotPolicy
Lastly, verify the feature is ‘Registered’:
Get-AzProviderFeature -ProviderNamespace Microsoft.NetApp -FeatureName ANFSnapshotPolicy
Automatic Alerts for Azure NetApp Files
Controlling cloud spend is a massive challenge for cloud consumers of all sizes. Having good systems in place to monitor resources and provide proper alerting mechanisms when that consumption goes beyond expected levels is critical to any successful cloud deployment.
While Azure NetApp Files is a great enterprise-grade file service, the Azure metrics we have to work with today can be a bit limiting when it comes to capacity consumption. One shortcoming is the absence of a metric that lets you alert on a percentage of space consumed at the capacity pool or volume level. Currently, the alert threshold needs to be specified in bytes… bytes?! Yes, BYTES. Furthermore, when a capacity pool or volume is resized, the corresponding alert threshold needs to be manually increased or decreased accordingly.
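Because the threshold is expressed in bytes, turning “alert me at X percent full” into a number Azure will accept is just arithmetic. The sketch below uses hypothetical numbers (a 4 TiB volume with a 75% alert threshold) to show the conversion:

```shell
# Convert a percentage of a volume quota into the byte threshold Azure expects.
# Hypothetical example: alert at 75% of a 4 TiB volume (1 TiB = 1024^4 bytes).
volume_tib=4
percent=75
threshold_bytes=$(( volume_tib * 1024**4 * percent / 100 ))
echo "$threshold_bytes"   # 3298534883328
```

And this is exactly the pain point: every time the volume is resized, that number has to be recomputed and the alert rule updated by hand, which is what the automation in this post sets out to fix.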