PowerShell Scripting

Azure – Expand OS Drive for Azure Virtual Machine

When you create an Azure Virtual Machine, it comes with an OS drive of a default size. The size differs between Windows and Linux machines: roughly 128 GB for Windows and roughly 30 GB for Linux. The OS drive can be expanded to a maximum of 2 TB as of this writing. Use the code below to allocate more space to your Azure VM's OS drive. After executing the code, log in to the virtual machine and expand the drive (Disk Management on Windows) using the newly allocated space.

Resize a Managed Disk:

## Resize OS disk size for Managed Disk

# Set Resource Group and Virtual Machine name
$resourceGroupName = 'my_resource_group_name'
$virtualMachineName = 'my_virtual_machine_name'

# Create VM reference object
$vm = Get-AzureRmVM -ResourceGroupName $resourceGroupName -Name $virtualMachineName

# Stop the Virtual Machine
Stop-AzureRmVM -ResourceGroupName $resourceGroupName -Name $virtualMachineName

# Create a reference to the Managed disk and set the size as required
$disk= Get-AzureRmDisk -ResourceGroupName $resourceGroupName -DiskName $vm.StorageProfile.OsDisk.Name
$disk.DiskSizeGB = 1023
Update-AzureRmDisk -ResourceGroupName $resourceGroupName -Disk $disk -DiskName $disk.Name

# Start the Azure Virtual Machine
Start-AzureRmVM -ResourceGroupName $resourceGroupName -Name $virtualMachineName

Resize an Unmanaged Disk:

## Resize OS disk size for Unmanaged Disk

# Set Resource Group and Virtual Machine name
$resourceGroupName = 'my-resource-group-name'
$virtualMachineName = 'my-vm-name'

# Create VM reference object
$vm = Get-AzureRmVM -ResourceGroupName $resourceGroupName -Name $virtualMachineName

# Stop the Virtual Machine
Stop-AzureRmVM -ResourceGroupName $resourceGroupName -Name $virtualMachineName

# Create a reference to the Unmanaged disk and set the size as required
$vm.StorageProfile.OSDisk.DiskSizeGB = 1023
Update-AzureRmVM -ResourceGroupName $resourceGroupName -VM $vm

# Start the Azure Virtual Machine
Start-AzureRmVM -ResourceGroupName $resourceGroupName -Name $virtualMachineName
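
Once the VM is back up, the newly allocated space still has to be claimed inside the guest. On Windows you can use Disk Management as mentioned above, or, as a minimal sketch using the built-in Storage module (assuming the OS partition is drive C):

# Run inside the Windows guest to extend the OS partition into the new space
$size = Get-PartitionSupportedSize -DriveLetter C
Resize-Partition -DriveLetter C -Size $size.SizeMax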

Click here to download my PowerShell scripts for Free !!

Click here for Azure tutorial videos !!


Azure – Generate report for unattached Azure disks (managed and un-managed)

When you delete a virtual machine (VM) in Azure, by default, any disks that are attached to the VM aren’t deleted. This feature helps to prevent data loss due to the unintentional deletion of VMs. After a VM is deleted, you will continue to pay for unattached disks.

Unattached MANAGED disks:

When a managed disk is attached to a VM, the ManagedBy property contains the resource ID of the VM. When a managed disk is unattached, the ManagedBy property is null. The script examines all the managed disks in an Azure subscription. When the script locates a managed disk with the ManagedBy property set to null, the script determines that the disk is unattached.
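
The core of that check can be sketched in a few lines with the AzureRM module (illustrative only, not the full downloadable script; assumes you are already logged in):

# List managed disks whose ManagedBy property is null, i.e. not attached to any VM
Get-AzureRmDisk | Where-Object { $_.ManagedBy -eq $null } |
    Select-Object Name, ResourceGroupName, DiskSizeGB |
    Export-Csv -Path .\unattached_managed_disks.csv -NoTypeInformation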

Unattached UN-MANAGED disks:

When an unmanaged disk is attached to a VM, the LeaseStatus property is set to Locked. When an unmanaged disk is unattached, the LeaseStatus property is set to Unlocked. The script examines all the unmanaged disks in all the Azure storage accounts in an Azure subscription. When the script locates an unmanaged disk with a LeaseStatus property set to Unlocked, the script determines that the disk is unattached.
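
For unmanaged disks the same idea looks roughly like the sketch below (again illustrative; container and blob names will differ in your environment):

# Report page blobs ending in .vhd whose lease status is Unlocked, across all storage accounts
foreach ($sa in Get-AzureRmStorageAccount) {
    foreach ($container in Get-AzureStorageContainer -Context $sa.Context) {
        Get-AzureStorageBlob -Container $container.Name -Context $sa.Context |
            Where-Object { $_.Name -like '*.vhd' -and $_.ICloudBlob.Properties.LeaseStatus -eq 'Unlocked' } |
            Select-Object @{n='StorageAccount';e={$sa.StorageAccountName}}, @{n='Container';e={$container.Name}}, Name
    }
}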

SCRIPT:

Download the script here

This is a PowerShell script to generate a report of unattached VHD disks. It creates two files: unattached_managed_disks.csv and unattached_un_managed_disks.csv.

These two files will contain details about VHD files that are not attached to an Azure virtual machine.

NOTE: You have to log in to your account before running the script. Use Login-AzureRmAccount to do so.

You can use the generated CSVs to better manage your Azure infrastructure: understand why the disks are not in use and make an informed decision on whether to delete or re-use them. This helps you identify resources that are not being utilized and reduce cost.

Click here to download my PowerShell scripts for Free !!

Click here for Azure tutorial videos !!

PowerShell – Extract user list from Azure Active Directory to an Excel file

This script will authenticate to your Azure Active Directory and fetch the details of all users. Finally, it will save the details to an Excel sheet.

Below is the link to the script:

https://gallery.technet.microsoft.com/scriptcenter/Extract-user-list-from-6cb9a93c

Below are the user attributes the script fetches:

  1. Display Name
  2. Object ID
  3. Type
  4. Principal Name
  5. Role Name
  6. Role Description

The Excel sheet is saved as C:\AzureADUserList\AzureADUserList.xlsx.

Pre-requisites: This script needs the ‘MSOnline’ and ‘AzureRM’ PowerShell modules.
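
If you only need the basic user attributes, the core lookup is short with the MSOnline module. A minimal sketch is shown below (CSV output used for brevity; the .xlsx export in the full script relies on Excel):

# Connect to Azure AD and export basic user details
Connect-MsolService
New-Item -ItemType Directory -Path 'C:\AzureADUserList' -Force | Out-Null
Get-MsolUser -All |
    Select-Object DisplayName, ObjectId, UserType, UserPrincipalName |
    Export-Csv -Path 'C:\AzureADUserList\AzureADUserList.csv' -NoTypeInformation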

Click here to download my PowerShell scripts for Free !!


PowerShell – Generate Azure PaaS Inventory

This PowerShell script helps you maintain an inventory sheet of your Azure Platform-as-a-Service (PaaS) services, so that you can refer to it anytime you want.

It also serves as a quick way to generate a report when your client wants a quick look at their PaaS services.

Below is the flow:

  • The script logs in to your Azure account and fetches the details of your Azure PaaS resources – Azure CDN and Azure WebApps.
  • It creates one worksheet for each Azure resource.
  • The user is prompted to select the subscription. PowerShell exception handling is implemented.
  • The user is prompted again, once the script has finished running, to choose whether to view the Excel sheet.

** The script assumes you are using PowerShell v5.0 and have the Excel COM objects available (basically, you should have MS Office installed).
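
For context, the Excel part relies on COM automation along these lines (a hedged sketch, not the actual script; the sheet name, columns, and output path are illustrative):

# Write Azure WebApp names into an Excel worksheet via the Excel COM object
$excel = New-Object -ComObject Excel.Application
$excel.Visible = $false
$workbook = $excel.Workbooks.Add()
$sheet = $workbook.Worksheets.Item(1)
$sheet.Name = 'WebApps'
$sheet.Cells.Item(1,1) = 'Name'
$sheet.Cells.Item(1,2) = 'ResourceGroup'
$row = 2
foreach ($app in Get-AzureRmWebApp) {
    $sheet.Cells.Item($row,1) = $app.Name
    $sheet.Cells.Item($row,2) = $app.ResourceGroup
    $row++
}
$workbook.SaveAs("$env:USERPROFILE\AzurePaaSInventory.xlsx")
$excel.Quit()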

Below is the link to the script:

https://gallery.technet.microsoft.com/scriptcenter/Azure-PaaS-inventory-using-d1872989

I have also written scripts to generate:

Azure IaaS Inventory

AWS IaaS Inventory

Click here to download my PowerShell scripts for Free !!


IBM SoftLayer – List, Sync, and Download data from IBM COS S3 bucket

This blog can be treated as an extension of my blog on “Delete IBM COS S3 files older than X days”.

Today I shall be sharing PowerShell scripts to list, sync, and download data from an IBM COS S3 bucket.

The pre-requisite is to have the AWS CLI installed on the machine.

List all the objects in a bucket

  1. The script expects AccessKey, SecretKey, and BucketName.
  2. It creates a log file to log the flow of the script.
  3. Make sure you set the endpoint in the $endpoint variable. [This can also be made a parameter to make the script more dynamic.]
  4. The script then dynamically sets the AWS CLI profile with the profile name “test”.
  5. It then executes the AWS CLI command and un-sets the profile.
<#
This script will list all the objects in a bucket.
EXAMPLE:
.\list_backup_files.ps1 -AccessKey 'ZVS9wEvUUYSG8f' -SecretKey 'ENOpkewRzAoHGWnvulL1KbNOIRp7rpWidkV' -BucketName 'abc-abc-abc'
#>
param(
    [String]$AccessKey,
    [String]$SecretKey,
    [String]$BucketName
)

# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -Format yyyyMMdd_hhmmsstt
$logfile = $Loc.Path + "\log_list_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green

####### Function to write information to the log file #######
function log($string, $color){
    if ($color -eq $null) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -Format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}

# Flag to track if a command failed
$cmdError = $false
try {
    # IBM COS S3 endpoint
    $cmdError = $false
    $cmd = ""
    $endpoint = "https://abc.abc.objstor.com"
    $cmdError = $?

    # Bucket name
    $cmd = ""
    $bucket_name = "s3://" + $BucketName
    $cmdError = $?

    # Setting AWS configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test

    # AWS CLI command to list the files under the bucket
    $cmdError = $false
    log "Listing files from IBM COS S3 bucket: "
    log "====================================================================================="
    $cmd = "aws --endpoint-url $endpoint s3 ls $bucket_name --recursive --human-readable --summarize --output json --profile test"
    aws --endpoint-url $endpoint s3 ls $bucket_name --recursive --human-readable --summarize --output json --profile test
    $cmdError = $?

    # Un-setting AWS configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
}catch{
    if($cmdError -eq $false){
        Write-Host "Command that failed was: " $cmd
    }
}

 

Sync files between local location and IBM COS S3 bucket

The script expects a local directory path which has to be synced with the IBM COS S3 bucket.

<#
This script will sync files between a local path and an IBM COS S3 bucket.
EXAMPLE:
.\sync_from_local_to_s3.ps1 -AccessKey ZVS9wEvUUYSG8f -SecretKey ENOpkewRzAoHGWnrpWidkV -BucketName abc-abc-abcc -InputPath "C:\manju"
#>
param(
    [String]$AccessKey = '',
    [String]$SecretKey = '',
    [String]$BucketName = '',
    [String]$InputPath = ''
)

# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -Format yyyyMMdd_hhmmsstt
$logfile = $Loc.Path + "\sync_from_local_to_s3_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green

####### Function to write information to the log file #######
function log($string, $color){
    if ($color -eq $null) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -Format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}

# Flag to track if a command failed
$cmdError = $false
try {
    log "Verifying the InputPath"
    if(!(Test-Path $InputPath)){
        Write-Host "Please specify a valid local directory path to sync."
        exit
    }

    # IBM COS S3 endpoint
    $cmdError = $false
    $cmd = ""
    $endpoint = "https://abc.abcc.abccc.com"
    $cmdError = $?

    # Bucket name
    $cmd = ""
    $bucket_name = "s3://" + $BucketName
    $cmdError = $?

    # Setting AWS configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test

    # AWS CLI command to sync the local path with the bucket
    $cmdError = $false
    log "Sync files between local path and IBM COS S3 bucket: "
    log "====================================================================================="
    $cmd = "aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test"
    aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test ## The destination must be an <S3Uri> such as s3://bucket-name, hence $bucket_name rather than the plain bucket name.
    $cmdError = $?

    # Un-setting AWS configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
}catch{
    if($cmdError -eq $false){
        log "Command that failed was: "
        $cmd | Out-File $logfile -Append
    }
}

 

Download files from IBM COS S3 bucket to local system

This script expects:

  1. InputPath. This is a text file that contains the list of file names to be downloaded.
  2. OutputPath. This is the absolute folder path where the files will be downloaded.
  3. FileNameToDownload. Alternatively, you can specify the file names (comma-separated) to be downloaded. This is useful when you only have 3-4 files to download; providing the file names is easier than creating a text file for such a small number.
# This script will download backup files from an IBM COS S3 bucket.
param(
    [String]$AccessKey,
    [String]$SecretKey,
    [String]$BucketName,
    [String]$InputPath,          # Contains the list of file names to download
    [String]$OutputPath,         # Files will be downloaded to this folder
    [String]$FileNameToDownload
)
<#
EXAMPLE:
param(
    [String]$AccessKey = "xxxxxxxxxxxxxxxxxxxxxxxxxx",
    [String]$SecretKey = "xxxxxxxxxxxxxxxxxxxxxx",
    [String]$BucketName = "abc11-abc-abcc",
    [String]$InputPath = "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt",
    [String]$OutputPath = "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
)
.\download_backup_file.ps1 -AccessKey ZVS9wSG8f -SecretKey ENOpkewOIRp7rpWidkV -BucketName abc11-abc-abc -InputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt" -OutputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
#>

# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -Format yyyyMMdd_hhmmsstt
$logfile = $Loc.Path + "\log_list_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green

####### Function to write information to the log file #######
function log($string, $color){
    if ($color -eq $null) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -Format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}

# Flag to track if a command failed
$cmdError = $false
try {
    # IBM COS S3 endpoint
    $endpoint = "https://abc.abc.abc.com"

    # Bucket name
    $bucket_name = "s3://" + $BucketName

    # Setting AWS configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test

    ### Use the below logic to download files if a text file is provided ###
    if(-not [string]::IsNullOrEmpty($InputPath)){
        # Verifying the input path
        log "Verifying the InputPath"
        if(!(Test-Path $InputPath)){
            Write-Host "Please specify a text file containing the list of files to download."
            exit
        }
        $files_to_download = Get-Content $InputPath
        $cmdError = $false
        log "Downloading files from S3 to the specified output directory: "
        log "====================================================================================="
        cd $OutputPath
        foreach($files_to_download_iterator in $files_to_download){
            $source_file = $files_to_download_iterator.ToString()
            $destination_file = $files_to_download_iterator.ToString()
            $cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file"
            aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file
            $cmdError = $?
        }
    }

    ### Use the below logic to download if the file names are provided individually via the $FileNameToDownload argument ###
    if(-not [string]::IsNullOrEmpty($FileNameToDownload)){
        $files_to_download = $FileNameToDownload.Split(",")
        $cmdError = $false
        log "Downloading files from S3 to the specified output directory: "
        log "====================================================================================="
        cd $OutputPath
        foreach($files_to_download_iterator in $files_to_download){
            $source_file = $files_to_download_iterator.ToString()
            $destination_file = $files_to_download_iterator.ToString()
            $cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file"
            aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file
            $cmdError = $?
        }
    }

    # Un-setting AWS configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
}catch{
    if($cmdError -eq $false){
        log "Command that failed was: "
        log $cmd
    }
}

 

Click here to download my PowerShell scripts for Free !!

Azure – PowerShell in Azure Cloud Shell

Today we are looking into PowerShell in Azure Cloud Shell. This is still in public preview as of this writing.

If you are wondering why Microsoft would introduce a PowerShell console inside the Azure Cloud Shell, then have a look at the below features:

Features

Browser-based shell experience

Cloud Shell enables access to a browser-based command-line experience built with Azure management tasks in mind. Leverage Cloud Shell to work untethered from a local machine in a way only the cloud can provide.

Choice of preferred shell experience

Azure Cloud Shell gives you the flexibility of choosing the shell experience that best suits the way you work. Linux users can opt for a Bash experience, while Windows users can opt for PowerShell.

Pre-configured Azure workstation

Cloud Shell comes pre-installed with popular command-line tools and language support so you can work faster.
View the full tooling list for Bash experience and PowerShell experience.

Automatic authentication

Cloud Shell securely authenticates automatically on each session for instant access to your resources through the Azure CLI 2.0.

Connect your Azure File storage

Cloud Shell machines are temporary and as a result, require an Azure file share to be mounted as clouddrive to persist your $Home directory. On the first launch, Cloud Shell prompts to create a resource group, storage account, and file share on your behalf. This is a one-time step and will be automatically attached for all sessions. A single file share can be mapped and will be used by both Bash and PowerShell in Cloud Shell.

Below are some conditions that we have to remember:

  • Cloud Shell runs on a temporary machine provided on a per-session, per-user basis
  • Cloud Shell times out after 20 minutes without interactive activity
  • Cloud Shell can only be accessed with a file share attached
  • Cloud Shell uses the same file share for both Bash and PowerShell
  • Cloud Shell is assigned one machine per user account
  • Permissions are set as a regular Linux user (Bash)

Now that we have some background knowledge on the PowerShell in Cloud Shell, let us dig more into the usage of it:

To access the Cloud Shell, click on the PowerShell icon in the Azure portal:

image_1

Once you click on the icon, a pane opens at the bottom of the screen as shown below. You can choose from two options – Bash or PowerShell. Since we are interested in learning PowerShell in Cloud Shell, let us choose PowerShell as our desired option.

image_2

When you start Cloud Shell for the first time, it will configure Azure File storage for you. Cloud Shell machines are temporary and, as a result, require an Azure file share to be mounted as clouddrive to persist your $Home directory. Additionally, if you have multiple subscriptions, you are prompted to choose the subscription you want to work with.

image_3

Azure Authentication, Resource Group, Storage Account and File Storage are automatically created as shown below:

image_4

Testing an Azure command. Works perfectly.

image_5

If you are idle for more than 20 minutes, you will be kicked off the session, and you will have to start the session again:

image_6

Discovering the drives under PowerShell in Cloud Shell:

Now let us execute the Get-ChildItem cmdlet and see what we can find.

image_8

As we can see, running Get-ChildItem in the current scope lists the subscriptions that your account is associated with.

Traversing one step deeper into the directory, we can see the resources related to the subscription.

image_9

Let us get into the “StorageAccounts” directory to confirm that we see a list of storage accounts under the selected subscription:

image_10
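
The navigation above boils down to commands like these in the Azure: drive (the subscription name is illustrative; yours will differ):

PS Azure:\> Get-ChildItem                                    # lists your subscriptions
PS Azure:\> cd '.\My Subscription\StorageAccounts'
PS Azure:\My Subscription\StorageAccounts> Get-ChildItem     # lists the storage accounts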

PowerShell cmdlets to manage PowerShell in Cloud Shell:

From the information below, we can see that Microsoft provides us with two cmdlets to work with the cloud drive in Cloud Shell.

image_12

Get-CloudDrive provides the details of the Azure file share that was created when Cloud Shell started. You may continue to use that file share; however, if you want a new one, you can dismount the current one using the Dismount-CloudDrive cmdlet and create a new one.

image_11

Note: Once you dismount the Azure file share, your current session will be restarted to set up a new cloud share.
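
In practice that is just two commands (the dismount is disruptive to the current session, so run it only when you really want a fresh share):

Get-CloudDrive        # shows the storage account and file share backing clouddrive
Dismount-CloudDrive   # detaches the share; the session restarts and prompts for a new one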

Assumption:

I am assuming that Microsoft is using container infrastructure to provide the session. You will get the below Windows path when you query for the temp drive:

C:\Users\ContainerAdministrator\AppData\Local\Temp

image_11

Note that the administrator is a “ContainerAdministrator.” The environment here could be a Windows Server or a Windows container. I am assuming it is a Windows container, since the underlying image comes pre-packaged with the tools shown below and is temporary. That is a typical use case for container technology.

image_13

 

If the content is valuable to you, do consider sharing it with your friends and colleagues.

Did I miss out anything? Let me know in the comments section.

 

Download my PowerShell scripts for Free!

 

PowerShell – Script to check the Azure VHD lease status

A common misconception while working with Azure compute is to assume that no billing charges will be incurred once the Azure VM is deleted. This is true only to a certain extent: once you delete the VM, the billing for compute hours stops, but billing continues for the VHD (which was previously associated with the VM) that is still sitting in the Azure storage account.

As the title of the post states, the idea behind this script is to get the lease status of the Azure VHDs in all the storage accounts under your subscription. This is particularly helpful for identifying and deleting unused VHDs, thus saving a lot of money for your organization.

The complete script is uploaded in the Microsoft Script Center. Use the below link to download it.

Check the Lease status of VHDs


PowerShell – Script to Monitor Azure VM Availability

The idea behind writing this script is to have an automated solution to monitor the availability of any Azure VMs. The script fetches the current server status and saves it in an Azure Table. Each script execution is one poll, so the second time the script runs it fetches the current status of the VMs and compares it to the previous value. If the server status changed between polls, the details of those servers are written to a hash table. Finally, the details can be sent out in an email.

Since we are monitoring VM status changes from “RUNNING” to “VM STOPPED”, this eliminates the scenarios where VMs are stopped manually or by a scheduled shutdown automation script; in those cases the VM status changes from “RUNNING” to “VM DEALLOCATED”.

Feel free to customize the script to add logic if you want to monitor the status of De-allocated VMs as well.
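
The status check at the heart of the script can be sketched as follows (a simplified outline; the real script also persists results to an Azure Table between polls):

# Read the power state of every VM in the subscription
$vms = Get-AzureRmVM -Status
$changedVms = @{}
foreach ($vm in $vms) {
    # PowerState is a display string such as 'VM running', 'VM stopped' or 'VM deallocated'
    if ($vm.PowerState -eq 'VM stopped') {
        $changedVms[$vm.Name] = $vm.PowerState
    }
}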

This script uses SendGrid as the email server. Feel free to use your own SMTP server if you have one.
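
A hedged sketch of the notification step, assuming SendGrid's SMTP endpoint and placeholder addresses (the $changedVms hash table comes from the status check above):

# Email the changed-server details via SendGrid's SMTP relay
$cred = Get-Credential   # SendGrid SMTP user name and API key
Send-MailMessage -SmtpServer 'smtp.sendgrid.net' -Port 587 -UseSsl -Credential $cred `
    -From 'alerts@example.com' -To 'ops@example.com' `
    -Subject 'Azure VM availability change' -Body ($changedVms | Out-String)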

This script is useful when you do not yet have a fully automated monitoring solution like Nagios or OMS. Maybe you have a couple of servers you want to monitor and do not want to spend more money on custom monitoring. Simply create a runbook using this script as a baseline and schedule it in an Azure Automation account.

The script is uploaded to the Microsoft Script Center. Please download it using the below link:

Monitor Azure VM Availability