PowerShell – List Azure Backup Items

Azure Backup is the Azure-based service you can use to back up (or protect) and restore your data in the Microsoft cloud. Azure Backup replaces your existing on-premises or off-site backup solution with a cloud-based solution that is reliable, secure, and cost-competitive.

Azure Backup offers multiple components that you download and deploy on the appropriate computer or server, or configure in the cloud. The component, or agent, that you deploy depends on what you want to protect.

All Azure Backup components (no matter whether you’re protecting data on-premises or in the cloud) can be used to back up data to a Recovery Services vault in Azure.

You might need to automate generating a report every day and sharing it with stakeholders to keep track of your backup details.

The PowerShell script lists the "Backup Items" from your Azure subscription and saves the data into an Excel file under the folder "C:\Backup_job_report". The Excel file contains one worksheet per Recovery Services vault. The script expects a text file containing the list of Azure servers for which you want to fetch the "Backup Items". A minimal sketch of the cmdlets such a report builds on is shown after the list of details below.

The details include:

1. VM Resource Name

2. VM Name

3. Recovery Vault Name

4. Last Backup Status

5. Latest Recovery Point
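For reference, here is a minimal sketch of the cmdlet calls this kind of report can be built on, assuming the Az.RecoveryServices module and an already authenticated session (Connect-AzAccount). It writes a simple CSV instead of the multi-worksheet Excel file produced by the full script, and the output path is a placeholder.

# Minimal sketch: collect backup item details per Recovery Services vault (assumes Az.RecoveryServices)
$report = foreach ($vault in Get-AzRecoveryServicesVault) {
    Set-AzRecoveryServicesVaultContext -Vault $vault
    $containers = Get-AzRecoveryServicesBackupContainer -ContainerType AzureVM
    foreach ($container in $containers) {
        $item = Get-AzRecoveryServicesBackupItem -Container $container -WorkloadType AzureVM
        [PSCustomObject]@{
            VMResourceGroup     = $container.ResourceGroupName
            VMName              = $container.FriendlyName
            RecoveryVaultName   = $vault.Name
            LastBackupStatus    = $item.LastBackupStatus
            LatestRecoveryPoint = $item.LatestRecoveryPoint
        }
    }
}
# Placeholder output path; the full script writes an Excel workbook instead
$report | Export-Csv -Path 'C:\Backup_job_report\backup_items.csv' -NoTypeInformation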

The script is uploaded to Microsoft Technet Script Center’s repository:

List Azure Backup Items using PowerShell

You can also obtain this information from the Azure portal; navigate as shown below:

Sign in to Azure Portal >> Search and select “Recovery Services vaults” >> Select a vault >> Click on “Backup Items” under protected items >> Click on “Azure Virtual Machines”.

 


 

Click here to download my PowerShell scripts for Free !!

 

 


IBM SoftLayer – List, Sync, and Download data from IBM COS S3 bucket

This blog can be treated as an extension to my blog on "Delete IBM COS S3 files older than X days."

Today I shall be sharing PowerShell scripts to list, sync, and download data from an IBM COS S3 bucket.

The prerequisite is to have the AWS CLI installed on the machine.

List all the objects in a bucket

  1. The script will expect AccessKey, SecretKey, and BucketName.
  2. It creates a log file to log the flow of the script.
  3. Make sure you set the endpoint in the $endpoint variable. [This can also be made a parameter to make the script more dynamic.]
  4. The script then dynamically sets the “AWS CLI profile” with profile name “test.”
  5. Execute the AWS CLI command, and un-set the profile.
<#
This script will list all the objects in a bucket.
EXAMPLE:
.\list_backup_files.ps1 -AccessKey 'ZVS9wEvUUYSG8f' -SecretKey 'ENOpkewRzAoHGWnvulL1KbNOIRp7rpWidkV' -BucketName 'abc-abc-abc'
#>
param(
    [String]$AccessKey,
    [String]$SecretKey,
    [String]$BucketName
)
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\log_list_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to log file #######
function log($string, $color){
    if ($color -eq $null) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}
# Flag to track if command failed
$cmdError = $false
try {
    # IBM COS S3 endpoint
    $endpoint = "https://abc.abc.objstor.com"
    # Bucket name in S3Uri form
    $bucket_name = "s3://" + $BucketName
    # Setting AWS Configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test
    # AWS CLI command to list the files under the bucket
    $cmdError = $false
    log "Listing files from IBM COS S3 bucket: "
    log "====================================================================================="
    $cmd = "aws --endpoint-url $endpoint s3 ls $bucket_name --recursive --human-readable --summarize --profile test"
    aws --endpoint-url $endpoint s3 ls $bucket_name --recursive --human-readable --summarize --profile test
    $cmdError = $?
    # Un-setting AWS Configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
}catch{
    if($cmdError -eq $false){
        Write-Host "Command that failed was: " $cmd
    }
}

 

Sync files between local location and IBM COS S3 bucket

The script expects a local directory path which has to be synced with the IBM COS S3 bucket.

<#
This script will sync files between a local path and an IBM COS S3 bucket.
EXAMPLE:
.\sync_from_local_to_s3.ps1 -AccessKey ZVS9wEvUUYSG8f -SecretKey ENOpkewRzAoHGWnrpWidkV -BucketName abc-abc-abcc -InputPath "C:\manju"
#>
param(
    [String]$AccessKey = '',
    [String]$SecretKey = '',
    [String]$BucketName = '',
    [String]$InputPath = ''
)
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\sync_from_local_to_s3_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to log file #######
function log($string, $color){
    if ($color -eq $null) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}
# Flag to track if command failed
$cmdError = $false
try {
    log "Verifying the InputPath"
    if(!(Test-Path $InputPath)){
        Write-Host "Please specify a valid local directory path to sync."
        return
    }
    # IBM COS S3 endpoint
    $endpoint = "https://abc.abcc.abccc.com"
    # Bucket name in S3Uri form
    $bucket_name = "s3://" + $BucketName
    # Setting AWS Configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test
    # AWS CLI command to sync the local path with the bucket
    $cmdError = $false
    log "Sync files between local path and IBM COS S3 bucket: "
    log "====================================================================================="
    $cmd = "aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test"
    aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test ## The destination must be an <S3Uri>; passing the bare bucket name throws an error, so the s3:// prefixed value is used.
    $cmdError = $?
    # Un-setting AWS Configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
}catch{
    if($cmdError -eq $false){
        log "Command that failed was: "
        $cmd | Out-File $logfile -Append
    }
}

 

Download files from IBM COS S3 bucket to local system

This script expects

  1. InputPath. This is a text file that contains the list of file names to be downloaded.
  2. OutputPath. This is an absolute folder path where the files will be downloaded.
  3. FileNameToDownload. You can alternatively specify the file names (comma separated) to be downloaded. This is useful when you have 3-4 files to download; providing the names directly is easier than creating a text file for a small number of files.
# This script will download backup files from an IBM COS S3 bucket.
param(
    [String]$AccessKey,
    [String]$SecretKey,
    [String]$BucketName,
    [String]$InputPath, # Text file containing the list of file names to download
    [String]$OutputPath, # Files will be downloaded to this folder
    [String]$FileNameToDownload # Alternatively, a comma-separated list of file names
)
<#
EXAMPLE:
param(
    [String]$AccessKey = "xxxxxxxxxxxxxxxxxxxxxxxxxx",
    [String]$SecretKey = "xxxxxxxxxxxxxxxxxxxxxx",
    [String]$BucketName = "abc11-abc-abcc",
    [String]$InputPath = "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt",
    [String]$OutputPath = "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
)
.\download_backup_file.ps1 -AccessKey ZVS9wSG8f -SecretKey ENOpkewOIRp7rpWidkV -BucketName abc11-abc-abc -InputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt" -OutputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
#>
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\log_download_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to log file #######
function log($string, $color){
    if ($color -eq $null) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}
# Flag to track if command failed
$cmdError = $false
try {
    # IBM COS S3 endpoint
    $endpoint = "https://abc.abc.abc.com"
    # Bucket name in S3Uri form
    $bucket_name = "s3://" + $BucketName
    # Setting AWS Configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test
    ### Use the below logic to download files if a text file is provided ###
    if(-not [string]::IsNullOrEmpty($InputPath)){
        # Verifying the input path
        log "Verifying the InputPath"
        if(!(Test-Path $InputPath)){
            Write-Host "Please specify a text file containing the list of files to download."
            return
        }
        $files_to_download = Get-Content $InputPath
        $cmdError = $false
        log "Downloading files from S3 to specified output directory: "
        log "====================================================================================="
        Set-Location $OutputPath
        foreach($files_to_download_iterator in $files_to_download){
            $source_file = $files_to_download_iterator.ToString()
            $destination_file = $files_to_download_iterator.ToString()
            $cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file"
            aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file
            $cmdError = $?
        }
    }
    ### Use the below logic to download if the file names are provided individually via the $FileNameToDownload argument ###
    if(-not [string]::IsNullOrEmpty($FileNameToDownload)){
        $files_to_download = $FileNameToDownload.Split(",")
        $cmdError = $false
        log "Downloading files from S3 to specified output directory: "
        log "====================================================================================="
        Set-Location $OutputPath
        foreach($files_to_download_iterator in $files_to_download){
            $source_file = $files_to_download_iterator.ToString()
            $destination_file = $files_to_download_iterator.ToString()
            $cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file"
            aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file
            $cmdError = $?
        }
    }
    # Un-setting AWS Configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
}catch{
    if($cmdError -eq $false){
        log "Command that failed was: "
        log $cmd
    }
}

 

Click here to download my PowerShell scripts for Free !!

PowerShell – Delete IBM Softlayer COS S3 files older than X days

IBM SoftLayer offers several types of storage. One of them is Cloud Object Storage (COS), which exposes an S3-compatible API. I have written a simple PowerShell script to delete IBM COS S3 files older than X days.

The script uses AWS CLI commands, so you will need the AWS CLI installed on the Windows machine from which you run it. The script can also be scheduled as a task to run every day. The full script is linked below; a minimal, hedged sketch of the core deletion logic follows the link.

Delete IBM COS S3 files older than X days
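The sketch below parses the "aws s3 ls" listing and removes objects whose last-modified date is older than the cutoff. The endpoint, bucket name, and the "test" profile are placeholders; the full, logged version is in the linked script.

# Hedged sketch: delete objects older than $Days from an IBM COS S3 bucket via the AWS CLI
param(
    [String]$BucketName = 'my-cos-bucket',  # placeholder
    [Int]$Days = 30
)
$endpoint = 'https://s3.example.objstor.com'   # placeholder endpoint
$cutoff   = (Get-Date).AddDays(-$Days)

# 'aws s3 ls --recursive' prints lines like: "2018-01-15 10:23:45     1024 backups/file.bak"
$listing = aws --endpoint-url $endpoint s3 ls "s3://$BucketName" --recursive --profile test
foreach ($line in $listing) {
    $parts = $line -split '\s+', 4
    if ($parts.Count -lt 4) { continue }
    $modified = [datetime]::ParseExact("$($parts[0]) $($parts[1])", 'yyyy-MM-dd HH:mm:ss', $null)
    $key = $parts[3]
    if ($modified -lt $cutoff) {
        # Remove the object once it is older than the cutoff
        aws --endpoint-url $endpoint s3 rm "s3://$BucketName/$key" --profile test
    }
}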

Click here to download my PowerShell scripts for Free !!

 

Azure – First Look into Cloudyn (Azure's Cost Management Service)

This is a continuation of my first blog on Cloudyn. Below is the link to my first blog, which guides you through registering for Cloudyn if you have an Azure subscription.

AZURE – COST MANAGEMENT BETTER THAN EVER USING CLOUDYN (REGISTRATION)

In this post, let us take a first look at Cloudyn's cost management console and the features Cloudyn offers.

Once you have completed the registration with Cloudyn as explained in my first blog (link shared above), you can access Cloudyn's cost management portal as follows:

  1. Log into Azure portal
  2. Navigate to “Cost Management + Billing.”
  3. Click on “Go to Cost Management” button
  4. A new window opens with the URL – https://azure.cloudyn.com/dashboard#/tool/enterprise_dashboard

Once you navigate to the URL, below is the page you see. This is Cloudyn's management dashboard.

image_7

 

Below is the Projected Annual Cost. I have used only Azure Storage and Azure Networking; if you have used more services, their costs will also be projected here.

image_8

 

We can also view the Current and Previous Month Projected Cost. This is very useful for spotting infrastructure changes that cost more than usual.

image_9

 

The "Actual Cost Analysis" view provides cost by Services, Providers, and Accounts. This graph can be further customized: Groups (highlighted below) provides plenty of options that we can select as per our requirement, and Actions (highlighted below) lets you choose what to do with the report, such as Save, Copy, or Export.

image_10

 

The "Actual Cost Over Time" view lets you pull up a report to analyze cost over a range of time.

image_11

 

Cloudyn also offers an "Alert Management" feature that notifies you when thresholds defined in the alert's configuration are crossed.

image_13

 

"Cost vs. Threshold" is one of many alerts. As you can see, Cloudyn offers many options to customize the alert policy. Alerts are sent to an email ID.

image_12

 

Finally, the last notable feature is "Data Transfer." You can either analyze data transfer usage or view the data transfer trend.

image_14

 

These are the features I found most valuable.

If you found any other feature to be worth mentioning, let me know in the comments section.

If the content of this blog is valuable to you, do consider sharing with your friends and colleagues.

Click here to download my PowerShell scripts for Free !!

Azure – Cost Management Better Than Ever Using Cloudyn (Registration)

Microsoft’s acquisition of Cloudyn will help Azure customers manage and optimize their cloud usage. Read more about the acquisition here. A message from Sharon Wagner, CEO of Cloudyn.

About Cloudyn

Azure Cost Management by Cloudyn empowers organizations to monitor cloud spend, drive organizational accountability, and optimize cloud efficiency so they can accelerate future cloud investments with confidence.

Microsoft’s acquisition of Cloudyn will help Azure customers and partners as they face the challenges of growing their multi-cloud environments. It will enable them to gain visibility, understand and optimize cloud consumption, as well as accurately project future usage.

Microsoft will continue to support multi-cloud environments, including Azure, AWS, and GCP. Azure Cost Management by Cloudyn is available for free to customers and partners managing Azure spend. Additional premium capabilities are available at no cost through June 2018, after which they will become paid features.

Let us look at how to sign up for Cloudyn if you are an Azure customer.

Step 1: Log in to your Azure subscription via the Azure portal. Select the "Cost Management + Billing" blade, then select "Cost Management" from the options on the left-hand side of the pane.

Click on “Go to Cost Management”

image_1

 

Step 2: Once you click the "Go to Cost Management" button, you will be redirected to Cloudyn's page to set up your cost management details.

Enter your organization name and the type of Azure access you have on your Azure account. I have a personal subscription, so I have chosen "Azure Individual Subscription Owner."

image_2

 

Step 3: The Cloudyn account name and Tenant ID will be populated automatically. Now, select the Offer ID from the drop-down list.

If you do not know your Offer ID, go back to the Azure portal and click on "Subscriptions"; that should show the type of subscription you have.

image3_1

 

Step 4: Click “Next”

image_4

 

Step 5: Click “Next”

image_5

Step 6: We are done with the Cloudyn registration. Cloudyn needs about 2 hours to collect the data.

image_6

 

If you find the content valuable, do consider sharing with your friends and colleagues.

Click here to download my PowerShell scripts for Free !!

AWS – Configure Point-to-Site in an AWS environment using OpenVPN

This document illustrates how we can create a VPN using the OpenVPN AMI, which is available in the AWS Marketplace.

Below is the architecture we will be able to achieve:

image1

We will set up an AWS VPC where we will launch our servers. One of the servers will be launched using the OpenVPN AMI. The other servers will be on public and private subnets within the VPC.

The steps are as follows:

Step 1: Log in to the AWS Console.

Step 2: Select the VPC service.

Step 3: Create a VPC. Provide the Name tag and CIDR details. Also, create the subnet under this VPC. (A hedged AWS CLI equivalent is shown after the screenshot below.)

image2
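For reference, an AWS CLI equivalent of this step could look like the following. The CIDR blocks, the Name tag, and the vpc-0abc123 ID are placeholders, not values from the original setup.

# Create the VPC and capture its ID (placeholder CIDR)
aws ec2 create-vpc --cidr-block 10.0.0.0/16 --query 'Vpc.VpcId' --output text
# Suppose the command above returned vpc-0abc123; tag it and create a subnet inside it
aws ec2 create-tags --resources vpc-0abc123 --tags Key=Name,Value=openvpn-demo-vpc
aws ec2 create-subnet --vpc-id vpc-0abc123 --cidr-block 10.0.1.0/24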

Step 4: Now, let us go over to the EC2 service; we will come back to the VPC later.

Step 5: In the EC2 service, launch an EC2 instance, go to the AWS Marketplace, and choose the OpenVPN AMI. You can assign an Elastic IP if necessary.

image3

Step 6: Launch this instance with all the other necessary configuration. This AMI comes with default security groups; keep them as-is and launch it. Note: this instance does not need to be launched in the VPC created earlier.

Step 7: Launch two more instances under the VPC created, one with a public IP and the other with a private IP.

Step 8: Now let us go back to the VPC. We need to create an Internet Gateway and a route table. Click on VPC in the console and go to Internet Gateways. Click on Create Internet Gateway and provide a suitable Name tag. Attach this gateway to the VPC we created earlier using the option "Attach to VPC".

image4

Step 9: Now, go to Route Tables and click on Create Route Table. Update the routes to have 0.0.0.0/0 (open to the internet) and specify the IGW (internet gateway) that was created.

image5

Step 10: Update the subnet association of the route table to the subnet created under the VPC. (A hedged AWS CLI equivalent of Steps 8-10 is shown after the screenshot below.)

image6
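For reference, an AWS CLI equivalent of Steps 8-10 could look like the following. All resource IDs (vpc-, igw-, rtb-, subnet-) are placeholders.

# Create the internet gateway and attach it to the VPC
aws ec2 create-internet-gateway --query 'InternetGateway.InternetGatewayId' --output text
aws ec2 attach-internet-gateway --internet-gateway-id igw-0abc123 --vpc-id vpc-0abc123
# Create the route table, add the default route via the IGW, and associate the subnet
aws ec2 create-route-table --vpc-id vpc-0abc123 --query 'RouteTable.RouteTableId' --output text
aws ec2 create-route --route-table-id rtb-0abc123 --destination-cidr-block 0.0.0.0/0 --gateway-id igw-0abc123
aws ec2 associate-route-table --route-table-id rtb-0abc123 --subnet-id subnet-0abc123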

Step 11: This completes the configuration on the AWS side. Now we need to configure OpenVPN.

Step 12: SSH to the OpenVPN box using the username – openvpnas

Step 13: You will need to agree to the terms by typing "Yes". Keep all the other settings at their defaults by just pressing "Enter". By default, the username will be "openvpn". You will need to update the password using the command shown in the screenshot below: sudo passwd openvpn.

image7_1

Step 14: Now go to the browser and type the IP address of the OpenVPN box. Ex: https://ipaddress

Step 15: Log in with the password updated earlier on the terminal.

image8

Step 16: Download the OpenVPN client as per the operating system. Install the client.

image9

Step 17: Click on the icon, select the OpenVPN server, and click Connect. Once connected, you will be able to reach the servers within the VPC.

image10

Step 18: Now let us ping the public IP and see the result. We were successful in pinging the public server under the VPC.

image11_1

Step 19: Now let us disconnect the OpenVPN and try to connect to the private server.

image12

Step 20: Now we see the ping requests are getting timed out. We are unable to reach the server using the private IP under the VPC.

image13_1

Step 21: Let us connect the VPN using the OpenVPN and try to ping the same Private Server.

image14_1

Looking for free PowerShell scripts? Check out my PowerShell contributions under the Microsoft TechNet Script Center.

Azure – PowerShell in Azure Cloud Shell

Today we are looking into PowerShell in Azure Cloud Shell. This is still in public preview as of this writing.

If you are wondering why Microsoft would introduce a PowerShell console inside the Azure Cloud Shell, then have a look at the below features:

Features

Browser-based shell experience

Cloud Shell enables access to a browser-based command-line experience built with Azure management tasks in mind. Leverage Cloud Shell to work untethered from a local machine in a way only the cloud can provide.

Choice of preferred shell experience

Azure Cloud Shell gives you the flexibility of choosing the shell experience that best suits the way you work. Linux users can opt for a Bash experience, while Windows users can opt for PowerShell.

Pre-configured Azure workstation

Cloud Shell comes pre-installed with popular command-line tools and language support so you can work faster.
View the full tooling list for Bash experience and PowerShell experience.

Automatic authentication

Cloud Shell securely authenticates automatically on each session for instant access to your resources through the Azure CLI 2.0.

Connect your Azure File storage

Cloud Shell machines are temporary and as a result, require an Azure file share to be mounted as clouddrive to persist your $Home directory. On the first launch, Cloud Shell prompts to create a resource group, storage account, and file share on your behalf. This is a one-time step and will be automatically attached for all sessions. A single file share can be mapped and will be used by both Bash and PowerShell in Cloud Shell.

Below are some conditions that we have to remember:

Cloud Shell runs on a temporary machine provided on a per-session, per-user basis
Cloud Shell times out after 20 minutes without interactive activity
Cloud Shell can only be accessed with a file share attached
Cloud Shell uses the same file share for both Bash and PowerShell
Cloud Shell is assigned one machine per user account
Permissions are set as a regular Linux user (Bash)

Now that we have some background knowledge on the PowerShell in Cloud Shell, let us dig more into the usage of it:

To access the Cloud Shell, click on the PowerShell icon in the Azure portal:

image_1

Once you click on the icon, a pane opens at the bottom of the screen as shown below. You can choose from two options – Bash or PowerShell. Since we are interested in learning PowerShell in Cloud Shell, let us choose PowerShell as our desired option.

image_2

When you start it for the first time, the shell will configure an Azure file share; Cloud Shell machines are temporary and, as a result, require an Azure file share to be mounted as clouddrive to persist your $Home directory. Additionally, if you have multiple subscriptions, you will be asked to choose the subscription you want to work with.

image_3

Azure authentication, the resource group, storage account, and file share are created automatically, as shown below:

image_4

Testing an Azure command. Works perfectly.

image_5

If you are idle for more than 20 minutes, you will be kicked off the session, and you will have to start the session again:

image_6

Discovering the drives under PowerShell in Cloud Shell:

Now let us execute the Get-ChildItem cmdlet and see what we can find.

image_8

As we can see, running the Get-ChildItem in the current scope will list out the subscriptions that your account is associated with.

Traversing one step deeper into the directory, we can see the resources related to the subscription.

image_9

Let us get into the "StorageAccounts" directory to confirm that we see the list of storage accounts under the selected subscription:

image_10
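For reference, the traversal shown in the screenshots boils down to commands like the following, assuming the Azure: drive that the PowerShell experience in Cloud Shell starts you in; the subscription name is a placeholder.

# Hedged sketch of the Azure: drive traversal (subscription name is a placeholder)
Get-ChildItem Azure:                                    # lists the subscriptions for your account
Get-ChildItem 'Azure:\My Subscription'                  # lists the resource types under that subscription
Get-ChildItem 'Azure:\My Subscription\StorageAccounts'  # lists the storage accounts in it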

PowerShell cmdlets to manage PowerShell in Cloud Shell:

From the information below, we can see that Microsoft provides two cmdlets to work with the cloud shell.

image_12

Get-CloudDrive provides the details of the Azure file share that was created when Cloud Shell started. You may continue to use that file share; however, if you want a new one, you can dismount the current one using the Dismount-CloudDrive cmdlet and set up a replacement.

image_11

Note: Once you dismount the Azure file share, your current session will be restarted to set up a new cloud share.
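A quick example of the two cmdlets described above, run inside the Cloud Shell PowerShell session:

# Show the storage account and file share backing the clouddrive mount
Get-CloudDrive

# Dismount the current clouddrive; the session restarts and prompts you to set up a new file share
# Dismount-CloudDrive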

Assumption:

I am assuming that Microsoft is using container infrastructure to provide each session. You will get the below Windows path when you query for the temp directory:

C:\Users\ContainerAdministrator\AppData\Local\Temp

image_11
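For reference, one way to confirm that path from within the session is a plain .NET call (not specific to Cloud Shell):

# Returns the temp directory of the current user profile,
# e.g. C:\Users\ContainerAdministrator\AppData\Local\Temp in this session
[System.IO.Path]::GetTempPath()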

Note that the administrator is a "ContainerAdministrator." The session host could be a Windows Server VM or a Windows container. I am assuming it is a Windows container, since the underlying "image" comes pre-packaged with the tools below and is temporary, a typical use case for container technology.

image_13

 

If the content is valuable to you, do consider sharing it with your friends and colleagues.

Did I miss anything? Let me know in the comments section.

 

Download my PowerShell scripts for Free!