Author: Manjunath

Manjunath is an IT enthusiast with experience in cloud service platforms and PowerShell scripting. He blogs about public cloud platforms and scripting techniques.

PowerShell – Extract user list from Azure Active Directory to an Excel file

This script authenticates to your Azure Active Directory, fetches all the user details, and saves them to an Excel sheet.

Below is the link to the script:

https://gallery.technet.microsoft.com/scriptcenter/Extract-user-list-from-6cb9a93c

Below are the user attributes the script fetches:

1. Display Name

2. Object ID

3. Type

4. Principal Name

5. Role Name

6. Role Description

The Excel sheet is saved as: C:\AzureADUserList\AzureADUserList.xlsx

Pre-Requisites: This script needs the ‘MSOnline’ and ‘AzureRM’ PowerShell modules.
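If you just want to see the shape of the extraction, here is a minimal sketch (not the published script) that assumes the MSOnline module is installed and connected; the role name and description would additionally come from Get-MsolRole / Get-MsolRoleMember, and the export is to CSV here purely for brevity:

Connect-MsolService   # prompts for Azure AD credentials
$users = Get-MsolUser -All |
    Select-Object DisplayName, ObjectId, UserType, UserPrincipalName
# Export to CSV for brevity; the published script writes an .xlsx instead
$users | Export-Csv -Path 'C:\AzureADUserList\AzureADUserList.csv' -NoTypeInformation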

Click here to download my PowerShell scripts for Free !!

 

 


PowerShell – Generate Azure PaaS Inventory

This PowerShell script helps you maintain an inventory sheet of your Azure Platform-as-a-Service (PaaS) resources, so that you can refer to it anytime you want.

It also serves as a quick way to generate a report when your client wants a quick look at their PaaS services.

Below is the flow:

  • The script logs in to your Azure account and fetches the details of your Azure PaaS resources – Azure CDN and Azure WebApps.
  • It creates one worksheet for each Azure resource.
  • The user is prompted to select the subscription. PowerShell exception handling is implemented.
  • The user is prompted again once the script has finished running, asking whether they wish to view the Excel sheet.

** The script assumes you are using PowerShell v5.0 and have Excel available (basically, you should have MS Office installed).
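As a rough illustration of the subscription-selection and fetch steps, here is a minimal sketch using the AzureRM module (an illustration only, not the published script; the properties selected at the end are examples):

Login-AzureRmAccount   # interactive Azure login
# Let the user pick a subscription (the property may be SubscriptionId on older AzureRM versions)
$sub = Get-AzureRmSubscription | Out-GridView -Title 'Select a subscription' -PassThru
Select-AzureRmSubscription -SubscriptionId $sub.Id
$webApps = Get-AzureRmWebApp          # Azure WebApps
$cdnProfiles = Get-AzureRmCdnProfile  # Azure CDN profiles
$webApps | Select-Object Name, ResourceGroup, State, DefaultHostName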

Below is the link to the script:

https://gallery.technet.microsoft.com/scriptcenter/Azure-PaaS-inventory-using-d1872989

I have also written scripts to generate:

Azure IaaS Inventory

AWS IaaS Inventory

Click here to download my PowerShell scripts for Free !!

 

 

PowerShell – Install Nagios client on a remote Windows Server

As Windows Administrators, we need to install many tools on the Windows Server as part of the onboarding process. One such critical tool is Nagios, used for monitoring servers.

The onboarding process takes a heavy toll on the onboarding engineers when we are onboarding a new client, because we have to install tools on 50–100 servers. Manual installation on each server takes ~20 minutes, depending on the configuration.

The best solution to remove manual effort and human error, and to increase ROI, is to automate the process.

I have written a PowerShell script that does just that. You can download the script from my Microsoft Script Center repository:

Install Nagios client on a remote windows server

 

The script installs the Nagios client on a remote server. It copies the MSI and the INI file to the remote computer’s C drive and then executes the installer. Once the installation is complete, it copies the “nsclient.ini” file to the installation folder.

Pre-Requisites:

– The servers must be domain-joined.

– PowerShell remoting must be enabled on both servers.

Next Steps:

You can enhance the script so that it accepts a server list and executes against all the servers.

Tested on: Windows Server 2012 R2

Note: You may have to edit the script if the name of the MSI file changes. The script uses: NSCP-0.4.4.19-x64.msi
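The core of the approach looks roughly like the sketch below (a minimal illustration, not the published script; the server name and the NSClient++ install folder are assumptions):

$server = 'SERVER01'   # hypothetical target server
# Copy the installer and config file to the remote C drive via the admin share
Copy-Item -Path '.\NSCP-0.4.4.19-x64.msi', '.\nsclient.ini' -Destination "\\$server\c$\"
# Run the installer remotely, then drop the INI into the install folder
Invoke-Command -ComputerName $server -ScriptBlock {
    Start-Process msiexec.exe -ArgumentList '/i C:\NSCP-0.4.4.19-x64.msi /qn' -Wait
    Copy-Item 'C:\nsclient.ini' 'C:\Program Files\NSClient++\nsclient.ini' -Force
}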

 

Click here to download my PowerShell scripts for Free !!

 

 

PowerShell – List Azure Backup Items

Azure Backup is the Azure-based service you can use to back up (or protect) and restore your data in the Microsoft cloud. Azure Backup replaces your existing on-premises or off-site backup solution with a cloud-based solution that is reliable, secure, and cost-competitive.

Azure Backup offers multiple components that you download and deploy on the appropriate computer, server, or in the cloud. The component, or agent, that you deploy depends on what you want to protect.

All Azure Backup components (no matter whether you’re protecting data on-premises or in the cloud) can be used to back up data to a Recovery Services vault in Azure.

You might come across a need to automate the process of generating a report every day and share it with stakeholders to keep track of your backup details.

The PowerShell script lists the “Backup Items” from your Azure subscription and saves the data into an Excel file under the folder “C:\Backup_job_report.” The Excel file contains one worksheet per “Vault” that exists. The script expects you to provide a text file containing the list of Azure servers for which you want to fetch the “Backup Items.”

The details include:

1. VM Resource Name

2. VM Name

3. Recovery Vault Name

4. Last Backup Status

5. Latest Recovery Point

The script is uploaded to Microsoft Technet Script Center’s repository:

List Azure Backup Items using Powershell
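Conceptually, the script walks every Recovery Services vault and pulls the backup items for the Azure VM workload. A minimal sketch of that loop with the AzureRM.RecoveryServices cmdlets (an illustration, not the published script) looks like this:

Login-AzureRmAccount
foreach ($vault in Get-AzureRmRecoveryServicesVault) {
    Set-AzureRmRecoveryServicesVaultContext -Vault $vault
    $containers = Get-AzureRmRecoveryServicesBackupContainer -ContainerType AzureVM
    foreach ($container in $containers) {
        Get-AzureRmRecoveryServicesBackupItem -Container $container -WorkloadType AzureVM |
            Select-Object @{n='Vault';e={$vault.Name}}, Name, LastBackupStatus, LatestRecoveryPoint
    }
}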

You can also obtain this information from the Azure portal. Navigate as shown below:

Sign in to Azure Portal >> Search and select “Recovery Services vaults” >> Select a vault >> Click on “Backup Items” under protected items >> Click on “Azure Virtual Machines”.

 

[Screenshot: Backup Items view in the Azure portal]

 

Click here to download my PowerShell scripts for Free !!

 

 

IBM SoftLayer – List, Sync, and Download data from IBM COS S3 bucket

This blog can be treated as an extension of my blog on “Delete IBM COS S3 files older than X days.”

Today I will share PowerShell scripts to list, sync, and download data from an IBM COS S3 bucket.

The pre-requisite is to have the AWS CLI installed on the machine.

List all the objects in a bucket

  1. The script expects AccessKey, SecretKey, and BucketName.
  2. It creates a log file to log the flow of the script.
  3. Make sure you set the endpoint in the $endpoint variable. [This can also be made a parameter to make the script more dynamic.]
  4. The script then dynamically configures an AWS CLI profile named “test.”
  5. It executes the AWS CLI command and then un-sets the profile.
<#
This script will list all the objects in a bucket.
EXAMPLE:
.\list_backup_files.ps1 -AccessKey 'ZVS9wEvUUYSG8f' -SecretKey 'ENOpkewRzAoHGWnvulL1KbNOIRp7rpWidkV' -BucketName 'abc-abc-abc'
#>
param(
[String]$AccessKey,
[String]$SecretKey,
[String]$BucketName
)
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\log_list_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to log file #######
function log($string, $color){
if ($color -eq $null) {$color = "white"}
write-host $string -foregroundcolor $color
$temp = ": " + $string
$string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
$string += $temp
$string | out-file -Filepath $logfile -append
}
# Flag to track if command failed
$cmdError = $false
try {
# IBM COS S3 Endpoint
$cmdError = $false
$cmd = ""
$endpoint = "https://abc.abc.objstor.com"
$cmdError = $?
# Bucket Name
$cmd = ""
$bucket_name = "s3://" + $BucketName
$cmdError = $?
# Setting AWS Configure
log "Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id $AccessKey --profile test
aws configure set aws_secret_access_key $SecretKey --profile test
# AWS CLI command to list the files under bucket
$cmdError = $false
log "Listing files from IBM COS S3 bucket: "
log "====================================================================================="
$cmd = ""
aws --endpoint-url $endpoint s3 ls $bucket_name --recursive --human-readable --summarize --output json --profile test
$cmdError = $?
# Un-Setting AWS Configure
log "Un-Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id ' ' --profile test
aws configure set aws_secret_access_key ' ' --profile test
}catch{
if($cmdError -eq $false){
Write-Host "Command that failed was: " $cmd
}
}

 

Sync files between local location and IBM COS S3 bucket

The script expects a local directory path which has to be synced with the IBM COS S3 bucket.

<#
This script will sync files between a local directory and an IBM COS S3 bucket.
EXAMPLE:
.\sync_from_local_to_s3.ps1 -AccessKey ZVS9wEvUUYSG8f -SecretKey ENOpkewRzAoHGWnrpWidkV -BucketName abc-abc-abcc -InputPath "C:\manju"
#>
param(
[String]$AccessKey = '',
[String]$SecretKey = '',
[String]$BucketName = '',
[String]$InputPath = ""
)
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\sync_from_local_to_s3_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to log file #######
function log($string, $color){
if ($color -eq $null) {$color = "white"}
write-host $string -foregroundcolor $color
$temp = ": " + $string
$string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
$string += $temp
$string | out-file -Filepath $logfile -append
}
# Flag to track if command failed
$cmdError = $false
try {
log "Verifying the InputPath"
if(!(Test-Path $InputPath)){
Write-Host "Please specify a valid local directory path to sync."
}
# IBM COS S3 Endpoint
$cmdError = $false
$cmd = ""
$endpoint = "https://abc.abcc.abccc.com"
$cmdError = $?
# Bucket Name
$cmd = ""
$bucket_name = "s3://" + $BucketName
$cmdError = $?
# Setting AWS Configure
log "Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id $AccessKey --profile test
aws configure set aws_secret_access_key $SecretKey --profile test
# AWS CLI command to sync the local path with the bucket
$cmdError = $false
log "Sync files between local path and IBM COS S3 bucket: "
log "====================================================================================="
$cmd = "aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test"
aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test ## The sync target must be an <S3Uri> (s3://bucket); passing the bare bucket name throws an error.
$cmdError = $?
# Un-Setting AWS Configure
log "Un-Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id ' ' --profile test
aws configure set aws_secret_access_key ' ' --profile test
}catch{
if($cmdError -eq $false){
log "Command that failed was: "
$cmd | out-file $logfile -Append
}
}

 

Download files from IBM COS S3 bucket to local system

This script expects

  1. InputPath. This is a text file that contains the list of file names to be downloaded.
  2. OutputPath. This is an absolute folder path where the files will be downloaded.
  3. FileNameToDownload. You can alternatively specify file names (comma-separated) to be downloaded. This is useful when you have only 3–4 files to download; providing the file names directly is easier than creating a text file for a small number of files.
# This script will download backup files from IBM COS S3 bucket.
param(
[String]$AccessKey,
[String]$SecretKey,
[String]$BucketName,
[String]$InputPath, # Contains list of file names to download
[String]$OutputPath, # Files will be downloaded to this folder
[String]$FileNameToDownload
)
<#
EXAMPLE:
param(
[String]$AccessKey = "xxxxxxxxxxxxxxxxxxxxxxxxxx",
[String]$SecretKey = "xxxxxxxxxxxxxxxxxxxxxx",
[String]$BucketName = "abc11-abc-abcc",
[String]$InputPath = "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt",
[String]$OutputPath = "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
)
.\download_backup_file.ps1 -AccessKey ZVS9wSG8f -SecretKey ENOpkewOIRp7rpWidkV -BucketName abc11-abc-abc -InputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt" -OutputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
#>
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\log_download_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to log file #######
function log($string, $color){
if ($color -eq $null) {$color = "white"}
write-host $string -foregroundcolor $color
$temp = ": " + $string
$string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
$string += $temp
$string | out-file -Filepath $logfile -append
}
# Flag to track if command failed
$cmdError = $false
try {
# IBM COS S3 Endpoint
$endpoint = "https://abc.abc.abc.com"
# Bucket Name
$bucket_name = "s3://" + $BucketName
# Setting AWS Configure
log "Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id $AccessKey --profile test
aws configure set aws_secret_access_key $SecretKey --profile test
### Use the below logic to download files if a text file is provided ###
if(![string]::IsNullOrEmpty($InputPath)){
# Verifying the input path
log "Verifying the InputPath"
if(!(Test-Path $InputPath)){
Write-Host "Please specify a text file containing list of files to download."
}
$files_to_download = Get-Content $InputPath
$cmdError = $false
log "Downloading files from S3 to specified output directory: "
log "====================================================================================="
cd $OutputPath
foreach($files_to_download_iterator in $files_to_download){
$source_file = $files_to_download_iterator.ToString()
$destination_file = $files_to_download_iterator.ToString()
$cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file"
aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file
$cmdError = $?
}
}
### Use the below logic to download if the file names are provided individually via "$FileNameToDownload" argument ###
if(![string]::IsNullOrEmpty($FileNameToDownload)){
$files_to_download = $FileNameToDownload.split(",")
$cmdError = $false
log "Downloading files from S3 to specified output directory: "
log "====================================================================================="
cd $OutputPath
foreach($files_to_download_iterator in $files_to_download){
$source_file = $files_to_download_iterator.ToString()
$destination_file = $files_to_download_iterator.ToString()
$cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file"
aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file
$cmdError = $?
}
}
# Un-Setting AWS Configure
log "Un-Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id ' ' --profile test
aws configure set aws_secret_access_key ' ' --profile test
}catch{
if($cmdError -eq $false){
log "Command that failed was: "
log $cmd
}
}

 

Click here to download my PowerShell scripts for Free !!

PowerShell – Delete IBM SoftLayer COS S3 files older than X days

IBM SoftLayer uses many types of storage to store data, one of which is IBM Cloud Object Storage (COS), which exposes an S3-compatible API. I have written a simple PowerShell script to delete IBM COS S3 files older than X days.

The script uses AWS CLI commands, so you will need the AWS CLI installed on the Windows machine from which you will run the script. The script can also be scheduled as a task to run every day.
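The approach, roughly, is to list the objects with their timestamps and remove those older than the cutoff. A minimal sketch (not the published script; the endpoint and bucket below are placeholders) could look like this:

$cutoff = (Get-Date).AddDays(-30)   # X = 30 days in this example
$objects = aws --endpoint-url https://example.objstor.com s3 ls s3://my-bucket --recursive
foreach ($line in $objects) {
    # Each line looks like: "2018-01-15 10:22:33       1234 path/to/file"
    $parts = $line -split '\s+', 4
    $modified = [datetime]"$($parts[0]) $($parts[1])"
    if ($modified -lt $cutoff) {
        aws --endpoint-url https://example.objstor.com s3 rm "s3://my-bucket/$($parts[3])"
    }
}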

Delete IBM COS S3 files older than X days

Click here to download my PowerShell scripts for Free !!

 

Azure – First look into Cloudyn (Azure’s cost management service)

This is a continuation of my first blog on Cloudyn. Below is the link to my first blog; it guides you through registering for Cloudyn if you have an Azure subscription.

AZURE – COST MANAGEMENT BETTER THAN EVER USING CLOUDYN (REGISTRATION)

In this post, let us take a first look into Cloudyn’s cost management console and the features it offers.

Once you have completed the registration with Cloudyn as explained in my first blog (link shared above), you can access Cloudyn’s cost management portal by:

  1. Log in to the Azure portal
  2. Navigate to “Cost Management + Billing”
  3. Click on the “Go to Cost Management” button
  4. A new window will open with the URL – https://azure.cloudyn.com/dashboard#/tool/enterprise_dashboard

Once you navigate to the URL, below is the page you see. This is Cloudyn’s Management Dashboard.

[Screenshot: Cloudyn Management Dashboard]

 

Below is the Projected Annual Cost. I have used only Azure Storage and Azure Network; if you have used more services, their costs will also be projected here.

[Screenshot: Projected Annual Cost]

 

We can also view the Current and Previous Month Projected Cost. This is very useful for spotting infrastructure changes that are costing more than usual.

[Screenshot: Current and Previous Month Projected Cost]

 

The “Actual Cost Analysis” view breaks down cost by Services, Providers, and Accounts. The graph can be further customized: Groups (highlighted below) provide plenty of options that we can select as per our requirement, and Actions (also highlighted below) let you choose what to do with the report – save, copy, export, and so on.

[Screenshot: Actual Cost Analysis]

 

The “Actual Cost Over Time” view lets you pull up a report to analyze the cost over a range of time.

[Screenshot: Actual Cost Over Time]

 

Cloudyn also offers an “Alert Management” feature that alerts you when certain thresholds are crossed, as per the alert’s configuration.

[Screenshot: Alert Management]

 

“Cost vs. Threshold” is one of many alert types. As you can see, Cloudyn offers many options to customize the alert policy. Alerts are sent to an email address.

[Screenshot: Cost vs. Threshold alert configuration]

 

Finally, the last notable feature is “Data Transfer.” You can either analyze data transfer usage or view the trend of data transfer over time.

[Screenshot: Data Transfer analysis]

 

These are the features that I found most valuable.

If you find any other feature worth mentioning, let me know in the comments section.

If the content of this blog is valuable to you, do consider sharing it with your friends and colleagues.

Click here to download my PowerShell scripts for Free !!