Azure – Server Inventory solution

This blog post is dedicated to IT operations teams and administrators who manage cloud infrastructure. The recommended practice when providing managed services to any client is to maintain a CMDB (Configuration Management Database) that tracks the list of servers we manage for the client, along with their details.

However, given the dynamic nature of the cloud, maintaining such a database is difficult. Manually updating the server inventory is tedious and error-prone. The only practical solution is to automate the process.

Below is my solution:

The PowerShell script extracts virtual machines and their details. In this particular case, the script only considers virtual machines carrying the tag (‘owner’, ‘Manju’), because I want to manage only the virtual machines owned by me. Feel free to change the script if you have a different requirement.

Next, the script writes the data into an Azure table. Remember that the Azure table has to be created before running the script; a quick way to do that is shown below. Another option would be Azure Cosmos DB.
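For reference, creating that table up front takes only a couple of cmdlets (a minimal sketch; the resource group, storage account, and table names are placeholders):

# One-time setup: create the target table before the inventory script runs.
# Resource group, storage account and table names below are placeholders.
Login-AzureRmAccount
$ctx = (Get-AzureRmStorageAccount -ResourceGroupName 'my-rg' -Name 'mystorageaccount').Context
New-AzureStorageTable -Name 'ServerInventory' -Context $ctx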

Next, you can upload this script to your Azure Automation account or a dedicated Windows server, and schedule it to run every hour to keep the server inventory current.
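If you go the Azure Automation route, the hourly schedule can be created and linked to the runbook roughly like this (a sketch; the resource group, Automation account, runbook, and schedule names are placeholders):

# Create an hourly schedule and attach it to the inventory runbook (names are placeholders).
New-AzureRmAutomationSchedule -ResourceGroupName 'automation-rg' `
    -AutomationAccountName 'my-automation-account' `
    -Name 'HourlyInventory' `
    -StartTime (Get-Date).AddMinutes(10) `
    -HourInterval 1

Register-AzureRmAutomationScheduledRunbook -ResourceGroupName 'automation-rg' `
    -AutomationAccountName 'my-automation-account' `
    -RunbookName 'ServerInventory' `
    -ScheduleName 'HourlyInventory'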

The script uses cmdlets from the “AzureRmStorageTable” PowerShell module.

Execute “Install-Module AzureRmStorageTable” to install the module.

Note: You will have to alter the login portion of the script when you schedule it. The login mechanism is different for Azure Automation and for Task Scheduler on a Windows server. The script below assumes it is executed directly (manually) from a PowerShell console or PowerShell ISE.
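For reference, the Azure Automation variant of the login typically looks like this (a minimal sketch, assuming the default Run As connection created with the Automation account):

# Azure Automation runbook login (assumes the default "AzureRunAsConnection").
$connection = Get-AutomationConnection -Name 'AzureRunAsConnection'
Add-AzureRmAccount -ServicePrincipal `
    -TenantId $connection.TenantId `
    -ApplicationId $connection.ApplicationId `
    -CertificateThumbprint $connection.CertificateThumbprint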

 

Script:

# Author: Manjunath Rao
# Date: February 13, 2018

# Requires the AzureRmStorageTable module: Install-Module AzureRmStorageTable

# Login to Azure
Login-AzureRmAccount

## Create the Azure table storage context
$azure_table_storage_account_name = "xxx"
$azure_table_name = "xxx"
$azure_table_partitionKey = "xxx"
$azure_table_rowkey = "xxx"

$azure_table_resource_group = "xxx"

$storage_account_context = (Get-AzureRmStorageAccount -ResourceGroupName $azure_table_resource_group -Name $azure_table_storage_account_name).Context

$azure_table_object = Get-AzureStorageTable -Name $azure_table_name -Context $storage_account_context

############################################

# Get all the resource groups
$resource_group_list = Get-AzureRmResourceGroup

# Iterate through the resource groups
foreach ($resource_group_list_iterator in $resource_group_list) {

    # Since the solution applies to virtual machines,
    # obtain the list of virtual machines in the resource group
    $virtual_machine_list = Get-AzureRmVM -ResourceGroupName $resource_group_list_iterator.ResourceGroupName

    # Proceed only when the resource group contains virtual machines
    if ($null -ne $virtual_machine_list) {

        # Iterate through the virtual machine list
        foreach ($virtual_machine_list_iterator in $virtual_machine_list) {

            # Create a unique ID by concatenating 'Resource Group name' and 'Virtual Machine name'
            $unique_id = $resource_group_list_iterator.ResourceGroupName + $virtual_machine_list_iterator.Name

            $tag_list = $virtual_machine_list_iterator.Tags

            $tag_list.GetEnumerator() | ForEach-Object {

                $partitionKey1 = $unique_id

                if ($_.Key -eq 'owner' -and $_.Value -eq 'manju') {

                    $virtual_machine_name = $virtual_machine_list_iterator.Name.ToString()
                    $virtual_machine_resource_group_name = $resource_group_list_iterator.ResourceGroupName.ToString()
                    $virtual_machine_location = $virtual_machine_list_iterator.Location.ToString()
                    $virtual_machine_size = $virtual_machine_list_iterator.HardwareProfile.VmSize.ToString()
                    $virtual_machine_operating_system = $virtual_machine_list_iterator.StorageProfile.ImageReference.Offer.ToString()

                    # Build the property bag for the table row
                    $hash = @{}
                    $hash.Add('VMName', $virtual_machine_name)
                    $hash.Add('ResourceGroup', $virtual_machine_resource_group_name)
                    $hash.Add('Location', $virtual_machine_location)
                    $hash.Add('VMSize', $virtual_machine_size)
                    $hash.Add('OperatingSystem', $virtual_machine_operating_system)

                    # Write the row into the Azure table
                    # ($partitionKey1, computed above, could be used instead of the static "CA1")
                    Add-StorageTableRow -table $azure_table_object -partitionKey ("CA1") -rowKey ([guid]::NewGuid().ToString()) -property $hash
                }
            }
        }
    }
}

 

On the other hand, if you would just like to fetch inventory details and save them to an Excel sheet, I have scripts that do exactly that:

https://manjunathrao.com/2017/12/04/powershell-generte-azure-paas-inventory/

https://manjunathrao.com/2016/12/30/powershell-generate-azure-inventory/

https://manjunathrao.com/2017/04/06/powershell-generate-aws-inventory/

 

Click here to download my PowerShell scripts for Free !!

 

 

 

PowerShell – Extract user list from Azure Active Directory to an Excel file

This script authenticates to your Azure Active Directory and fetches the details of all users. Finally, it saves the details to an Excel sheet.

Below is the link to the script:

https://gallery.technet.microsoft.com/scriptcenter/Extract-user-list-from-6cb9a93c

Below are the user attributes the script fetches:

1. Display Name

2. Object ID

3. Type

4. Principal Name

5. Role Name

6. Role Description

The Excel sheet is saved as: C:\AzureADUserList\AzureADUserList.xlsx

Pre-requisites: This script needs the ‘MSOnline’ and ‘AzureRM’ PowerShell modules.
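For a rough idea of what the script does under the hood, here is a minimal sketch using the MSOnline module. It exports a CSV for brevity, whereas the published script produces the .xlsx above and also resolves role names and descriptions:

# Minimal sketch, assuming the MSOnline module and an interactive login.
Import-Module MSOnline
Connect-MsolService

$users = foreach ($user in Get-MsolUser -All) {
    [PSCustomObject]@{
        DisplayName   = $user.DisplayName
        ObjectId      = $user.ObjectId
        Type          = $user.UserType
        PrincipalName = $user.UserPrincipalName
    }
}

New-Item -ItemType Directory -Path 'C:\AzureADUserList' -Force | Out-Null
$users | Export-Csv -Path 'C:\AzureADUserList\AzureADUserList.csv' -NoTypeInformation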

Click here to download my PowerShell scripts for Free !!

 

 

PowerShell – Generate Azure PaaS Inventory

This PowerShell script helps you maintain an inventory sheet of your Azure Platform-as-a-Service (PaaS) resources, so that you can refer to it anytime you want.

It also serves as a quick way to generate a report when your client wants a quick look at their PaaS services.

Below is the flow:

  • The script logs in to your Azure account and fetches the details of your Azure PaaS resources – Azure CDN and Azure Web Apps.
  • It creates one worksheet for each Azure resource.
  • The user is prompted to select the subscription. PowerShell exception handling is implemented.
  • The user is prompted again, asking whether to open the Excel sheet once the script has finished running.

** The script assumes you are using PowerShell v5.0 and have the Excel COM objects available (basically, you should have MS Office installed).
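For illustration, here is a rough sketch of that flow using AzureRM cmdlets and the Excel COM object (subscription selection and exception handling are trimmed down, and only the Web Apps worksheet is shown):

# Minimal sketch, assuming the AzureRM modules and a locally installed Excel.
Login-AzureRmAccount

# Prompt for the subscription to inventory.
# (On older AzureRM versions the property is SubscriptionId instead of Id.)
$subscription = Get-AzureRmSubscription | Out-GridView -Title 'Select a subscription' -PassThru
Select-AzureRmSubscription -SubscriptionId $subscription.Id

$webApps = Get-AzureRmWebApp
$cdnProfiles = Get-AzureRmCdnProfile   # the CDN worksheet would be built the same way

# Drive Excel through COM: one worksheet per resource type.
$excel = New-Object -ComObject Excel.Application
$workbook = $excel.Workbooks.Add()
$sheet = $workbook.Worksheets.Item(1)
$sheet.Name = 'WebApps'
$sheet.Cells.Item(1, 1) = 'Name'
$sheet.Cells.Item(1, 2) = 'ResourceGroup'
$sheet.Cells.Item(1, 3) = 'State'

$row = 2
foreach ($app in $webApps) {
    $sheet.Cells.Item($row, 1) = $app.Name
    $sheet.Cells.Item($row, 2) = $app.ResourceGroup
    $sheet.Cells.Item($row, 3) = $app.State
    $row++
}

$workbook.SaveAs("$env:USERPROFILE\Desktop\Azure_PaaS_Inventory.xlsx")
$excel.Quit()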

Below is the link to the script:

https://gallery.technet.microsoft.com/scriptcenter/Azure-PaaS-inventory-using-d1872989

I have also written script to generate:

Azure IaaS Inventory

AWS IaaS Inventory

Click here to download my PowerShell scripts for Free !!

 

 

PowerShell – Install Nagios client on a remote Windows Server

As Windows administrators, we need to install many tools on Windows servers as part of the onboarding process. One such critical tool is Nagios, which is used for monitoring servers.

The onboarding process takes a heavy toll on the onboarding engineers when we onboard a new client, because we have to install tools on anywhere from 50 to several hundred servers. Each manual installation takes around 20 minutes, depending on the configuration.

The best way to remove manual effort, eliminate human error, and increase ROI is to automate the process.

I have written a PowerShell script that does just that. You can download the script from my Microsoft Script Center repository:

Install Nagios client on a remote windows server

 

The script installs the Nagios client on a remote server. It copies the MSI and the INI file to the remote computer’s C: drive and then executes the installer. Once the installation completes, it copies the “nsclient.ini” file into the installation folder.

Pre-Requisites:

– The servers must be domain joined.

– PowerShell remoting must be enabled on both servers.

Next Steps:

You can enhance the script to accept a server list and execute against all of those servers.

Tested on: Windows Server 2012 R2

Note: You may have to edit the script if you are changing the name of the MSI file. The script uses: NSCP-0.4.4.19-x64.msi
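The core of the approach looks roughly like this (a minimal sketch; the NSClient++ install folder and file locations are assumptions, and the published script adds error handling and parameters):

# Minimal sketch of the remote install flow. Paths and the install folder are assumptions.
param(
    [string]$ComputerName,
    [string]$MsiName = 'NSCP-0.4.4.19-x64.msi'
)

# Copy the installer and its configuration file to the remote C: drive (admin share).
Copy-Item ".\$MsiName", '.\nsclient.ini' -Destination "\\$ComputerName\c$\"

# Install silently on the remote server, then drop the INI into the install folder.
Invoke-Command -ComputerName $ComputerName -ScriptBlock {
    param($msi)
    Start-Process msiexec.exe -ArgumentList "/i C:\$msi /qn" -Wait
    Copy-Item 'C:\nsclient.ini' 'C:\Program Files\NSClient++\nsclient.ini' -Force
} -ArgumentList $MsiName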

 

Click here to download my PowerShell scripts for Free !!

 

 

PowerShell – List Azure Backup Items

Azure Backup is the Azure-based service you can use to back up (or protect) and restore your data in the Microsoft cloud. Azure Backup replaces your existing on-premises or off-site backup solution with a cloud-based solution that is reliable, secure, and cost-competitive.

Azure Backup offers multiple components that you download and deploy on the appropriate computer, server, or in the cloud. The component, or agent that you deploy depends on what you want to protect.

All Azure Backup components (no matter whether you’re protecting data on-premises or in the cloud) can be used to back up data to a Recovery Services vault in Azure.

You might come across a need to automate the process of generating a report every day and share it with stakeholders to keep track of your backup details.

The PowerShell script will list the “Backup Items” from your Azure subscription and save the data into an Excel file under the folder “C:\Backup_job_report.” The Excel file will contain one worksheet for each “Vault” that exists. The script expects you to provide a text file containing the list of Azure servers for which you want to fetch the “Backup Items.” A minimal sketch of the cmdlets involved follows the list of details below.

The details include:

1. VM Resource Name

2. VM Name

3. Recovery Vault Name

4. Last Backup Status

5. Latest Recovery Point
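Sketch of how those details are collected (assumes the AzureRM.RecoveryServices modules; the Excel export and the server-list filtering from the published script are omitted):

# Minimal sketch: enumerate vaults and list their Azure VM backup items.
Login-AzureRmAccount

foreach ($vault in Get-AzureRmRecoveryServicesVault) {
    # Point the Recovery Services cmdlets at this vault.
    Set-AzureRmRecoveryServicesVaultContext -Vault $vault

    $containers = Get-AzureRmRecoveryServicesBackupContainer -ContainerType AzureVM
    foreach ($container in $containers) {
        Get-AzureRmRecoveryServicesBackupItem -Container $container -WorkloadType AzureVM |
            Select-Object @{n = 'RecoveryVaultName'; e = { $vault.Name }},
                          @{n = 'VMName'; e = { $container.FriendlyName }},
                          LastBackupStatus,
                          LatestRecoveryPoint
    }
}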

The script is uploaded to Microsoft Technet Script Center’s repository:

List Azure Backup Items using Powershell

You can obtain this information from the Azure portal. Please traverse as shown below:

Sign in to Azure Portal >> Search and select “Recovery Services vaults” >> Select a vault >> Click on “Backup Items” under protected items >> Click on “Azure Virtual Machines”.

 


 

Click here to download my PowerShell scripts for Free !!

 

 

IBM SoftLayer – List, Sync, and Download data from IBM COS S3 bucket

This blog can be treated as an extension to my blog on “Delete IBM COS S3 files older than X days”

Today I shall be sharing PowerShell scripts to list, sync, and download data from an IBM COS S3 bucket.

The pre-requisite is to have the AWS CLI installed on the machine.

List all the objects in a bucket

  1. The script will expect AccessKey, SecretKey, and BucketName.
  2. It creates a log file to log the flow of the script.
  3. Make sure you set the endpoint in the $endpoint variable. [This could also be made a parameter to make the script more dynamic.]
  4. The script then dynamically sets the “AWS CLI profile” with the profile name “test.”
  5. It executes the AWS CLI command and then un-sets the profile.
<#
This script will list all the objects in a bucket.
EXAMPLE:
.\list_backup_files.ps1 -AccessKey 'ZVS9wEvUUYSG8f' -SecretKey 'ENOpkewRzAoHGWnvulL1KbNOIRp7rpWidkV' -BucketName 'abc-abc-abc'
#>
param(
    [String]$AccessKey,
    [String]$SecretKey,
    [String]$BucketName
)

# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -Format yyyyMMdd_hhmmsstt
$logfile = $Loc.Path + "\log_list_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green

####### Function to write information to the log file #######
function log($string, $color) {
    if ($null -eq $color) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -Format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}

# Flag to track if a command failed
$cmdError = $false
try {
    # IBM COS S3 endpoint
    $cmdError = $false
    $cmd = ""
    $endpoint = "https://abc.abc.objstor.com"
    $cmdError = $?

    # Bucket name
    $cmd = ""
    $bucket_name = "s3://" + $BucketName
    $cmdError = $?

    # Setting AWS configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test

    # AWS CLI command to list the files in the bucket
    $cmdError = $false
    log "Listing files from IBM COS S3 bucket: "
    log "====================================================================================="
    $cmd = ""
    aws --endpoint-url $endpoint s3 ls $bucket_name --recursive --human-readable --summarize --output json --profile test
    $cmdError = $?

    # Un-setting AWS configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
} catch {
    if ($cmdError -eq $false) {
        Write-Host "Command that failed was: " $cmd
    }
}

 

Sync files between local location and IBM COS S3 bucket

The script expects a local directory path which has to be synced with the IBM COS S3 bucket.

<#
This script will sync a local directory with an IBM COS S3 bucket.
EXAMPLE:
.\sync_from_local_to_s3.ps1 -AccessKey ZVS9wEvUUYSG8f -SecretKey ENOpkewRzAoHGWnrpWidkV -BucketName abc-abc-abcc -InputPath "C:\manju"
#>
param(
    [String]$AccessKey = '',
    [String]$SecretKey = '',
    [String]$BucketName = '',
    [String]$InputPath = ''
)

# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -Format yyyyMMdd_hhmmsstt
$logfile = $Loc.Path + "\sync_from_local_to_s3_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green

####### Function to write information to the log file #######
function log($string, $color) {
    if ($null -eq $color) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -Format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}

# Flag to track if a command failed
$cmdError = $false
try {
    log "Verifying the InputPath"
    if (!(Test-Path $InputPath)) {
        Write-Host "Please specify a valid local directory to sync."
    }

    # IBM COS S3 endpoint
    $cmdError = $false
    $cmd = ""
    $endpoint = "https://abc.abcc.abccc.com"
    $cmdError = $?

    # Bucket name (the sync target must be an S3 URI, i.e. s3://bucket, not a bare bucket name)
    $cmd = ""
    $bucket_name = "s3://" + $BucketName
    $cmdError = $?

    # Setting AWS configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test

    # AWS CLI command to sync the local path with the bucket
    $cmdError = $false
    log "Sync files between local path and IBM COS S3 bucket: "
    log "====================================================================================="
    $cmd = "aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test"
    aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test
    $cmdError = $?

    # Un-setting AWS configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
} catch {
    if ($cmdError -eq $false) {
        log "Command that failed was: "
        $cmd | Out-File $logfile -Append
    }
}

 

Download files from IBM COS S3 bucket to local system

This script expects:

  1. InputPath. This is a text file that contains the list of file names to be downloaded.
  2. OutputPath. This is an absolute folder path where the files will be downloaded.
  3. FileNameToDownload. Alternatively, you can specify the file names (comma-separated) to be downloaded. This is useful when you only have 3-4 files to download, since providing the names directly is easier than creating a text file for a small number of files.
# This script will download backup files from an IBM COS S3 bucket.
param(
    [String]$AccessKey,
    [String]$SecretKey,
    [String]$BucketName,
    [String]$InputPath,          # Text file containing the list of file names to download
    [String]$OutputPath,         # Files will be downloaded to this folder
    [String]$FileNameToDownload  # Alternatively, a comma-separated list of file names
)
<#
EXAMPLE:
.\download_backup_file.ps1 -AccessKey ZVS9wSG8f -SecretKey ENOpkewOIRp7rpWidkV -BucketName abc11-abc-abc -InputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt" -OutputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
#>

# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -Format yyyyMMdd_hhmmsstt
$logfile = $Loc.Path + "\log_list_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green

####### Function to write information to the log file #######
function log($string, $color) {
    if ($null -eq $color) { $color = "white" }
    Write-Host $string -ForegroundColor $color
    $temp = ": " + $string
    $string = Get-Date -Format "yyyy.MM.dd hh:mm:ss tt"
    $string += $temp
    $string | Out-File -FilePath $logfile -Append
}

# Flag to track if a command failed
$cmdError = $false
try {
    # IBM COS S3 endpoint
    $endpoint = "https://abc.abc.abc.com"

    # Bucket name
    $bucket_name = "s3://" + $BucketName

    # Setting AWS configure
    log "Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id $AccessKey --profile test
    aws configure set aws_secret_access_key $SecretKey --profile test

    ### Use the below logic to download files if a text file is provided ###
    if ($InputPath) {
        # Verifying the input path
        log "Verifying the InputPath"
        if (!(Test-Path $InputPath)) {
            Write-Host "Please specify a text file containing the list of files to download."
        }
        $files_to_download = Get-Content $InputPath
        $cmdError = $false
        log "Downloading files from S3 to specified output directory: "
        log "====================================================================================="
        cd $OutputPath
        foreach ($files_to_download_iterator in $files_to_download) {
            $source_file = $files_to_download_iterator.ToString()
            $destination_file = $files_to_download_iterator.ToString()
            $cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file"
            aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file
            $cmdError = $?
        }
    }

    ### Use the below logic to download if the file names are provided individually via the $FileNameToDownload argument ###
    if ($FileNameToDownload) {
        $files_to_download = $FileNameToDownload.Split(",")
        $cmdError = $false
        log "Downloading files from S3 to specified output directory: "
        log "====================================================================================="
        cd $OutputPath
        foreach ($files_to_download_iterator in $files_to_download) {
            $source_file = $files_to_download_iterator.ToString()
            $destination_file = $files_to_download_iterator.ToString()
            $cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file"
            aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file
            $cmdError = $?
        }
    }

    # Un-setting AWS configure
    log "Un-Setting AWS Configure"
    aws configure set default.region ap-southeast-2 --profile test
    aws configure set aws_access_key_id ' ' --profile test
    aws configure set aws_secret_access_key ' ' --profile test
} catch {
    if ($cmdError -eq $false) {
        log "Command that failed was: "
        log $cmd
    }
}

 

Click here to download my PowerShell scripts for Free !!

PowerShell – Delete IBM SoftLayer COS S3 files older than X days

IBM SoftLayer uses many types of storage to store data. One of these is IBM Cloud Object Storage (COS), which exposes an S3-compatible API. I have written a simple PowerShell script to delete IBM COS S3 files older than X days.

The script uses AWS CLI commands, so you will need the AWS CLI installed on the Windows machine from which you run it. The script can also be scheduled as a daily task.
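The idea behind it is roughly the following (a minimal sketch; the parameter names, profile name, and date parsing are illustrative, and the published script is more robust):

# Minimal sketch: list the bucket, parse each object's timestamp, delete anything older than the cut-off.
param(
    [string]$AccessKey,
    [string]$SecretKey,
    [string]$BucketName,
    [string]$Endpoint,
    [int]$Days = 30
)

aws configure set aws_access_key_id $AccessKey --profile test
aws configure set aws_secret_access_key $SecretKey --profile test

$cutoff = (Get-Date).AddDays(-$Days)
$listing = aws --endpoint-url $Endpoint s3 ls "s3://$BucketName" --recursive --profile test

foreach ($line in $listing) {
    # Each line looks like: 2018-02-13 10:41:00   12345 path/to/file
    $parts = $line -split '\s+', 4
    $modified = [datetime]"$($parts[0]) $($parts[1])"
    if ($modified -lt $cutoff) {
        aws --endpoint-url $Endpoint s3 rm "s3://$BucketName/$($parts[3])" --profile test
    }
}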

Delete IBM COS S3 files older than X days

Click here to download my PowerShell scripts for Free !!

 

Azure – First look into Cloudyn (Azure’s cost management service)

This is a continuation of my first blog on Cloudyn; the link is below. It guides you through registering for Cloudyn if you have an Azure subscription.

AZURE – COST MANAGEMENT BETTER THAN EVER USING CLOUDYN (REGISTRATION)

In this post, let us take a first look at Cloudyn’s cost management console and the features Cloudyn offers.

Once you have completed the registration with Cloudyn as explained in my first blog (link shared above), you can access Cloudyn’s cost management portal as follows:

  1. Log into Azure portal
  2. Navigate to “Cost Management + Billing.”
  3. Click on the “Go to Cost Management” button
  4. A new window opens with the URL – https://azure.cloudyn.com/dashboard#/tool/enterprise_dashboard

Once you navigate to the URL, the page below is what you see. This is Cloudyn’s management dashboard.

image_7

 

Below is the Projected Annual Cost. I have used only Azure Storage and Azure Network; if you have used more services, their costs will also be projected here.

image_8

 

We can also view the Current and Previous Month Projected Cost. This is very useful for spotting infrastructure changes that are costing more than usual.

image_9

 

The “Actual Cost Analysis” view provides cost by Services, Providers, and Accounts. This graph can be customized further: Groups (highlighted below) provide plenty of options that we can check as per our requirements, and Actions (highlighted below) offer many choices for what to do with the report – Save, Copy, Export, and so on.

image_10

 

The “Actual Cost Over Time” lets you pull up a report to analyze the cost over a range of time.

image_11

 

Cloudyn also offers an “Alert Management” feature that alerts you when certain thresholds are crossed, as per the alert’s configuration.

image_13

 

“Cost vs. Threshold” is one of many alert types. As you can see, Cloudyn offers many options to customize the alert policy. Alerts are sent to an email address.

image_12

 

Finally, the last notable feature is “Data Transfer.” You can either analyze data transfer usage or view the data transfer trend.

image_14

 

These are the features that I felt were the most valuable.

If you found any other feature to be worth mentioning, let me know in the comments section.

If the content of this blog is valuable to you, do consider sharing with your friends and colleagues.

Click here to download my PowerShell scripts for Free !!

Azure – Cost Management Better Than Ever Using Cloudyn (Registration)

Microsoft’s acquisition of Cloudyn will help Azure customers manage and optimize their cloud usage. Read more about the acquisition here. A message from Sharon Wagner, CEO of Cloudyn.

About Cloudyn

Azure Cost Management by Cloudyn empowers organizations to monitor cloud spend, drive organizational accountability, and optimize cloud efficiency so they can accelerate future cloud investments with confidence.

Microsoft’s acquisition of Cloudyn will help Azure customers and partners as they face the challenges of growing their multi-cloud environments. It will enable them to gain visibility, understand and optimize cloud consumption, as well as accurately project future usage.

Microsoft will continue to support multi-cloud environments, including Azure, AWS, and GCP. Azure Cost Management by Cloudyn is available for free to customers and partners managing Azure spend. Additional premium capabilities are available at no cost through June 2018, after which they will become paid features.

Let us look at how to sign up for Cloudyn if you are an Azure customer.

Step 1: Log in to your Azure subscription via the Azure portal. Select the “Cost Management + Billing” blade, then select “Cost Management” from the options on the left-hand side of the pane.

Click on “Go to Cost Management”

image_1

 

Step 2: Once you click the “Go to Cost Management” button, you will be redirected to Cloudyn’s page to set up your cost management details.

Enter your organization name and the type of Azure access you have on your account. I have a personal subscription, so I have chosen “Azure Individual Subscription Owner.”

image_2

 

Step 3: The Cloudyn account name and Tenant ID will be populated automatically. Now, select the Offer ID from the drop-down list.

If you do not know your Offer ID, go back to the Azure portal and click on “Subscriptions”; that shows the type of subscription you have.

image3_1

 

Step 4: Click “Next”

image_4

 

Step 5: Click “Next”

image_5

Step 6: We are done with the Cloudyn registration. Cloudyn needs about two hours to collect the data.

image_6

 

If you find the content valuable, do consider sharing with your friends and colleagues.

Click here to download my PowerShell scripts for Free !!

AWS – Configure Point-to-Site connectivity in an AWS environment using OpenVPN

This document illustrates how to create a VPN using the OpenVPN AMI, which is available in the AWS Marketplace.

Below is the architecture we will achieve:

image1

We will set up an AWS VPC where we will launch our servers. One of the servers will be launched using the OpenVPN AMI. The other servers will be on public and private subnets within the VPC.

The steps to achieve this are below:

Step 1: Log in to the AWS Console and choose VPC.

Step 2: Select Service – VPC

Step 3: Create a VPC. Provide the Name Tag and CIDR details. Also, create the subnet under this VPC.

image2

Step 4: Now, let us go to the EC2 service; we will come back to the VPC later.

Step 5: In the EC2 service, launch an EC2 instance from the AWS Marketplace using the OpenVPN AMI. You can assign an Elastic IP if necessary.

image3

Step 6: Launch this instance with all the other necessary configuration. The AMI comes with default security groups; keep them as they are and launch it. Note – this instance does not need to be launched in the VPC created earlier.

Step 7: Launch two more instances under the VPC created earlier – one with a public IP and the other with a private IP.

Step 8: Now let us go back to the VPC service. We need to create an Internet Gateway and a route table. Click on VPC in the console and go to Internet Gateways. Click on Create Internet Gateway and provide a suitable Name tag, then attach this gateway to the VPC we created earlier using the option “Attach to VPC”.

image4

Step 9: Now, go to Route Tables and click on Create Route Table. Update the routes to include 0.0.0.0/0 (open to the internet) pointing at the IGW (Internet Gateway) that was created.

image5

Step 10: Update the subnet association with the subnet created earlier under the VPC.

image6
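For reference, the VPC plumbing from steps 3, 8, 9, and 10 can also be scripted with the AWS CLI – a minimal sketch (the CIDR ranges are examples, and each resource ID is captured from the previous call):

# Illustrative AWS CLI equivalent of the VPC, internet gateway and route table setup.
$vpcId    = aws ec2 create-vpc --cidr-block 10.0.0.0/16 --query 'Vpc.VpcId' --output text
$subnetId = aws ec2 create-subnet --vpc-id $vpcId --cidr-block 10.0.1.0/24 --query 'Subnet.SubnetId' --output text
$igwId    = aws ec2 create-internet-gateway --query 'InternetGateway.InternetGatewayId' --output text
aws ec2 attach-internet-gateway --internet-gateway-id $igwId --vpc-id $vpcId
$rtbId    = aws ec2 create-route-table --vpc-id $vpcId --query 'RouteTable.RouteTableId' --output text
aws ec2 create-route --route-table-id $rtbId --destination-cidr-block 0.0.0.0/0 --gateway-id $igwId
aws ec2 associate-route-table --route-table-id $rtbId --subnet-id $subnetId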

Step 11: This completes the configuration on the AWS side. Now we need to configure OpenVPN.

Step 12: SSH to the OpenVPN box using the username – openvpnas

Step 13: Agree to the terms by typing “yes”, and keep all the other settings at their defaults by pressing Enter. By default, the admin username is “openvpn”. Update its password with the command shown in the screenshot below: sudo passwd openvpn.

image7_1

Step 14: Now go to the browser and type the IP address of the OpenVPN box. Ex: https://ipaddress

Step 15: Log in with the password you set earlier on the terminal.

image8

Step 16: Download the OpenVPN client for your operating system and install it.

image9

Step 17: Click on the client icon, select the OpenVPN server, and click Connect. Once connected, you will be able to reach the servers within the VPC.

image10

Step 18: Now let us ping the public IP and see the result. We were successful in pinging the public server under the VPC.

image11_1

Step 19: Now let us disconnect the OpenVPN and try to connect to the private server.

image12

Step 20: Now we see the ping requests are getting timed out. We are unable to reach the server using the private IP under the VPC.

image13_1

Step 21: Let us reconnect the VPN using OpenVPN and try to ping the same private server. This time the ping succeeds.

image14_1

Looking for free PowerShell scripts? Check out my PowerShell contributions on the Microsoft TechNet Script Center.