IBM Cloud Object Storage

IBM SoftLayer – List, Sync, and Download data from an IBM COS S3 bucket

This post can be treated as an extension of my earlier post, "Delete IBM COS S3 files older than X days".

Today I shall be sharing PowerShell scripts to list, sync, and download data from an IBM COS S3 bucket.

The prerequisite is to have the AWS CLI installed on the machine, since IBM COS exposes an S3-compatible API.

List all the objects in a bucket

  1. The script expects AccessKey, SecretKey, and BucketName parameters.
  2. It creates a log file to record the flow of the script.
  3. Make sure you set the endpoint in the $endpoint variable. [This can also be made a parameter to make the script more dynamic.]
  4. The script then dynamically sets an AWS CLI profile named "test".
  5. It executes the AWS CLI command and then un-sets the profile.
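The command the script assembles can be sketched outside PowerShell as well. A minimal Python sketch that builds (but does not execute) the equivalent `aws s3 ls` invocation; the endpoint and bucket values are placeholders, and `build_ls_command` is a hypothetical helper, not part of the script:

```python
# Sketch: assemble the `aws s3 ls` command the script runs (placeholders, not executed).
def build_ls_command(endpoint, bucket, profile="test"):
    """Return the argv list for listing a COS S3 bucket via the AWS CLI."""
    return [
        "aws", "--endpoint-url", endpoint,
        "s3", "ls", f"s3://{bucket}",
        "--recursive", "--human-readable", "--summarize",
        "--output", "json",
        "--profile", profile,
    ]

cmd = build_ls_command("https://abc.abc.objstor.com", "abc-abc-abc")
print(" ".join(cmd))
```

Passing the argv as a list (rather than one string) avoids shell-quoting issues if you later hand it to a process runner.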
<#
This script will list all the objects in a bucket.
EXAMPLE:
.\list_backup_files.ps1 -AccessKey 'ZVS9wEvUUYSG8f' -SecretKey 'ENOpkewRzAoHGWnvulL1KbNOIRp7rpWidkV' -BucketName 'abc-abc-abc'
#>
param(
[String]$AccessKey,
[String]$SecretKey,
[String]$BucketName
)
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\log_list_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to the log file #######
function log($string, $color){
if ($color -eq $null) {$color = "white"}
write-host $string -foregroundcolor $color
$temp = ": " + $string
$string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
$string += $temp
$string | out-file -Filepath $logfile -append
}
# Flag to track whether a command failed
$cmdError = $false
try {
# IBM COS S3 Endpoint
$endpoint = "https://abc.abc.objstor.com"
# Bucket Name
$bucket_name = "s3://" + $BucketName
# Setting AWS Configure
log "Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id $AccessKey --profile test
aws configure set aws_secret_access_key $SecretKey --profile test
# AWS CLI command to list the files under the bucket
log "Listing files from IBM COS S3 bucket: "
log "====================================================================================="
$cmd = "aws --endpoint-url $endpoint s3 ls $bucket_name --recursive --human-readable --summarize --output json --profile test"
aws --endpoint-url $endpoint s3 ls $bucket_name --recursive --human-readable --summarize --output json --profile test
$cmdError = -not $?
# Un-Setting AWS Configure
log "Un-Setting AWS Configure"
aws configure set aws_access_key_id ' ' --profile test
aws configure set aws_secret_access_key ' ' --profile test
}catch{
if($cmdError){
Write-Host "Command that failed was: " $cmd
}
}

 

Sync files between local location and IBM COS S3 bucket

The script expects a local directory path that has to be synced with the IBM COS S3 bucket.
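One pitfall worth noting: `aws s3 sync` expects an S3 URI (`s3://bucket`) as the destination, not a bare bucket name. A minimal Python sketch of assembling the sync invocation correctly (`build_sync_command` is a hypothetical helper; endpoint and bucket are placeholders):

```python
# Sketch: `aws s3 sync` needs an S3 URI (s3://bucket), not a bare bucket name.
def build_sync_command(endpoint, local_path, bucket, profile="test"):
    """Return the argv list that syncs a local folder up to a COS S3 bucket."""
    s3_uri = f"s3://{bucket}"  # a bare bucket name is rejected by `s3 sync`
    return [
        "aws", "--endpoint-url", endpoint,
        "s3", "sync", local_path, s3_uri,
        "--profile", profile,
    ]

cmd = build_sync_command("https://abc.abcc.abccc.com", r"C:\manju", "abc-abc-abcc")
```

Building the `s3://` prefix once, up front, is what lets the bucket stay a plain parameter instead of being hardcoded into the command.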

<#
This script will sync a local directory with an IBM COS S3 bucket.
EXAMPLE:
.\sync_from_local_to_s3.ps1 -AccessKey ZVS9wEvUUYSG8f -SecretKey ENOpkewRzAoHGWnrpWidkV -BucketName abc-abc-abcc -InputPath "C:\manju"
#>
param(
[String]$AccessKey = '',
[String]$SecretKey = '',
[String]$BucketName = '',
[String]$InputPath = ''
)
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\sync_from_local_to_s3_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to the log file #######
function log($string, $color){
if ($color -eq $null) {$color = "white"}
write-host $string -foregroundcolor $color
$temp = ": " + $string
$string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
$string += $temp
$string | out-file -Filepath $logfile -append
}
# Flag to track whether a command failed
$cmdError = $false
try {
log "Verifying the InputPath"
if(!(Test-Path $InputPath)){
Write-Host "Please specify a valid local directory path to sync."
exit 1
}
# IBM COS S3 Endpoint
$endpoint = "https://abc.abcc.abccc.com"
# Bucket Name (s3 sync expects an S3 URI, not a bare bucket name)
$bucket_name = "s3://" + $BucketName
# Setting AWS Configure
log "Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id $AccessKey --profile test
aws configure set aws_secret_access_key $SecretKey --profile test
# AWS CLI command to sync the local path with the bucket
log "Sync files between local path and IBM COS S3 bucket: "
log "====================================================================================="
$cmd = "aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test"
aws --endpoint-url $endpoint s3 sync $InputPath $bucket_name --profile test
$cmdError = -not $?
# Un-Setting AWS Configure
log "Un-Setting AWS Configure"
aws configure set aws_access_key_id ' ' --profile test
aws configure set aws_secret_access_key ' ' --profile test
}catch{
if($cmdError){
log "Command that failed was: "
$cmd | out-file $logfile -Append
}
}

 

Download files from IBM COS S3 bucket to local system

This script expects:

  1. InputPath. This is a text file that contains the list of file names to be downloaded.
  2. OutputPath. This is an absolute directory path where the files will be downloaded.
  3. FileNameToDownload. Alternatively, you can specify the file names (comma-separated) to be downloaded. This is useful when you only have a few files, since providing the names directly is easier than creating a text file.
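The two input modes above boil down to producing one list of file names. A minimal Python sketch of that normalization (hypothetical helper names; the sample file names are placeholders):

```python
# Sketch: normalize the two input modes into one list of file names.
def names_from_arg(comma_separated):
    """Split a comma-separated argument like 'a.bak,b.bak' into clean names."""
    return [name.strip() for name in comma_separated.split(",") if name.strip()]

def names_from_file(lines):
    """Take the lines of the InputPath text file (one file name per line)."""
    return [line.strip() for line in lines if line.strip()]

files = names_from_arg("full.bak, diff.bak,log.trn")
```

Stripping whitespace matters because users tend to put a space after each comma, and an object key with a leading space will not match anything in the bucket.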
# This script will download backup files from an IBM COS S3 bucket.
param(
[String]$AccessKey,
[String]$SecretKey,
[String]$BucketName,
[String]$InputPath, # Text file containing the list of file names to download
[String]$OutputPath, # Files will be downloaded to this folder
[String]$FileNameToDownload # Alternatively, comma-separated file names
)
<#
EXAMPLE:
param(
[String]$AccessKey = "xxxxxxxxxxxxxxxxxxxxxxxxxx",
[String]$SecretKey = "xxxxxxxxxxxxxxxxxxxxxx",
[String]$BucketName = "abc11-abc-abcc",
[String]$InputPath = "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt",
[String]$OutputPath = "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
)
.\download_backup_file.ps1 -AccessKey ZVS9wSG8f -SecretKey ENOpkewOIRp7rpWidkV -BucketName abc11-abc-abc -InputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\file_list_2_download.txt" -OutputPath "C:\Users\manjunath\Desktop\SQL Backup scripts\Output_Path"
#>
# Setting up the log file
$Loc = Get-Location
$Date = Get-Date -format yyyyMMdd_hhmmsstt
$logfile = $Loc.path + "\log_download_backup_files_" + $Date + ".txt"
Write-Host 'The log file path: ' $logfile -ForegroundColor Green
####### Function to write information to the log file #######
function log($string, $color){
if ($color -eq $null) {$color = "white"}
write-host $string -foregroundcolor $color
$temp = ": " + $string
$string = Get-Date -format "yyyy.MM.dd hh:mm:ss tt"
$string += $temp
$string | out-file -Filepath $logfile -append
}
# Flag to track whether a command failed
$cmdError = $false
try {
# IBM COS S3 Endpoint
$endpoint = "https://abc.abc.abc.com"
# Bucket Name
$bucket_name = "s3://" + $BucketName
# Setting AWS Configure
log "Setting AWS Configure"
aws configure set default.region ap-southeast-2 --profile test
aws configure set aws_access_key_id $AccessKey --profile test
aws configure set aws_secret_access_key $SecretKey --profile test
### Use the below logic to download files if a text file is provided ###
# Note: [String] parameters are never $null, so test for an empty string instead.
if(-not [string]::IsNullOrEmpty($InputPath)){
# Verifying the input path
log "Verifying the InputPath"
if(!(Test-Path $InputPath)){
Write-Host "Please specify a text file containing the list of files to download."
exit 1
}
$files_to_download = Get-Content $InputPath
$cmdError = $false
log "Downloading files from S3 to the specified output directory: "
log "====================================================================================="
cd $OutputPath
foreach($files_to_download_iterator in $files_to_download){
$source_file = $files_to_download_iterator.ToString()
$destination_file = $files_to_download_iterator.ToString()
$cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file --profile test"
aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file --profile test
$cmdError = -not $?
}
}
### Use the below logic to download if the file names are provided individually via the "$FileNameToDownload" argument ###
if(-not [string]::IsNullOrEmpty($FileNameToDownload)){
$files_to_download = $FileNameToDownload.split(",")
$cmdError = $false
log "Downloading files from S3 to the specified output directory: "
log "====================================================================================="
cd $OutputPath
foreach($files_to_download_iterator in $files_to_download){
$source_file = $files_to_download_iterator.ToString().Trim()
$destination_file = $source_file
$cmd = "aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file --profile test"
aws --endpoint-url $endpoint s3api get-object --bucket $BucketName --key $source_file $destination_file --profile test
$cmdError = -not $?
}
}
# Un-Setting AWS Configure
log "Un-Setting AWS Configure"
aws configure set aws_access_key_id ' ' --profile test
aws configure set aws_secret_access_key ' ' --profile test
}catch{
if($cmdError){
log "Command that failed was: "
log $cmd
}
}

 

Click here to download my PowerShell scripts for Free !!


PowerShell – Delete IBM SoftLayer COS S3 files older than X days

IBM SoftLayer uses many types of storage to store data. One of them is Cloud Object Storage (COS), which exposes an S3-compatible API. I have written a simple PowerShell script to delete IBM COS S3 files older than X days.

The script uses AWS CLI commands, so you will need the AWS CLI installed on the Windows machine from which you run it. The script can also be scheduled as a task to run every day.
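The core of such a cleanup is a date cutoff applied to the bucket listing. A minimal Python sketch of that filter, assuming a hypothetical listing of (file name, last-modified) pairs rather than real bucket output:

```python
from datetime import datetime, timedelta

# Sketch: keep only entries older than X days, given (name, last_modified) pairs.
def files_older_than(listing, days, now=None):
    """listing: iterable of (file_name, datetime) tuples; returns names past the cutoff."""
    now = now or datetime.utcnow()
    cutoff = now - timedelta(days=days)
    return [name for name, modified in listing if modified < cutoff]

now = datetime(2019, 1, 31)
listing = [
    ("old_full.bak", datetime(2019, 1, 1)),
    ("recent_diff.bak", datetime(2019, 1, 30)),
]
stale = files_older_than(listing, days=7, now=now)
```

The names returned by the filter are the ones the delete command would then be run against, one per object key.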

Delete IBM COS S3 files older than X days
