Download files from Amazon s3 to local folder using Powershell


As part of the services we provide our clients around Microsoft SQL Server, we also offer a service that uploads database backup files to Amazon S3 or Azure Blob Storage (this applies to on-premises servers). This option is really great and gives you lots of confidence, letting you sleep better at night. It has already proven very useful in several cases, such as a ransomware attack and regular cases of data corruption. The service constantly looks for backup files not yet uploaded and uploads them to the cloud, so no matter what happens the data is protected outside the data center.

For SQL Server instances located in Azure you should back up directly to an Azure Blob destination; the same applies to EC2 instances in AWS, where you should direct the backup destination to S3.

When the service uploads files it creates a folder based on the current date, i.e. 20220718 for July 18th, 2022. This makes it possible to narrow the search later on, when the day comes and we are looking for files to download.
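For reference, that folder name matches the yyyyMMdd date format, so the folder for the current day can be computed like this (a minimal sketch; the variable name is my own):

```powershell
# Build today's folder name in the same yyyyMMdd format the upload service uses
[string]$folder_date = Get-Date -Format 'yyyyMMdd';
Write-Host $folder_date;  # e.g. 20220718
```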

If you need to download a single file then using the browser is good enough: you can simply select the desired file and hit the download button. However, when multiple files are selected the download button becomes inactive by design, making the task of downloading multiple files a pain.

Below is a PowerShell script that I use for this task. You will need to add your secret key and access key in order to authenticate to Amazon S3. The $folder_date and $database variables are used to construct the $key_prefix variable that is passed to Get-S3Object in order to access the specific “sub folder” at S3.

The script uses Get-S3Object to retrieve the list of files from the given “sub folder” and then uses Read-S3Object to iterate through the list, downloading the files one by one, single threaded.

[string]$secret_key = ''; # <----- Edit here
[string]$access_key = ''; # <----- Edit here

# The credentials must be assigned before they are used below.
# Get-S3Bucket lists the buckets in the account; this assumes a single bucket.
[System.Object]$b = Get-S3Bucket -AccessKey $access_key -SecretKey $secret_key;
[string]$bucket = $b.BucketName;
[string]$region = (Get-S3BucketLocation -BucketName $bucket -AccessKey $access_key -SecretKey $secret_key).Value;


[string]$folder_date = '20220709' # <----- Edit here
[string]$database = 'my_db'       # <----- Edit here


[string]$local_folder = 'B:\FromS3\' + $database + '\' + $folder_date;
if (-not (Test-Path -Path $local_folder)) { New-Item -Path $local_folder -ItemType Directory; } 


[string]$key_prefix = $folder_date + '/' + $database;


[System.Array]$files = Get-S3Object -BucketName $bucket -KeyPrefix $key_prefix -AccessKey $access_key -SecretKey $secret_key -Region $region;
foreach ($file in $files)
{
    $file_name = $file.Key.Split('/')[-1]; # the last segment of the key is the file name
    $full_file_name = Join-Path $local_folder $file_name;
    
    
    Write-Host $full_file_name 

    # Download transaction log backup (.trn) files only
    if ($file_name -like '*.trn*')
    {
        Write-Host $full_file_name -ForegroundColor Green;
        Read-S3Object -BucketName $bucket -Key $file.Key -File $full_file_name -SecretKey $secret_key -AccessKey $access_key;
    }

}
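As a side note, Read-S3Object can also download every object under a key prefix in a single call via its -KeyPrefix and -Folder parameters. If you do not need the per-file .trn filter from the loop above, the whole loop can be replaced with one command:

```powershell
# Download everything under the prefix into the local folder in one call
Read-S3Object -BucketName $bucket -KeyPrefix $key_prefix -Folder $local_folder `
    -AccessKey $access_key -SecretKey $secret_key -Region $region;
```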
Yaniv Etrogi is an SQL Server consultant. He loves SQL Server and he is passionate about Performance Tuning, Automation and Monitoring.
