In this post, I’d like to present two simple PowerShell scripts that let you create a secure (password-protected archive) backup of MSSQL or MongoDB databases and upload it to either Azure or AWS cloud storage. Such a solution may save you a few bucks compared to external, paid services like CherrySafe that do the same thing.
The prerequisites (everything the scripts below invoke):
- 7-Zip installed under Program Files (the scripts call 7z.exe),
- AWS Tools for PowerShell (for Write-S3Object),
- the Azure PowerShell module (for New-AzureStorageContext and Set-AzureStorageBlobContent),
- mongodump available on the machine (for the MongoDB backup),
- the SQL Server Data-Tier Application Framework (DAC) (for the MSSQL backup).
Of course, it’s up to you whether you prefer Azure, AWS, or even storing the backup on your local drive – the scripts presented below can handle any of these scenarios. On top of that, the backup data is put into a password-protected archive before being sent to cloud storage.
Let’s start with the MongoDB backup:
```powershell
# MongoDB connection settings
$database  = "MongoDB Database name"
$username  = "MongoDB Database username"
$password  = "MongoDB Database password"
$mongoHost = "MongoDB host"  # note: $host is a reserved automatic variable in PowerShell, so a different name is required

# Local backup paths
$backupPath    = 'C:\Backup\MongoDB\'
$timestamp     = Get-Date -Format yyyyMMddHHmmss
$directoryName = "$database-$timestamp"
$directoryPath = Join-Path $backupPath $directoryName

# Archive settings
$archiveType     = "7z"
$archivePassword = "Password to the archive"
$fileName        = "$directoryName.$archiveType"
$filePath        = Join-Path $backupPath $fileName

# AWS S3 settings
$s3Bucket  = 'S3 Bucket name'
$region    = 'S3 Bucket region'
$accessKey = 'S3 access key'
$secretKey = 'S3 secret key'

# Azure Storage settings
$storageAccountName = "Azure storage account name"
$storageAccountKey  = "Azure storage account key"
$storageContext     = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$containerName      = "Azure storage container name"

function Create-7Zip([string] $aDirectory, [string] $aZipFile) {
    [string]$pathToZipExe = "$($Env:ProgramFiles)\7-Zip\7z.exe"
    # -t7z matches the .7z extension and gives AES-256 encryption;
    # the original -tzip would have produced a zip archive with a .7z name
    [array]$arguments = "a", "-t7z", "$aZipFile", "$aDirectory", "-r", "-p$archivePassword"
    & $pathToZipExe $arguments
}

$Watch = New-Object System.Diagnostics.Stopwatch
$Watch.Start()

Write-Host "Backing up '$fileName' to local directory: $backupPath."
mongodump -h $mongoHost -d $database -u $username -p $password -o "$directoryPath"

Write-Host "Creating archive $fileName"
Create-7Zip $directoryPath $filePath

Write-Host "Uploading '$fileName' to Amazon S3 storage."
Write-S3Object -BucketName $s3Bucket -File $filePath -Key $fileName -Region $region -AccessKey $accessKey -SecretKey $secretKey

Write-Host "Uploading '$fileName' to Azure Storage."
Set-AzureStorageBlobContent -File $filePath -Container $containerName -Blob $fileName -Context $storageContext

# Clean up the raw dump; only the encrypted archive stays on disk
Remove-Item $directoryPath -Recurse -Force

$Watch.Stop()
Write-Host "MongoDB backup completed in $($Watch.Elapsed.ToString())"
```
At the beginning you can see a bunch of variables that have to be set up front.
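Note that these variables include plain-text credentials. A safer variant is to pull the secrets from environment variables so they never live in the script file; here is a minimal sketch, where the variable names (MONGO_PASSWORD and so on) are made-up examples rather than anything the original script defines:

```powershell
# Sketch: read secrets from environment variables instead of hardcoding them.
# The names MONGO_PASSWORD, ARCHIVE_PASSWORD and AWS_SECRET_KEY are examples.
$password        = $env:MONGO_PASSWORD
$archivePassword = $env:ARCHIVE_PASSWORD
$secretKey       = $env:AWS_SECRET_KEY

if (-not $password -or -not $archivePassword -or -not $secretKey) {
    Write-Error "One of the required environment variables is not set."
    exit 1
}
```

Environment variables for a scheduled task can be set machine-wide, or the secrets could come from a separate, tightly-permissioned file.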
Then mongodump fires up, the archive is created, and finally the backup is sent to S3 and Azure Storage respectively. Before the script finishes, it removes the temporary dump directory from the local drive by calling Remove-Item.
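One thing the script does not do is delete old archives, so the local backup folder grows with every run. A hypothetical addition at the end of the script could prune them; the 14-day retention window below is just an example:

```powershell
# Sketch: prune local archives older than 14 days so the backup folder
# does not grow without bound (the 14-day window is an arbitrary example).
$backupPath     = 'C:\Backup\MongoDB\'
$retentionLimit = (Get-Date).AddDays(-14)

Get-ChildItem -Path $backupPath -Filter '*.7z' |
    Where-Object { $_.LastWriteTime -lt $retentionLimit } |
    Remove-Item -Force
```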
And here’s the MSSQL backup script:
```powershell
# MSSQL connection settings
$connectionString      = "MSSQL connection string"
$database              = "MSSQL database name"
$backupPath            = 'C:\Backup\MSSQL\'
$SqlInstallationFolder = "C:\Program Files (x86)\Microsoft SQL Server"

# Local backup paths
$timestamp     = Get-Date -Format yyyyMMddHHmmss
$sqlBackupType = "bacpac"
$fileName      = "$database-$timestamp"
$filePath      = Join-Path $backupPath "$fileName.$sqlBackupType"

# Archive settings
$archiveType     = "7z"
$archivePassword = "Password to the archive"
$fileArchive     = "$fileName.$archiveType"
$fileArchivePath = Join-Path $backupPath $fileArchive

# AWS S3 settings
$s3Bucket  = 'S3 Bucket name'
$region    = 'S3 Bucket region'
$accessKey = 'S3 access key'
$secretKey = 'S3 secret key'

# Azure Storage settings
$storageAccountName = "Azure storage account name"
$storageAccountKey  = "Azure storage account key"
$storageContext     = New-AzureStorageContext -StorageAccountName $storageAccountName -StorageAccountKey $storageAccountKey
$containerName      = "Azure storage container name"

function Create-7Zip([string] $aDirectory, [string] $aZipFile) {
    [string]$pathToZipExe = "$($Env:ProgramFiles)\7-Zip\7z.exe"
    [array]$arguments = "a", "-t7z", "$aZipFile", "$aDirectory", "-r", "-p$archivePassword"
    & $pathToZipExe $arguments
}

# Load the DAC services assembly ("110" corresponds to SQL Server 2012)
$DacAssembly = "$SqlInstallationFolder\110\DAC\bin\Microsoft.SqlServer.Dac.dll"
Add-Type -Path $DacAssembly
$Services = New-Object Microsoft.SqlServer.Dac.DacServices $connectionString
if ($null -eq $Services) {
    Write-Host "Failed to load SQL Server Dac Services"
    exit
}

$Watch = New-Object System.Diagnostics.Stopwatch
$Watch.Start()

Write-Host "Backing up Azure SQL database '$database' to local drive."
$Services.ExportBacpac($filePath, $database)

Write-Host "Creating archive $fileArchive"
Create-7Zip $filePath $fileArchivePath

Write-Host "Uploading '$fileArchive' to Amazon S3 storage."
Write-S3Object -BucketName $s3Bucket -File $fileArchivePath -Key $fileArchive -Region $region -AccessKey $accessKey -SecretKey $secretKey

Write-Host "Uploading '$fileArchive' to Azure Storage."
# The original uploaded the raw .bacpac here; uploading the encrypted archive
# matches the S3 upload above
Set-AzureStorageBlobContent -File $fileArchivePath -Container $containerName -Blob $fileArchive -Context $storageContext

# Clean up the raw .bacpac; only the encrypted archive stays on disk
Remove-Item $filePath -Force

$Watch.Stop()
Write-Host "Azure SQL backup completed in $($Watch.Elapsed.ToString())"
```
Pretty similar to the previous one, except that it uses the SQL Server DAC services to export the database to a .bacpac file instead of running mongodump.
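ExportBacpac can take a while on bigger databases and the script is silent during that time. If you want progress output, the DacServices class exposes a Message event that can be hooked up; a sketch, assuming $Services was created as in the script above:

```powershell
# Sketch: echo DacServices progress messages while the export runs.
# Assumes $Services is the Microsoft.SqlServer.Dac.DacServices instance
# created earlier in the script.
Register-ObjectEvent -InputObject $Services -EventName Message -Action {
    Write-Host $EventArgs.Message.ToString()
} | Out-Null
```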
I use such scripts in some of my projects – for example in Warden, where I need the MongoDB backup to save the Web Panel database.
To make it run automatically, schedule a new task using the built-in Task Scheduler and select a new action of type Start a program. In the Program/script field, set C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe, and pass the path to the backup script in the Add arguments textbox, e.g. C:\Tasks\mongodb-backup.ps1.
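The same task can be registered from PowerShell instead of clicking through the UI, using the ScheduledTasks cmdlets available on Windows 8 / Server 2012 and later. A sketch; the task name, the 3 AM trigger, and the script path are examples matching the one above:

```powershell
# Sketch: register the nightly MongoDB backup as a scheduled task.
# Task name, trigger time, and script path are examples - adjust as needed.
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument "-NoProfile -ExecutionPolicy Bypass -File C:\Tasks\mongodb-backup.ps1"
$trigger = New-ScheduledTaskTrigger -Daily -At 3am

Register-ScheduledTask -TaskName "MongoDbBackup" -Action $action -Trigger $trigger
```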
Surely, once a database grows to tens of gigabytes, other backup strategies will probably be more efficient, but until then the two scripts presented above should do the trick.