
This may or may not be helpful in the long term, but since I’m doing it to be extra cautious, I figured I would blog about it. We migrated to Flex last week, and out of an abundance of caution, we’re putting the last Single Server backup into cold storage. You could also use this same process to offload backups from Flex if you’re going to delete a server and want to save a final backup, or if you have some other use case for keeping backups in storage longer term.

This is primarily a repeat of another post I did, but with the storage part added on.

Create a VM

You must create a VM with Postgres installed and a drive mounted. You can find those instructions here. Do everything under that heading and all its associated subheadings.
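For reference, here’s a minimal sketch of that setup on an Ubuntu VM. It assumes the data disk shows up as /dev/sdc (check lsblk first) and mounts it at /datadrive to match the paths used below; adjust the device name for your environment.

# Find the data disk (assumed here to be /dev/sdc -- verify with lsblk)
lsblk
# Partition and format the data disk
sudo parted /dev/sdc --script mklabel gpt mkpart xfspart xfs 0% 100%
sudo mkfs.xfs /dev/sdc1
sudo partprobe /dev/sdc
# Mount it at /datadrive
sudo mkdir /datadrive
sudo mount /dev/sdc1 /datadrive
# Install the Postgres client tools (pg_dump, pg_dumpall, pg_restore)
sudo apt-get update
sudo apt-get install -y postgresql-client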

Dump Postgres db and objects

Dump the pg bits: globals (roles), schema, and data. I’m keeping these in alignment with my previous scripts to make things easier on myself.

## GLOBALS (roles only)
export PGPASSWORD=yourpasswordhere
pg_dumpall -h sourcepgserver.postgres.database.azure.com -U username@sourcepgserver --roles-only -p 5432 > /yourfilepath/servername_roles_dump.sql

## SCHEMA DUMP (-s = schema only)
export PGPASSWORD=yourpassword
pg_dump -v -h yourserver.postgres.database.azure.com -U youradmin@yourserver -d yourdb -s -f /yourpath/dbname_schema_dump.sql

## DATA DUMP (-a = data only, -j 4 = four parallel jobs, -F d = directory format)
export PGPASSWORD=yourpassword
pg_dump -v -h yourserver.postgres.database.azure.com \
    -U youruser@yourserver \
    -d yourdb \
    -j 4 \
    -a \
    -F d \
    -f /datadrive/dbname_data_dump
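
It’s worth sanity-checking the directory-format dump before copying it anywhere; pg_restore can list its table of contents without restoring anything:

# List the dump's table of contents to confirm it's readable
pg_restore --list /datadrive/dbname_data_dump | head -n 20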

Copy into Azure Storage

This is the new part for me. You could try to sort out dumping directly to storage, but I feel more comfortable dumping on the VM and then copying up to storage.

Install azcopy

# Download AzCopy
wget https://aka.ms/downloadazcopy-v10-linux
# Extract the package
tar -xvf downloadazcopy-v10-linux
# Move AzCopy to a directory in your PATH
sudo cp ./azcopy_linux_amd64_*/azcopy /usr/bin/
# Set permissions
sudo chmod +x /usr/bin/azcopy
# Verify install 
azcopy --version

Upload files

You’ll need a SAS token, which you can generate in the Azure portal. Navigate to the container, and make sure you choose an appropriate expiration. Since I couldn’t upload all of mine today, I chose a week out.
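If you’d rather script it than click through the portal, the Azure CLI can generate a container SAS too. This is just a sketch: the storage account and container names are placeholders, and it assumes you’re logged in (az login) with rights to read the account keys.

# Generate a container SAS that expires in 7 days, with
# (r)ead, (a)dd, (c)reate, (w)rite, (l)ist permissions for the upload
az storage container generate-sas \
    --account-name yourstorageaccount \
    --name yourcontainer \
    --permissions racwl \
    --expiry "$(date -u -d '+7 days' '+%Y-%m-%dT%H:%MZ')" \
    --output tsv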


Once it generates, make sure to capture that URL; you won’t be able to see it again. If you lose it, you’ll have to generate another one (or generate a fresh one with the Azure CLI, as sketched above). Capture that URL, because you’ll put it into the code below. Note that the following command will create a folder [dbfoldername] in the container, so you don’t need to specify that folder in the storage account URL unless you want another folder.

azcopy copy "/datadrive/dbfoldername" "https://[storage-account-name].blob.core.windows.net/[container-name]/[destination-folder]?[SAS-token]" --recursive

I checked the portal to make sure it was in the container. You could also query it via PowerShell or the Azure CLI (a quick CLI check is sketched below), but this is a one-off, and I only needed to see that it’s there. And it is.
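
If you do want to script the check, something like this should list what landed in the container. The names are placeholders, and it assumes you’re logged in with access to the storage account.

# List blobs under the uploaded folder to confirm the copy landed
az storage blob list \
    --account-name yourstorageaccount \
    --container-name yourcontainer \
    --prefix dbfoldername/ \
    --output table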
