Set Up the Storage Account
You will (obviously) need an Azure Subscription for this to work. All of the steps below can be completed using a PowerShell script, so the advanced users will probably write that up. But here I will show how you can easily get this done with some clicking around. Even manually, you can set this all up in a matter of minutes.

Start by opening the Azure Portal and open "Storage Accounts". You will create yourself a new one.
You will need to choose a Resource Group, or create a new one. I typically have a Resource Group I put "DynOps" stuff in, like this Storage Account.
I want to make this a cheap account, so I tweak the settings to save money. I opt for Local Redundancy (LRS) only and the Cool access tier. Perhaps the most important setting is the Region: you will want to choose the same region as the VMs you are using. You get better performance and save a little money (not much, but still).
Oh, and also worth mentioning: the account name must be globally unique. There are a few naming rules for this, but simply put, you will probably prefix it with some company name abbreviation. If you accidentally pick a name that is already taken, the form won't let you submit.
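For the script-minded, here is a minimal sketch of the same steps, assuming you have the Az PowerShell module and using placeholder names for the resource group, account name and region:

# Check that the account name is still available (it must be globally unique)
Get-AzStorageAccountNameAvailability -Name 'uniqueaccountname'

# Create a cheap storage account: locally redundant, Cool access tier,
# placed in the same region as your VMs
New-AzStorageAccount -ResourceGroupName 'DynOps' `
    -Name 'uniqueaccountname' `
    -Location 'westeurope' `
    -SkuName 'Standard_LRS' `
    -Kind 'StorageV2' `
    -AccessTier 'Cool'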
It only takes a few minutes for Azure to spin up the new account, so sit back, relax and take a sip of that cold coffee you forgot to enjoy while it was still warm.
The next thing you'll do is open the newly created Storage Account, and then scroll down on the "things you can do with it" and locate "Blobs". You will create yourself a new blob, give it a name, like for example "backups" or just "blob". Take note of the name, as you will need it later.
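If you prefer scripting this step as well, a minimal sketch with the Az module (same placeholder names as above):

# Grab the storage account context and create a private container in it
$ctx = (Get-AzStorageAccount -ResourceGroupName 'DynOps' -Name 'uniqueaccountname').Context
New-AzStorageContainer -Name 'backups' -Context $ctx -Permission Off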
Then you will want to get the Access key. The Access key needs to be kept as secret as possible, since it basically grants access to everything you put into this Storage Account. If you later worry that the key has been compromised, you can regenerate it, but then your own routines will have to be updated as well. There are other ways to secure usage of the Storage Account, but for the sake of simplicity I am using the Access key in this example.
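Getting (or regenerating) the key can also be scripted; again a small sketch with the Az module and placeholder names:

# List the two access keys for the account
Get-AzStorageAccountKey -ResourceGroupName 'DynOps' -Name 'uniqueaccountname'

# Regenerate key1 if you suspect it has been compromised
New-AzStorageAccountKey -ResourceGroupName 'DynOps' -Name 'uniqueaccountname' -KeyName 'key1'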
And now you are set. That entire thing literally takes just a few minutes, if the Azure Portal behaves and you didn't mess anything up.
Using the Storage Account
I've become an avid user of the PowerShell library D365FO.tools, so for the next examples I will be using it. It is super easy to install and set up, as long as the VM has an Internet connection. I'm sure you will love it too. If you don't have it yet, installing it from the PowerShell Gallery is a one-liner:
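# Install d365fo.tools from the PowerShell Gallery (requires an Internet connection)
Install-Module -Name d365fo.tools

Assuming it is installed, I will first run a command to save the cloud Storage Account information on the machine (using the popular PSFramework). This command will actually save the information in the Registry.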
# Fill in your own values
$params = @{
    Name        = 'Default'           # Just a name, because you can add multiple configurations and switch between them
    AccountId   = 'uniqueaccountname' # Name of the storage account in Azure
    Blobname    = 'backups'           # Name of the blob container on the Storage Account
    AccessToken = 'long_secret_token' # The Access key
}

# Create the storage configuration locally on the machine
Add-D365AzureStorageConfig @params -ConfigStorageLocation System -Verbose
Now let's assume you ran the command below to extract a bacpac of your sandbox Tier2 environment.
Import-Module d365fo.tools

$dbsettings = Get-D365DatabaseAccess

$baseParams = @{
    DatabaseServer = $dbsettings.DbServer
    SqlUser        = 'sqladmin'
    SqlPwd         = 'SQLADMIN_PASSWORD_FROM_LCS'
    Verbose        = $true
}
$params = $baseParams + @{
    ExportModeTier2 = $true
    DatabaseName    = $dbsettings.Database
    NewDatabaseName = $($dbsettings.Database + '_adhoc')
    BacpacFile      = 'D:\Backup\sandbox_adhoc.bacpac'
}

# Clean up any leftover copy from a previous run, then export the bacpac
Remove-D365Database @baseParams -DatabaseName $($params.NewDatabaseName)
New-D365Bacpac @params
You now want to upload the bacpac (database backup) file to the blob container in your cloud Storage Account using the following PowerShell script.
# Make the 'Default' configuration the active one, then upload the bacpac
Set-D365ActiveAzureStorageConfig -Name 'Default'

$StorageParams = Get-D365ActiveAzureStorageConfig
Invoke-D365AzureStorageUpload @StorageParams -Filepath 'D:\Backup\sandbox_adhoc.bacpac' -DeleteOnUpload
The next thing you do is jump over to the VM (Tier1, onebox) where you want to download the bacpac. Obviously, D365FO.tools must be installed there as well. Assuming it is, and assuming you've also run the command above to save the cloud Storage Account information on that machine, you can run the following PowerShell script to download it.
# Activate the same configuration on this machine, then download the bacpac
Set-D365ActiveAzureStorageConfig -Name 'Default'

$StorageParams = Get-D365ActiveAzureStorageConfig
Invoke-D365AzureStorageDownload @StorageParams -Path 'D:\Backup' -FileName 'sandbox_adhoc.bacpac'
Finally, you would run something like this to import the bacpac to the target VM.
Import-Module d365fo.tools

$bacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
$sourceDatabaseName = "AxDB_Source_$(Get-Date -UFormat "%y%m%d%H%M")"

# Remove any old temp source DB
Remove-D365Database -DatabaseName $sourceDatabaseName -Verbose

# Import the bacpac to local SQL Server
Import-D365Bacpac -ImportModeTier1 -BacpacFile $bacpacFile -NewDatabaseName $sourceDatabaseName -Verbose

# Remove any old AxDB backup (if exists)
Remove-D365Database -DatabaseName 'AxDB_original' -Verbose

# Stop local environment components
Stop-D365Environment -All

# Switch AxDB with source DB
Switch-D365ActiveDatabase -DatabaseName 'AxDB' -NewDatabaseName $sourceDatabaseName -Verbose
Start-D365Environment -All
Isn't that neat? Now you have a way to copy the database around, while we're waiting for this to be completely supported out of the box in LCS - fingers crossed!