Thursday, November 1, 2018

Upgrade from 7.x to 8.+ series | Post 2 | Deploy Dev and Grab source DB

Introduction

[EDIT]: Changes since November 2018 have forced me to make some changes to this series. I will point out the changes.

In this series of posts, I will run you through the process of completing the upgrade from 7.x to 8.+ of Dynamics 365 for Finance and Operations.

Quick navigation:
Upgrade from 7.x to 8.+ series | Post 1 | Start in LCS
Upgrade from 7.x to 8.+ series | Post 2 | Deploy Dev and Grab source DB (you are here)
Upgrade from 7.x to 8.+ series | Post 3 | Validate Code and Data in Dev

Deploy 8.x environments


Choose the version closest to your target version.

You will need to deploy new environments. There is no "in-place" upgrade of your existing environments. The new environments will be on the version you are upgrading to. Fortunately, deploying new environments is easy to do through Lifecycle Services (LCS). You will need a decent Development VM to connect to the upgraded metadata, and check the code for any issues.

I typically go for a DS13v2, which has a local SSD. I normally give it 14 disks of 48 GB each, which will all be striped for maximum throughput. This has served me well so far. I don't choose premium storage, but go for standard storage. There are probably lots of preferences out there, and I'm more than willing to learn from the community what they recommend.
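By the way, if you want to verify the disk layout once the VM is up, the standard Windows Storage cmdlets will show you whether the data disks really were pooled and striped. Nothing D365FO-specific here, just a read-only sanity check:

# List any storage pools beyond the built-in Primordial one
Get-StoragePool | Where-Object { -not $_.IsPrimordial } |
    Select-Object FriendlyName, Size, AllocatedSize

# 'Simple' resiliency means the disks are striped with no redundancy
Get-VirtualDisk |
    Select-Object FriendlyName, ResiliencySettingName, NumberOfColumns, Size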

Make sure the VM is hosted on your own (or the customer's) Azure Subscription. This way you are guaranteed to get a local admin user. Also make sure the topology is Development. Pick an empty database, as you won't need the Contoso data for what we're about to do.

Prepare Database

[EDIT]: This section has been edited.

You can simply export the database through the LCS portal. LCS will create a bacpac of the sandbox database and save it in the project Asset Library. You should consider refreshing the sandbox with a fresh copy of the Production database first. All of these steps are now easily done directly in the LCS portal.


While the Development VM is deploying, here's another neat thing you can do, if you haven't already done so. Set up a cloud Storage Account in the Azure Subscription. It can be a cheap Standard Storage (general purpose v2) type, with only Local Redundancy, on the Cool tier - nothing fancy. Create yourself a blob container where you can put the database you will get from your source environment. If you haven't done this in the Azure Portal before, let this be your first time. One thing to consider: the Storage Account name must be globally unique across Azure. But you're a good citizen, and have always used a good naming practice, right?

You will need three things from this cloud storage (if you'd rather script the whole setup, see the sketch after this list):
  1. The Storage Account Name
  2. The name of the blob container
  3. The Access Key (which is found on the Azure Blade - look for the yellow key icon).
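If you prefer scripting over clicking through the portal, here's a minimal sketch using the Az PowerShell module. The resource group, account and container names below are just placeholders - swap in your own well-chosen names:

# Requires the Az module: Install-Module Az
$rgName = 'rg-d365-upgrade'           # placeholder
$saName = 'myd365upgradestorage'      # placeholder - globally unique, lowercase, 3-24 chars
$location = 'westeurope'

New-AzResourceGroup -Name $rgName -Location $location

$sa = New-AzStorageAccount -ResourceGroupName $rgName -Name $saName `
    -Location $location -SkuName Standard_LRS -Kind StorageV2 -AccessTier Cool

# The blob container that will hold the bacpac
New-AzStorageContainer -Name 'backups' -Context $sa.Context

# The access key you'll feed to d365fo.tools later
(Get-AzStorageAccountKey -ResourceGroupName $rgName -Name $saName)[0].Value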
When you have the storage account ready, I bet the deployment of the Development VM is still spinning, so let's prepare a backup of the source database. We will use it to validate the upgrade. This is just a test, to make sure the actual upgrade experience will be smooth.

Head over to your Sandbox (Tier2) AOS, and extract the database from there. If you want to test on a fresh copy from Production, do a Database Refresh in LCS first. But let's assume the one on the Sandbox is fresh enough.

At the time of the original post, while we were waiting on Microsoft to get the tooling in place in LCS, possibly the quickest and easiest way to get the database extracted was to use the community-driven d365fo.tools PowerShell library - and it still works well if you prefer scripting.

Install the library on the AOS server using the following command. You'll have to answer "Yes" and "Yes to all" to any prompts.

Install-Module d365fo.tools

Then run the following to extract the database. It basically prepares a bacpac where all the post-processing SQL is run, and saves it to the D: drive.

Import-Module d365fo.tools

# Read the database connection details straight off the AOS configuration
$dbsettings = Get-D365DatabaseAccess

$baseParams = @{
    DatabaseServer = $dbsettings.DbServer
    SqlUser = 'sqladmin'
    SqlPwd = 'SQLADMIN_PASSWORD_FROM_LCS'
    Verbose = $true
}
$params = $baseParams + @{
    # ExportModeTier2 copies the database and cleans out the
    # Tier2-specific objects before creating the bacpac
    ExportModeTier2 = $true
    DatabaseName = $dbsettings.Database
    NewDatabaseName = $($dbsettings.Database + '_adhoc')
    BacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
}

# Drop any leftover copy from a previous run, then create the bacpac
Remove-D365Database @baseParams -DatabaseName $($params.NewDatabaseName)
New-D365Bacpac @params

Then, using the cloud storage you've hopefully prepared, let's upload the bacpac to the cloud. We will later download it to the development VM.

Import-Module d365fo.tools

$params = @{
    # Values from the storage account you prepared earlier
    AccountID = 'STORAGE_ACCOUNT_NAME'
    AccessToken = 'LONG_AND_SECRET_ACCESS_KEY_FOUND_ON_THE_STORAGE_ACCOUNT_IN_THE_AZURE_PORTAL'
    Blobname = 'NAME_OF_THE_BLOB'
    FilePath = 'D:\Backup\sandbox_adhoc.bacpac'
    DeleteOnUpload = $false   # keep the local copy after upload
}

Invoke-D365AzureStorageUpload @params

The database extract, in the form of a bacpac, now awaits in the cloud storage. When the development VM is ready, you can use the same PowerShell library to download it and install it on your development VM.
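I'll go through that properly when we get to the dev VM, but roughly it looks like this (run on the development VM; the file and database names below are just examples matching the upload above):

Import-Module d365fo.tools

# Pull the bacpac down from the same blob container
$params = @{
    AccountId = 'STORAGE_ACCOUNT_NAME'
    AccessToken = 'LONG_AND_SECRET_ACCESS_KEY'
    Blobname = 'NAME_OF_THE_BLOB'
    FileName = 'sandbox_adhoc.bacpac'
    Path = 'D:\Backup'
}
Invoke-D365AzureStorageDownload @params

# Import it next to the existing AxDB, so you can swap between them
Import-D365Bacpac -ImportModeTier1 -BacpacFile 'D:\Backup\sandbox_adhoc.bacpac' -NewDatabaseName 'AxDB_Source'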

But first, you need to make sure the application actually builds. I will address that in the next post.

