<h1>Yet Another Dynamics AX Blog</h1>
I have been working with Dynamics AX, now Dynamics 365, since 2009 and I am excited to work with such an amazing platform for building business solutions.<br />
<br />
<h1>Connecting Azure DevOps with Lifecycle Services for Release pipelines</h1>
A week ago, <a href="https://community.dynamics.com/365/financeandoperations/b/newdynamicsax/archive/2019/01/18/first-azure-devops-task-released" target="_blank">Microsoft announced the release of a new Azure DevOps task</a>, followed by <a href="https://daxmusings.codecrib.com/2019/01/azure-devops-release-pipeline.html" target="_blank">more details on the setup from the announcement's author, Joris de Gruyter</a>. And with the help of <a href="http://dynamicsnavax.blogspot.com/2019/01/azure-devops-release-pipelinewalkthrough.html" target="_blank">this post by Business Application MVP, Munib Ahmed</a>, I sat down and ran through the setup a couple of days ago.<br />
<br />
I wanted to briefly add some additional thoughts as well, some considerations of my own, while we wait for the official documentation.<br />
<br />
This new feature is very quick and easy to set up, and is something everyone should adopt sooner rather than later. It shaves off the time spent downloading the complete build artifact somewhere and then uploading it to the Dynamics Lifecycle Services Project Asset Library. After a successful build of the full application, the package is automatically "released" and uploaded to the asset library.<br />
<br />
We expect more "tasks" to be added, allowing us to setup a release pipeline that also let us automatically install a successful build on a designated target environment. So getting this setup now, and have the connection working, will set the stage for adding the next features as they are announced.<br />
<br />
Here are some of the requirements and considerations while you set this up:<br />
<br />
<ul>
<li>You need to register an Azure Application of type Native (public, multi-tenant). While it is said you can use the preview experience to register the app in the Azure Portal, I had to use the "non-preview" experience to ensure I got a correct Native Azure app registration, and not a Web app. You can add the necessary permission (user_impersonation) yourself, but the admin consent must be granted for the permission to work. If you are setting it up and you are not a <a href="https://docs.microsoft.com/en-us/azure/active-directory/users-groups-roles/directory-assign-admin-roles#available-roles" target="_blank">Global Admin or Application Admin</a>, you will need to get someone else with the necessary permissions to run the admin consent part. </li>
<li>The connection also requires user credentials as part of the setup. This should not be just any user, if you think about it. You don't want the connection to break just because the password was changed or the user was disabled. Also, multi-factor (or two-factor) authentication will not work here. So you might want to create a dedicated user for this connection. The user does not need any licenses, just a normal cloud user you have set up and logged on with at least once. The user also needs to be added to the Lifecycle Services project with at least Team member permissions (access to upload to the Asset Library). Log on to LCS with the user once and verify access.</li>
<li>When you create the release pipeline for the first time, you will need to install the <a href="https://marketplace.visualstudio.com/items?itemName=Dyn365FinOps.dynamics365-finops-tools" target="_blank">Azure DevOps task extension</a>. Search for "Dynamics" and you should find the "Dynamics 365 Unified Operations Tools". If you are doing the setup with a user that has access to many different Azure DevOps organizations (i.e. you're the partner developer doing this across multiple customers), you will need to make sure you install it on the correct Azure DevOps organization. When it is installed, you will have to refresh the task selection to see the new available task(s), including the new task named "Dynamics Lifecycle Services (LCS) Asset Upload". </li>
<li>When configuring the task, you will want to use variables to replace parts of the strings, like the name of the asset, the description, and so on. When you run a release, one of the steps actually lists out all the available variables, though with a slightly different syntax. <a href="https://docs.microsoft.com/en-us/azure/devops/pipelines/release/variables?view=azdevops&tabs=batch#view-the-current-values-of-all-variables" target="_blank">Have a look at the list of available variables in this article, plus the tip on how to see the values they are populated with during a run</a>. A small illustration follows right after this list.</li>
</ul>
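As a small illustration of the variable syntax (the field values below are made up; Build.BuildNumber and Release.ReleaseName are standard predefined variables), the name and description fields of the upload task could be filled in along these lines:<br />
<pre>LCS Asset Name:        $(Release.ReleaseName) - $(Build.BuildNumber)
LCS Asset Description: Uploaded automatically from Azure DevOps release $(Release.ReleaseName)
</pre>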
<div>
If you already have a successful build ready, go ahead and set up the release pipeline and run it once to see it succeed - or fail. If it fails while trying to connect, it could be one of the following errors:</div>
<div>
<ul>
<li>AADSTS65001: The user or administrator has not consented to use the application with ID '***' named 'NAME_OF_APP'. Send an interactive authorization request for this user and resource.<br />Here you have not successfully run the admin consent. Someone with proper permissions needs to run the "Grant permissions" in the Azure Portal.</li>
<li>AADSTS50076: Due to a configuration change made by your administrator, or because you moved to a new location, you must use multi-factor authentication to access.<br />This is most likely because the user credentials used for the connection are secured with multi-factor authentication. Either use a different account without MFA, or disable it. Most likely it is on for the account for a reason. </li>
</ul>
<div>
I would strongly encourage everyone to set this up, and feel free to <a href="https://community.dynamics.com/365/financeandoperations/f/765" target="_blank">use the community forums</a>, <a href="https://twitter.com/skaue" target="_blank">Twitter</a>, or ask in the comments section if you run into issues.</div>
</div>
<h1>Important changes to how we will update production going forward</h1>
Not long ago, <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/deployment/infrastructure-stack" target="_blank">some updates were published on docs about how we will be doing self-servicing of our non-production environments</a>. There is one particular statement I want to address with this post:<br />
<blockquote class="tr_bq">
<i>You will need to provide a combined deployable package for customizations. That is, all custom extension packages, including ISV packages, must be deployed as a single software deployable package. You will not be able to deploy one module at a time. This was always a recommended best practice <b><u>and is now enforced</u></b>.</i></blockquote>
<div>
It has been recommended for a long while to always ensure that any changes to the application, whether updates or new modules, are serviced first to the sandbox and then to production in one single package. By that I mean the same single package containing everything that is not already in standard. This would include:</div>
<div>
<ul>
<li>ISV solutions and deliveries</li>
<li>Customer specific customizations (ie from your partner)</li>
<li>Hotfixes (relevant only for version 8.0 and older, which support individual hotfixes)</li>
</ul>
<div>
It may seem counter-intuitive and unnecessary to do it like this. You may argue we should be able to push individual modules and updates to the sandbox and to production. Why enforce "single package"? What are the benefits of this?</div>
<div>
<br /></div>
<div>
The core reason is that we are moving away from updating the application; instead, we are replacing the application. The application becomes an immutable piece of software that does not change and is not susceptible to change. </div>
</div>
<div>
<br /></div>
<div>
This means the "updated" version of the application instead is setup and runs on a new instance, replacing the previous version of the application. The new pattern will sustain and ensure the following expectations:</div>
<div>
<br /></div>
<div>
<b>Predictability </b>- One single package having all the code and metadata, and all the module dependencies secured, ensures the same behavior every time it is used in a deployment.</div>
<div>
<br /></div>
<div>
<b>Repeatability </b>- Repeating the same package deployment on the next environment should have little to no risk, for example when repeating the install done in sandbox when installing in production.</div>
<div>
<br /></div>
<div>
<b>Recovery and Rollback</b> - The single package is a last known good state, which we can roll back to in case we need to recover from a deployment failure. </div>
<div>
<br /></div>
<div>
<b>Scale-out</b> - Reusing the same single package lets us easily and safely repeat deployment on new instances, allowing for a safe and easy way to scale out.</div>
<div>
<br /></div>
<div>
<b>Portability </b>- The same package can be safely used if you, for whatever reason, need to relocate your installation somewhere else around the world.</div>
<div>
<br /></div>
<div>
Think about all the benefits of this. It is music to my ears. </div>
<div>
<br /></div>
<div>
In theory, today, you could service your sandbox and production with individual, isolated packages. You could handle the order in which you install the packages, to ensure you don't break dependencies. You could install in a test environment to test updated versions of packages, in order to see if they break something in conjunction with any of the other existing packages. It is possible - but it is not recommended. Instead, it is recommended to use the build and release procedures to create a single deployable package containing all the changes, across all modules and packages.</div>
<div>
<br /></div>
<div>
Reading the announcement of the new immutable pattern for self-servicing environments is great news. It may change the way you've published updates in the past, if you used to push individual "delta" packages. If you've published single deployable packages from your build environment, you're following best practices, and the enforcement will not change how you've done things. </div>
<div>
<br /></div>
<div>
What do you think about this? Share your comments and thoughts either in the comments <a href="https://twitter.com/skaue" target="_blank">or on Twitter</a>.</div>
<h1>Controlling the Visual Studio workspaces for your Dyn365FO developers</h1>
<h2>
Introduction</h2>
At the time of writing, doing development for Microsoft Dynamics 365 for Finance and Operations (FnO) requires a dedicated development machine. This machine is pre-configured with a Visual Studio extension that allows for FnO development. One important and perhaps peculiar fact about these environments is the fixed disk location where you create, edit, save and build your modifications. The folder is most often referred to as the "<b>Package Local Directory</b>", but I will use the acronym <b>PLD </b>in the rest of this article. This is the folder containing the packages/modules, and you may have them either with or without source code. Typically, vendor solutions are shipped only in their binary form, meaning you do not have the "design time" metadata, but only the "run time" payload. As for Microsoft's own packages/modules, you will typically have both the metadata and the binaries, allowing you to view and step through code while developing and debugging. And of course, your customizations (assuming you are a developer) will have their metadata, and after your first local build, their binary counterpart (plus other artifacts, depending on what you're creating).<br />
<br />
<h2>
Workspace</h2>
This post is about <a href="https://docs.microsoft.com/en-us/azure/devops/repos/tfvc/create-work-workspaces" target="_blank">the "workspaces" in Visual Studio</a>.<br />
<br />
So what is the deal with the "workspace"? Well, it is a necessity when you want to develop. It is basically what defines the paths to your local copy of the code you are sharing with the rest of the team. And for FnO there is one path that is fixed, out of the box, and that is the <b>PLD</b>. You are free to setup other folders, and share them with the team, but the workspace needs to have at least one entry that refers to the <b>PLD</b>. All customizations you want to share with the team, and share with the BUILD machine, needs to be in the <b>PLD</b>, and added to source control and committed to the central code repository.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-hsuUSFwkilQ/XAv5jBE8-QI/AAAAAAACNjM/6ri2mRpdqwEBNgzsw_VpAevlwPk6lrgLgCLcBGAs/s1600/Workspaces.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1172" data-original-width="1496" height="500" src="https://4.bp.blogspot.com/-hsuUSFwkilQ/XAv5jBE8-QI/AAAAAAACNjM/6ri2mRpdqwEBNgzsw_VpAevlwPk6lrgLgCLcBGAs/s640/Workspaces.png" width="640" /></a></div>
<br />
<br />
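For illustration, the same kind of mapping can be created or inspected from the command line with the tf client. This is just a sketch - the server path, workspace name and organization URL are made-up examples, and the local PLD path may differ on your VM:<br />
<br />
<pre class="brush:powershell;"># Map the Metadata folder in source control to the Package Local Directory (example values)
tf workfold /map "$/MyProject/Trunk/Main/Metadata" "K:\AosService\PackagesLocalDirectory" /workspace:MyDevBox-Workspace /collection:https://dev.azure.com/myorganization
</pre>
<br />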
Ok, so that is all fine. Any other considerations?<br />
<br />
Yes! The developer who creates the workspace actually ends up being the "owner" of the workspace, on that machine. So if another developer connects to the same machine and wants to develop, using their own user and credentials, the second developer needs to be able to work against a workspace pointing to the <b>PLD</b>. Otherwise, the second developer is blocked from doing development.<br />
<br />
So what's the problem? Well, by default, a newly created workspace is private, and only the owner of the workspace can use it. To make things worse, any other user who wants to create their own workspace on the same machine will not be able to point it to the <b>PLD</b>. Only one workspace can point to a given folder on that machine at a time, and the <b>PLD </b>is such a fixed single folder (at the time of writing).<br />
<br />
There is a solution, though. The initial owner needs to change the workspace from "<b>Private</b>" to "<b>Public</b>", allowing any other developer connecting to the same machine to reuse the initial workspace.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-YMgqbafPhZo/XAwCLMkHWlI/AAAAAAACNjg/9FIjfHa0QAsVrQAFjbBtNeUJxeQH9RpjACLcBGAs/s1600/workspace_settings.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="526" data-original-width="798" height="419" src="https://3.bp.blogspot.com/-YMgqbafPhZo/XAwCLMkHWlI/AAAAAAACNjg/9FIjfHa0QAsVrQAFjbBtNeUJxeQH9RpjACLcBGAs/s640/workspace_settings.png" width="640" /></a></div>
<br />
<br />
This is a simple solution where the same development machine is shared between developers. It is also smart if for any reason a developer has pending changes on that machine, then takes a few weeks of holiday, and another developer needs to connect and commit them. Yea, that can happen.<br />
<br />
<h2>
Administer the Workspaces</h2>
Ok, so what if the developers create the workspaces themselves, set them up as Private, forget about them, and then someone else has to reuse them? Or what if you simply want to go through and review the workspaces that have been created?<br />
<br />
Well, the workspaces and the information about them are also stored centrally, and <a href="https://docs.microsoft.com/en-us/previous-versions/y901w7se(v=vs.80)#security" target="_blank">someone with the "AdminWorkspaces" privilege can change them</a> (a permission granted by default to the Azure DevOps (VSTS) Organization security group called "Project Collection Administrators").<br />
<br />
So in this post I will show how you can do this. There are several articles and posts out there discussing this, but it's always nice to share this in the context of Dyn365FO development, in my opinion.<br />
<br />
If you have the necessary permissions, you can run the "Developer Command Prompt for VS2015" available on one of your development VMs. I assume here that you have run Visual Studio at least once and connected to the Azure DevOps (VSTS) organization you are working towards.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-l36M79vaWTQ/XAwBZTmBNjI/AAAAAAACNjY/joUWt1by-RIzMr7wfnTbnWpoSmCpBkU0QCLcBGAs/s1600/VS2015_DevCLI.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="177" data-original-width="359" height="195" src="https://3.bp.blogspot.com/-l36M79vaWTQ/XAwBZTmBNjI/AAAAAAACNjY/joUWt1by-RIzMr7wfnTbnWpoSmCpBkU0QCLcBGAs/s400/VS2015_DevCLI.png" width="400" /></a></div>
<br />
<br />
If you run the following command, it will list all the workspaces created.<br />
<span style="font-family: "courier new" , "courier" , monospace;">tf workspaces /owner:*</span><br />
<br />
You will see a list of workspaces by the name, owner and machine. The next thing you can do is edit one of the workspaces by running the following command:<br />
<span style="font-family: "courier new" , "courier" , monospace;">tf workspace WORKSPACENAME;OWNER </span><br />
<br />
When referring to the owner, use the email address for simplicity.<br />
<br />
The workspace form now lets you change properties like the permission from Private to Public, and you can even change the owner (again, use the email address) if you, for example, need to take over the workspace of someone who has since been deleted as a user.<br />
<br />
You can also remove old and obsolete workspaces by using the following command:<br />
<span style="font-family: "courier new" , "courier" , monospace;">tf workspace /delete WORKSPACENAME;OWNER</span><br />
<br />
It goes without saying that changing workspaces while they are in use is not very smart. Change the workspaces with care, or you might ruin someone else's work and day.<br />
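To make the commands above concrete, a typical session could look like the sketch below. The workspace name and owner are made-up examples; the second command opens the workspace editor where you can flip the permission from Private to Public or change the owner, and the last one removes a workspace entirely:<br />
<br />
<pre class="brush:powershell;"># List every workspace in the collection, regardless of owner
tf workspaces /owner:*

# Open the workspace editor for a specific workspace (owner given as the e-mail address)
tf workspace "DEVBOX01-WS;jane.doe@contoso.com"

# Delete a workspace that is no longer needed
tf workspace /delete "DEVBOX01-WS;jane.doe@contoso.com"
</pre>
<br />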
<br />
<h3>
Using Team Foundation Sidekicks for VS2015</h3>
There is another option as well, a free tool that also lets you administer workspaces, if your user has the necessary permissions.<br />
You can download it from here:<br />
<a href="http://www.attrice.info/downloads/index.htm#tfssidekicks2015">http://www.attrice.info/downloads/index.htm#tfssidekicks2015</a><br />
(Tip! Use Google Chrome to download the MSI, if Edge/IE blocks you)<br />
<br />
<h1>Considerations when "upgrading" Dyn365FO from 8.0 to 8.1</h1>
Version 8.0 of Microsoft Dynamics 365 for Finance and Operations was released in the summer of 2018. Just a few months later, in October, version 8.1 was released. If you have environments running version 8.0, be it development environments, demo environments, or even production and (tier 2+) sandboxes, you might be thinking about getting them "upgraded" to 8.1.<br />
It's not really an upgrade, but rather an <b><u>update</u></b>.<br />
<br />
The overall process is actually a lot easier compared with coming from 7.x. I did a series of posts on how to get started here:<br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-1.html">https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-1.html</a><br />
<br />
Microsoft outlines the process in <u>one single article</u> here:<br />
<a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/appupdate-80-81">https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/appupdate-80-81</a><br />
<br />
<h2>
Why update?</h2>
One of the main differences between 8.0 and 8.1 is that the latter version will be a lot easier to service with updates. Version 8.0 still supports individual application hotfixes, meaning you will download and apply them and put them in VSTS, just as you would with 7.x. You could argue the possibility to pick individual hotfixes and avoid taking all updates is a good thing, but in fact it is not the way forward. Instead of thinking that you may have to avoid hotfixes, and potentially have to "roll back" updates that break you, you need to shift to a mindset where <b>any ongoing issues are immediately reported back to Microsoft, which allows them to ship new updates that resolve those issues</b>, not only for you, but for us all. With that mindset, you will want to take the 8.1 version, which does not allow for individual hotfixes, but instead gives you everything cumulative at the point you pick updates. This is also how "One Version" will behave, and from April 2019 you will be getting updates in this fashion.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-zovAvXLQ5cI/W_r_TZBOuxI/AAAAAAACNek/kRUfaEJmPKg_7D25sFOiGASarr7B4JIRgCLcBGAs/s1600/Updates.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="814" data-original-width="1225" height="265" src="https://4.bp.blogspot.com/-zovAvXLQ5cI/W_r_TZBOuxI/AAAAAAACNek/kRUfaEJmPKg_7D25sFOiGASarr7B4JIRgCLcBGAs/s400/Updates.png" width="400" /></a></div>
<br />
So in effect, when servicing 8.1+ you get only one update tile, it contains everything, and you download everything cumulatively. You'll use the complete update package to patch your environments, and there is no need to put the updates in VSTS either. Things are just so much easier.<br />
<br />
<h2>
Development and build environments</h2>
Even though Microsoft has a Software Deployable Package that does the update from 8.0 to 8.1 in the Shared Asset Library in LCS, <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/appupdate-80-81#deploying-the-81-binary-update-to-developer-environments-causes-applicationsuite-compilation-errors" target="_blank">it is recommended that you deploy new 8.1 build and development environments</a>. Why is that, you may ask. For a development environment, you will have both source code and a runtime (compiled code). Your 8.0 development environment might even have been updated with hotfixes added over time. Part of the process is to remove any 8.0 updates and start from scratch with 8.1. So when you start removing already committed Microsoft application updates from Azure DevOps (VSTS), you cannot avoid this also being reflected in your local copy of the source code.<br />
<br />
But you do not need to compile Microsoft's packages, so who cares if the code is wrong? Well, what if you want to debug, extend or view code? Even though you do not need to recompile Microsoft's packages, you run the risk of having invalid, incomplete or even erroneous code on your development environment. So it follows that your best option is to redeploy a new set of development boxes and, of course, build box(es), and depending on your choice of server size and storage, the new servers <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/appupdate-80-81#deploy-81-developer-and-build-environments" target="_blank">might be ready for you within 3-4 hours</a>.<br />
<br />
But before you connect the newly deployed development environments to the source code, <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/appupdate-80-81#begin-branch-work-for-version-control-and-remove-any-application-hotfixes" target="_blank">it is paramount that you prepare a new 8.1 branch</a>, which is clean from updates. It may contain 8.0 extension modules, but not any Microsoft modules. You can prepare all of this while the new environments are being deployed.<br />
<h2>
Non-development environments</h2>
What about demo, test and sandboxes? Well, typically you do not care about the source code on the demo boxes (even though it might be there), and as for acceptance test sandboxes, where you do not even have Visual Studio, it definitely doesn't matter. These environments you could just go ahead and update using the Software Deployable Package.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-1UAyc4-b3Ss/W_sBxZKLhAI/AAAAAAACNfE/n5zB8Qc0-QQibb-y9jRf9IxJlpOuVTuTgCLcBGAs/s1600/8.1_update_package.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="596" data-original-width="1026" height="368" src="https://2.bp.blogspot.com/-1UAyc4-b3Ss/W_sBxZKLhAI/AAAAAAACNfE/n5zB8Qc0-QQibb-y9jRf9IxJlpOuVTuTgCLcBGAs/s640/8.1_update_package.png" width="640" /></a></div>
<br />
<br />
Well, unfortunately it might not be quite that simple. If the environment has other non-Microsoft packages installed, LCS will prevent you from simply applying the update package. You may have some ISV solutions or some package you've created, released, and then installed on the environment through LCS.<br />
LCS knows about this, and can list the non-Microsoft packages installed. In fact, if you try to apply the update package, LCS will stop you and list the packages blocking you.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-GCgKoO2k5iQ/W_sAhw5_qkI/AAAAAAACNew/WYuFp5b5y1MTlqXqR-JzeG1VkYI-py02gCLcBGAs/s1600/update_error.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="191" data-original-width="1312" height="91" src="https://4.bp.blogspot.com/-GCgKoO2k5iQ/W_sAhw5_qkI/AAAAAAACNew/WYuFp5b5y1MTlqXqR-JzeG1VkYI-py02gCLcBGAs/s640/update_error.png" width="640" /></a></div>
<br />
<br />
<i>Error: "Modules on the environment do not match with modules in the package. Missing modules: [...]"</i><br />
<br />
In order to continue, you will need to get a pre-compiled version of these modules, built on an 8.1 environment. Depending on your scenario, that either means getting the 8.1 version from a vendor or partner, or simply getting your own package built and released through your new and shiny 8.1 boxes.<br />
<br />
As it is stated in the upgrade guide, you are recommended to prepare yourself <u>one single build release</u> of all the extension modules and packages. When you have the 8.1 package ready in the Asset Library, <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/appupdate-80-81#merge-the-deployable-package-with-the-81-binary-update-package" target="_blank">you can simply merge it with the update package</a>, and execute the update.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-KZ8V9vKnOsU/W_sBFsT8BSI/AAAAAAACNe4/D0si2UkPZowpW_UzGFTn2okxtM-bhpHMgCLcBGAs/s1600/merged_update.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1273" data-original-width="749" height="400" src="https://1.bp.blogspot.com/-KZ8V9vKnOsU/W_sBFsT8BSI/AAAAAAACNe4/D0si2UkPZowpW_UzGFTn2okxtM-bhpHMgCLcBGAs/s400/merged_update.png" width="235" /></a></div>
<br />
If all your demo and test environments were using the same set of non-Microsoft packages and modules, you'll simply reuse the same merged package to update all of them.<br />
<br />
Happy <b><u>updating</u></b>!<br />
<br />
<h1>Set up a cloud storage for database copy operations</h1>
This post will show you how quickly and easily you can set up a cloud storage account, and then copy the database around between your environments. Having said that, we are waiting on this feature in LCS, and eventually there will be tooling that does this for us in a fully managed way. However, while we are waiting, we can set this up ourselves.<br />
<br />
<h2>
Setup the Storage Account</h2>
You will (obviously) need an Azure subscription for this to work. All of the steps below can also be completed using a PowerShell script, so the advanced users will probably write that up - a rough sketch follows below. But here I will show how you can easily get this done with some clicking around. Still, you can set this all up in a matter of minutes manually.<br />
<br />
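The sketch below assumes the Az PowerShell module is installed and that you are allowed to create resources in the subscription; all names and the region are examples only, so double-check the parameters against the current Az documentation before relying on it.<br />
<br />
<pre class="brush:powershell;"># Rough sketch using the Az module - names and region are examples only
Connect-AzAccount

$rgName      = 'DynOps'
$accountName = 'contosodynops'   # must be globally unique, lowercase letters and numbers
$location    = 'westeurope'      # pick the same region as your VMs

New-AzResourceGroup -Name $rgName -Location $location -ErrorAction SilentlyContinue

$account = New-AzStorageAccount -ResourceGroupName $rgName -Name $accountName `
    -Location $location -SkuName Standard_LRS -Kind StorageV2 -AccessTier Cool

# Create the blob container and print the access key for later use with d365fo.tools
New-AzStorageContainer -Name 'backups' -Context $account.Context
(Get-AzStorageAccountKey -ResourceGroupName $rgName -Name $accountName)[0].Value
</pre>
<br />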
Start with opening the Azure Portal and open "Storage Accounts". You will create yourself a new one.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-w6L2fuXJhAs/W_BGfZoyehI/AAAAAAACNbs/F7FRQOoSeyEdQFJr8xgui7RXBBCK_Z-HQCLcBGAs/s1600/1-CreateStorageAccount.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="268" data-original-width="832" height="206" src="https://4.bp.blogspot.com/-w6L2fuXJhAs/W_BGfZoyehI/AAAAAAACNbs/F7FRQOoSeyEdQFJr8xgui7RXBBCK_Z-HQCLcBGAs/s640/1-CreateStorageAccount.png" width="640" /></a></div>
<br />
<br />
You will need to choose a Resource Group, or create a new one. I typically have a Resource Group I put "DynOps" stuff in, like this Storage Account.<br />
<br />
I want to make this a cheap account, so I tweak the settings to save money. I opt for only <a href="https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy-lrs" target="_blank">Local Redundancy</a> and the <a href="https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers#cool-access-tier" target="_blank">Cool tier</a>. Perhaps the most important setting is the Region. You will want to choose a region that is the same as the VMs you are using. You get better performance and save some money (not much, though, but still).<br />
<br />
Oh, and also worth mentioning, the account name must be unique. There are <a href="https://docs.microsoft.com/en-us/azure/architecture/best-practices/naming-conventions" target="_blank">a few naming guidelines for this</a>, but simply put you will probably prefix it with some company name abbreviation. If you accidentally pick something already picked, you won't be able to submit the form, for good measure.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-n8YxroGriyI/W_BGnq9EVGI/AAAAAAACNbw/HDmxHZKhMAIIICkNX2iFikXX_0RYHFoqwCLcBGAs/s1600/2_createstorage.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="552" data-original-width="679" height="520" src="https://2.bp.blogspot.com/-n8YxroGriyI/W_BGnq9EVGI/AAAAAAACNbw/HDmxHZKhMAIIICkNX2iFikXX_0RYHFoqwCLcBGAs/s640/2_createstorage.png" width="640" /></a></div>
<br />
<br />
It only takes a few minutes for Azure to spin up the new account, so sit back, relax and take a sip of that cold coffee you forgot to enjoy while it was still warm.<br />
<br />
The next thing you'll do is open the newly created Storage Account, scroll down on the "things you can do with it" and locate "Blobs". You will create yourself a new blob container and give it a name, for example "backups" or just "blob". Take note of the name, as you will need it later.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-AbCXOf_QXgM/W_BGt14wPvI/AAAAAAACNb0/ctDRXNbt-qIVTRc5FwraTAhP0X8cy7-uQCLcBGAs/s1600/3_createstorage.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="395" data-original-width="950" height="266" src="https://1.bp.blogspot.com/-AbCXOf_QXgM/W_BGt14wPvI/AAAAAAACNb0/ctDRXNbt-qIVTRc5FwraTAhP0X8cy7-uQCLcBGAs/s640/3_createstorage.png" width="640" /></a></div>
<br />
<br />
Then you will want to get the Access key. The Access key needs to be kept as secret as possible, since it basically grants access to the things you put into this Storage Account. If you later worry that the key has been compromised, you can regenerate the Access key, but then your own routines will have to be updated as well. <a href="https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1" target="_blank">There are some other ways to secure usage of the Storage Account</a>, but for the sake of simplicity I am using the Access key in this example.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-jsUoyFciqW8/W_BGy8hcDCI/AAAAAAACNb4/FQJZNnVlLdUURy7io9oQm5SoLktpno3TwCLcBGAs/s1600/4_createstorage.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="507" data-original-width="943" height="344" src="https://3.bp.blogspot.com/-jsUoyFciqW8/W_BGy8hcDCI/AAAAAAACNb4/FQJZNnVlLdUURy7io9oQm5SoLktpno3TwCLcBGAs/s640/4_createstorage.png" width="640" /></a></div>
<br />
<br />
And now you are set. That entire thing literally takes just a few minutes, if the Azure Portal behaves and you didn't mess anything up.<br />
<br />
<h2>
Using the Storage Account</h2>
I've become an avid user of the <a href="https://www.powershellgallery.com/packages/d365fo.tools" target="_blank">PowerShell library D365FO.tools</a>, so for the next example I will be using it. It is super easy to install and set up, as long as the VM has an Internet connection. I'm sure you will love it too.<br />
<br />
Assuming it is installed, I will first run a command to save the cloud Storage Account information on the machine (using <a href="http://psframework.org/" target="_blank">the popular PSFramework</a>). This command will actually save the information in the Registry.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-xQT86TdvlHI/W_BHNsb0G_I/AAAAAAACNcE/yuz1hSFPZ4o3S2qxci4vzEJtvQWQspC3ACLcBGAs/s1600/Registry_PSFramework.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="224" data-original-width="619" height="230" src="https://1.bp.blogspot.com/-xQT86TdvlHI/W_BHNsb0G_I/AAAAAAACNcE/yuz1hSFPZ4o3S2qxci4vzEJtvQWQspC3ACLcBGAs/s640/Registry_PSFramework.png" width="640" /></a></div>
<br />
<pre class="brush:powershell;"># Fill in your own values
$params = @{
Name = 'Default' # Just a name, because you can add multiple configurations and switch between them
AccountId = 'uniqueaccountname' # Name of the storage account in Azure
Blobname = 'backups' # Name of the blob container on the Storage Account
AccessToken = 'long_secret_token' # The Access key
}
# Create the storage configuration locally on the machine
Add-D365AzureStorageConfig @params -ConfigStorageLocation System -Verbose
</pre>
<br />
Now let's assume you ran the command below to extract a bacpac of your sandbox Tier2 environment.<br />
<br />
<pre class="brush:powershell;">Import-Module d365fo.tools
$dbsettings = Get-D365DatabaseAccess
$baseParams = @{
DatabaseServer = $dbsettings.DbServer
SqlUser = 'sqladmin'
SqlPwd = 'SQLADMIN_PASSWORD_FROM_LCS'
Verbose = $true
}
$params = $baseParams + @{
ExportModeTier2 = $true
DatabaseName = $dbsettings.Database
NewDatabaseName = $($dbsettings.Database + '_adhoc')
BacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
}
Remove-D365Database @baseParams -DatabaseName $($params.NewDatabaseName)
New-D365Bacpac @params
</pre>
<br />
You now want to upload the bacpac (database backup) file to the blob in your cloud Storage Account using the following PowerShell script.<br />
<br />
<pre class="brush:powershell;">Set-D365ActiveAzureStorageConfig -Name 'Default'
$StorageParams = Get-D365ActiveAzureStorageConfig
Invoke-D365AzureStorageUpload @StorageParams -Filepath 'D:\Backup\sandbox_adhoc.bacpac' -DeleteOnUpload
</pre>
<br />
The next thing you do is jump over to the VM (Tier1, onebox) where you want to download the bacpac. Obviously, D365FO.tools must be installed there as well. Assuming it is, and assuming you've also run the command above to save the cloud Storage Account information on that machine, you can run the following PowerShell script to download it.<br />
<br />
<pre class="brush:powershell;">Set-D365ActiveAzureStorageConfig -Name 'Default'
$StorageParams = Get-D365ActiveAzureStorageConfig
Invoke-D365AzureStorageDownload @StorageParams -Path 'D:\Backup' -FileName 'sandbox_adhoc.bacpac'</pre>
<br />
Finally, you would run something like this to import the bacpac to the target VM.<br />
<br />
<pre class="brush:powershell;">Import-Module d365fo.tools
$bacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
$sourceDatabaseName = "AxDB_Source_$(Get-Date -UFormat "%y%m%d%H%M")"
#Remove any old temp source DB
Remove-D365Database -DatabaseName $sourceDatabaseName -Verbose
# Import the bacpac to local SQL Server
Import-D365Bacpac -ImportModeTier1 -BacpacFile $bacpacFile -NewDatabaseName $sourceDatabaseName -Verbose
#Remove any old AxDB backup (if exists)
Remove-D365Database -DatabaseName 'AxDB_original' -Verbose
#Stop local environment components
Stop-D365Environment -All
#Switch AxDB with source DB
Switch-D365ActiveDatabase -DatabaseName 'AxDB' -NewDatabaseName $sourceDatabaseName -Verbose
Start-D365Environment -All
</pre>
<br />
Isn't that neat? Now you have a way to copy the database around, while we're waiting for this to be completely supported out of the box in LCS - fingers crossed!<br />
<br />
<h1>Upgrade from 7.x to 8.+ series | Post 5 | Upgrade Sandbox and finally Production</h1>
<h2>
Introduction</h2>
<b><i><span style="color: red;">[EDIT]: Changes since November 2018 has forced me to made some changes to these series. I will point out the changes.</span></i></b><br />
<b><i><br /></i></b>
In this series of posts, I will try to run you through the process of completing the upgrade from 7.x to 8.+ of Dynamics 365 for Finance and Operations.<br />
<br />
Quick navigation:<br />
<a href="http://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-1.html">Upgrade from 7.x to 8.+ series | Post 1 | Start in LCS</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-2.html">Upgrade from 7.x to 8.+ series | Post 2 | Deploy Dev and Grab source DB</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html">Upgrade from 7.x to 8.+ series | Post 3 | Validate Code and Data in Dev</a><br />
<div>
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-4.html">Upgrade from 7.x to 8.+ series | Post 4 | Setup a new Build</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-5.html">Upgrade from 7.x to 8.+ series | Post 4 | Upgrade Sandbox and finally Production</a> (you are here)<br />
<br /></div>
<h2>
Prepare a sandbox upgrade for validation</h2>
<b><i><span style="color: red;">[EDIT]: This section has been modified.</span></i></b><br />
<b><i><span style="color: red;"><br /></span></i></b>
Before you can go ahead and request an upgrade of Production, you will want to do a pre-production validation in the sandbox environment. You may read the details here:<br />
<a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-sandbox-environment">https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-latest-update#upgrade-your-tier2-standard-acceptance-test-sandbox-environment</a><br />
<br />
Before you start this process, you will want to make sure you have the following uploaded to LCS Asset Library:<br />
<ul>
<li>Upgraded application ("Application deployable package"), downloaded or released from the successfull build</li>
<li>The Update package ("Platform and application binary package") which you prepared <a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html" target="_blank">on step 3 in the series</a>, and installed on the build <a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-4.html" target="_blank">in step 4 of the series</a>. </li>
</ul>
<div>
You will service the sandbox and install the packages. If you're smart, you will merge the two packages into one single package and service them together in one single operation. Merging packages works as long as they are on the same platform version.<br />
<br />
Now you can let the users start hammering on the system to potentially discover everything is flawless (knock on wood).</div>
<div>
<div>
<br />
<div>
If you <u>do not have any Production deployed yet</u>, the steps are:</div>
<div>
<ol>
<li>Redeploy sandbox with target version. Make sure to select the upgraded application package. If you don't, you will have to install it afterwards, before you continue to the next step.</li>
<li>Import the upgraded bacpac <a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html" target="_blank">from step 3 in the series</a>. Here you can <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/import-database" target="_blank">use the tooling in LCS</a> if the database was uploaded into the Project Asset Library.</li>
<li>Validate!</li>
</ol>
</div>
</div>
<h2>
Production</h2>
<div>
When you have validated the sandbox, and you are ready to upgrade Production, <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-latest-update#upgrade-production" target="_blank">you will replay the steps you did in the sandbox, but this time in Production</a>. </div>
</div>
<br />
Good luck!<br />
<br />
<h1>Upgrade from 7.x to 8.+ series | Post 4 | Setup a new Build</h1>
<h2>
Introduction</h2>
<b><i><span style="color: red;">[EDIT]: Changes since November 2018 has forced me to made some changes to these series. I will point out the changes.</span></i></b><br />
<b><i><br /></i></b>
In this series of posts, I will try to run you through the process of completing the upgrade from 7.x to 8.+ of Dynamics 365 for Finance and Operations.<br />
<br />
Quick navigation:<br />
<a href="http://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-1.html">Upgrade from 7.x to 8.+ series | Post 1 | Start in LCS</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-2.html">Upgrade from 7.x to 8.+ series | Post 2 | Deploy Dev and Grab source DB</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html">Upgrade from 7.x to 8.+ series | Post 3 | Validate Code and Data in Dev</a><br />
<div>
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-4.html">Upgrade from 7.x to 8.+ series | Post 4 | Setup a new Build</a> (you are here)<br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-5.html">Upgrade from 7.x to 8.+ series | Post 4 | Upgrade Sandbox and finally Production</a></div>
<h2>
Some preparations before deploying Build VM</h2>
Basically, what we want to do is have a new 8+ branch that the build environment will pull code from. Beyond that, you may want an <b>additional development branch</b> to isolate ongoing development in the future, but I've left that out of the scope of this article.<br />
If you've read the previous posts, you know the Code Upgrade in LCS created a "release" branch folder with a prepared upgraded application, and given that you've completed the code upgrade and validation as mentioned in the previous post, you should now be able to copy the result over to a new main branch for 8+.<br />
<br />
The flow can be displayed sort of like this:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-0OMX3OjVhuI/W9lpO-R_X_I/AAAAAAACNVY/lfH53L1lpuc1L9d8Q3DfrOX7_l2lUYKOQCLcBGAs/s1600/branch_strategy.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="316" data-original-width="1567" height="128" src="https://3.bp.blogspot.com/-0OMX3OjVhuI/W9lpO-R_X_I/AAAAAAACNVY/lfH53L1lpuc1L9d8Q3DfrOX7_l2lUYKOQCLcBGAs/s640/branch_strategy.png" width="640" /></a></div>
<br />
Now, obviously you're most likely going to delete/remove the old main branch and possibly also the "release" branch in the future. But the flow above can still be achieved. There are many ways to actually do this, and some have very strong opinions on how to branch the source.<br />
<br />
You can easily create a new main branch by using the prepared "release" branch as the source. You can do this using Source Control Explorer inside Visual Studio running on your development VM.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-zHBwneZ5DmM/W9lsZ3SL-3I/AAAAAAACNVk/QnaFpj4a4BALpoCuu0ehl6jNn-V_Dat6gCLcBGAs/s1600/branch1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="306" data-original-width="1357" height="144" src="https://2.bp.blogspot.com/-zHBwneZ5DmM/W9lsZ3SL-3I/AAAAAAACNVk/QnaFpj4a4BALpoCuu0ehl6jNn-V_Dat6gCLcBGAs/s640/branch1.png" width="640" /></a></div>
<br />
<br />
You will simply give the new branch a unique name, separating it from the old main.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-5ahhPe0jCt8/W9lsx8qrGyI/AAAAAAACNVs/8NuAJW7TwdQUyX52ELBLUPmUKBsnNcRZACLcBGAs/s1600/branch2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="627" data-original-width="1176" height="339" src="https://1.bp.blogspot.com/-5ahhPe0jCt8/W9lsx8qrGyI/AAAAAAACNVs/8NuAJW7TwdQUyX52ELBLUPmUKBsnNcRZACLcBGAs/s640/branch2.png" width="640" /></a></div>
<br />
The name of the branch can actually be changed later, if that bothers you. However, we will deploy a Build environment later, and this will create a Build definition that needs the branch name to be correct - or the build definition will not work.<br />
<br />
Don't forget, your changes locally on the VM will need to be committed to Azure DevOps (VSTS).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-5r8tcd0aS2Q/W9luNl4PlaI/AAAAAAACNV4/1T7fJixfnsEsRHsEyadlHVlcvW4lYAqIwCLcBGAs/s1600/branch3.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="573" data-original-width="550" height="320" src="https://2.bp.blogspot.com/-5r8tcd0aS2Q/W9luNl4PlaI/AAAAAAACNV4/1T7fJixfnsEsRHsEyadlHVlcvW4lYAqIwCLcBGAs/s320/branch3.png" width="307" /></a></div>
<br />
<br />
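If you prefer the command line over Source Control Explorer, the same branch-and-commit flow can be sketched with the tf client from the Developer Command Prompt. The server paths and comment below are made-up examples:<br />
<br />
<pre class="brush:powershell;"># Pend a branch of the upgraded "release" folder into a new 8.1 main folder (example paths)
tf branch "$/MyProject/Trunk/Release" "$/MyProject/Trunk/Main81" /version:T

# Check in the pended branch so it becomes visible to the team and the build
tf checkin /comment:"Create 8.1 main branch from upgraded release" /recursive
</pre>
<br />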
Another thing we will want to do is to create ourselves an isolated Agent Pool in Azure DevOps (VSTS). We want to make sure only 8+ build agents are in this pool of agents. You will need at least one, but who knows if you will add more in the future.<br />
<br />
You will need some permissions in Azure DevOps (VSTS) to create this, but start at the <b>Organization </b>level and create a new Pool. I named it D365FO81 (since it will be used for 8.1.x). I have lots of projects not related to Dynamics, so I didn't want to push the agent pool to all projects.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-QusO2WnPeK0/W9luS2G62eI/AAAAAAACNV8/iHYFu04sEPo5pj2rV4T41t2YjVKKJ8OAgCLcBGAs/s1600/azuredevops_agent1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="466" data-original-width="987" height="188" src="https://3.bp.blogspot.com/-QusO2WnPeK0/W9luS2G62eI/AAAAAAACNV8/iHYFu04sEPo5pj2rV4T41t2YjVKKJ8OAgCLcBGAs/s400/azuredevops_agent1.png" width="400" /></a></div>
<br />
<br />
I then opened the <b>Project </b>itself and added the Agent Pool to the project.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-2UiiuqwINOI/W9luXZi3hkI/AAAAAAACNWA/xcnbkgkMn4QK1LK428sDRzdfaO4-zCRTACLcBGAs/s1600/azuredevops_agent2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="529" data-original-width="1003" height="210" src="https://1.bp.blogspot.com/-2UiiuqwINOI/W9luXZi3hkI/AAAAAAACNWA/xcnbkgkMn4QK1LK428sDRzdfaO4-zCRTACLcBGAs/s400/azuredevops_agent2.png" width="400" /></a></div>
<br />
<h2>
Deploy Build VM</h2>
<b><i><span style="color: red;">[EDIT]: This section has been edited.</span></i></b><br />
Now we are ready to head back to LCS and deploy a Build VM. With the preparation above, we can fill out the VSTS part like this, and it will make sure to put the build agent in the right pool, plus make sure the deployed build definition points to the right branch.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-iJhOzVdvBv0/W9lwRQWs1MI/AAAAAAACNWU/I887tIkE58A0NIvkM1gTu75fttKbzlTfgCLcBGAs/s1600/dev_topology.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="300" data-original-width="759" height="157" src="https://3.bp.blogspot.com/-iJhOzVdvBv0/W9lwRQWs1MI/AAAAAAACNWU/I887tIkE58A0NIvkM1gTu75fttKbzlTfgCLcBGAs/s400/dev_topology.png" width="400" /></a></div>
<br />
<br />
Select the correct topology, and if you're deploying this on a private/self-hosted Azure subscription, you can choose a setup with DS13v2 and 14 standard disks of size 64GB. Again, I'm leaning on the community here to learn what they recommend. These things change over time, and I can't promise I'll get back to this post and update it.<br />
<br />
If you deleted the existing MS-hosted build environment and deploy a new MS-hosted one, you won't get any options to decide on VM size or disk setup.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-aymjQOtjvRg/W9lw5Nxnu6I/AAAAAAACNWc/U5RyeihmSocJyDFQJODRUlBYYsO_cOLNwCLcBGAs/s1600/build_topology.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="435" data-original-width="741" height="233" src="https://3.bp.blogspot.com/-aymjQOtjvRg/W9lw5Nxnu6I/AAAAAAACNWc/U5RyeihmSocJyDFQJODRUlBYYsO_cOLNwCLcBGAs/s400/build_topology.png" width="400" /></a></div>
<br />
<br />
Notice I fill in the name of the Agent Pool and the name of the branch. I also give the agent a unique name. It will take quite some time before the Build environment is up and running.<br />
<br />
<b><i><span style="color: red;">EDIT: Before you continue, go ahead and install the same update package installed on the Development environment <a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html" target="_blank">from step 3 in my series</a>. Installing this same "Platform and application binary package" on the Build environment will ensure the build is running on the exact same version, and the artifact created from the build pipleline is of the same version. The next time, in the future, you want to get more updates of the platform and application, you can create the update package through the Update tiles on the Build environment. </span></i></b><br />
<br />
When it is, you will go ahead and schedule a build on the new Build Definition. The job will be picked up by the right Agent Pool, and then picked up by the agent sitting on the Build VM.<br />
<br />
When the Build is complete, make sure to upload the Deployable Package to the LCS Asset Library. You will need it for <a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-5.html">the final post</a>.<br />
<br />
<br />
<h1>Upgrade from 7.x to 8.+ series | Post 3 | Validate Code and Data in Dev</h1>
<h2>
Introduction</h2>
<b><i><span style="color: red;">[EDIT]: Changes since November 2018 has forced me to made some changes to these series. I will point out the changes.</span></i></b><br />
<br />
In this series of posts, I will try to run you through the process of completing the upgrade from 7.x to 8.+ of Dynamics 365 for Finance and Operations.<br />
<br />
Quick navigation:<br />
<a href="http://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-1.html">Upgrade from 7.x to 8.+ series | Post 1 | Start in LCS</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-2.html">Upgrade from 7.x to 8.+ series | Post 2 | Deploy Dev and Grab source DB</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html">Upgrade from 7.x to 8.+ series | Post 3 | Validate Code and Data in Dev</a> (you are here)<br />
<div>
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-4.html">Upgrade from 7.x to 8.+ series | Post 4 | Setup a new Build</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-5.html">Upgrade from 7.x to 8.+ series | Post 4 | Upgrade Sandbox and finally Production</a><br />
<br /></div>
<h2>
Connect to code</h2>
<b><i><span style="color: red;">[EDIT]: This section has been edited.</span></i></b><br />
<b><i><span style="color: red;"><br /></span></i></b>
Given that the code upgrade is completed in LCS, a process that shouldn't take many hours, and the Development VM is deployed, you can connect the local PackageLocalDirectory to the branch folder holding the "release".<br />
<br />
Open Visual Studio, connect to the Azure DevOps (VSTS) account and the right project, and then map your workspace to the "release" branch. Notice I point the Metadata folder under the release to my local PackageLocalDirectory.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-P6juNSNlVLo/W9ijORHJhnI/AAAAAAACNVA/vTUrZs16xs0J_wgY6F6PvYZzbfCJFnFhQCLcBGAs/s1600/dev_setup1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="125" data-original-width="720" height="110" src="https://4.bp.blogspot.com/-P6juNSNlVLo/W9ijORHJhnI/AAAAAAACNVA/vTUrZs16xs0J_wgY6F6PvYZzbfCJFnFhQCLcBGAs/s640/dev_setup1.png" width="640" /></a></div>
<br />
<br />
Let's have a quick look at the result from the Code Upgrade process. Like I wrote in the first post, the upgrade removes Microsoft hotfixes, but keeps any other custom packages and modules.<br />
<br />
Put another way, the code upgrade will first copy your source metadata, then remove Microsoft's modules, and it will look a little bit like this.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-DVumeS0BOSg/W9ij6RjLnHI/AAAAAAACNVI/EvbA3Nu3Dvk7039cthqH12DtdgZcQqFVACLcBGAs/s1600/codeupgrade3.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="446" data-original-width="1090" height="259" src="https://3.bp.blogspot.com/-DVumeS0BOSg/W9ij6RjLnHI/AAAAAAACNVI/EvbA3Nu3Dvk7039cthqH12DtdgZcQqFVACLcBGAs/s640/codeupgrade3.png" width="640" /></a></div>
<br />
<br />
If you were to take one of your existing development VMs, connect to the "release" branch folder and run a "Get Latest", the exact same steps would happen on your machine; you would see all the Microsoft standard module files being deleted under your PackageLocalDirectory. DON'T DO IT!<br />
<br />
You may wonder why that doesn't happen on the new development VM. Well, since the Workspace you have just created on the new VM was created after the cleanup of the upgraded branch, nothing gets deleted locally when you run "Get Latest" on the new "release" branch folder.<br />
<br />
So next you basically have to make sure your application builds and works as expected, before you can continue.<br />
<br />
<b><i><span style="color: red;">EDIT: I would recommend using this opportunity to look at the update tiles on this Development environment, and then take the updates of standard <u>now</u>. This process will create a "Platform and application binary package" (also referred to as "plat+app"). Install this update on the Development environment now, and plan on using this same update package through out the process, on the build environment and the sandbox, all the way to production. The exception is if you find that you need to take a new and even more up-to-date package. This package will contain both platform updates AND application updates. When you install it on the Development environment, you will get the latest source code on the box as well.</span></i></b><br />
<br />
<h2>
Upgrade the Data </h2>
<b><i><span style="color: red;">[EDIT]: This section has been edited.</span></i></b><br />
<span style="color: red;"><b><i><br /></i></b>
<b><i>Assuming you <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/export-database" target="_blank">Exported the database</a> from the sandbox using LCS, you can download the bacpac directly from the project asset library. Currently there is no solution in LCS to <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/import-database" target="_blank">Import a database</a>, but <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/copy-operations-database#import-the-finance-and-operations-database" target="_blank">there is a guide on docs for the steps involved in importing a bacpac manually</a>. Using D365fo.tools below, you can do the same manual import with ease. Just skip the download part of my original post below, but execute the Import part. </i></b></span><br />
<b><i><br /></i></b>
When your application is on 8.+, you can go ahead and get the 7.x database and upgrade it on this development environment. This process should reveal any potential technical issues.<br />
<br />
Let's first download the database to the VM from the cloud storage mentioned in post 2. You can either use Microsoft Azure Storage Explorer or use the <a href="https://www.powershellgallery.com/packages/d365fo.tools">community driven PowerShell library d365fo.tools</a>, like this.<br />
<br />
<pre class="brush:powershell;">Import-Module d365fo.tools
$dbsettings = Get-D365DatabaseAccess
$params = @{
AccountID = 'STORAGE_ACCOUNT_NAME'
AccessToken = 'LONG_AND_SECRET_ACCESS_KEY_FOUND_ON_THE_STORAGE_ACCOUNT_IN_THE_AZURE_PORTAL'
Blobname = 'NAME_OF_THE_BLOB'
Path = 'D:\Backup\'
FileName = 'sandbox_adhoc.bacpac'
}
Invoke-D365AzureStorageDownload @params
</pre>
<br />
<b><i><span style="color: red;">[EDIT]: After downloading the bacpac from the Project Asset Library, you can start the import steps below.</span></i></b><br />
<br />
With the database extract (bacpac), you will have to import it, overwriting the existing AxDB. There are a few gotchas when doing this, and you can either do it manually, following the guide on docs, or you can again use the <a href="https://www.powershellgallery.com/packages/d365fo.tools">PowerShell library d365fo.tools</a> to help you out:<br />
<br />
<pre class="brush:powershell;">Import-Module d365fo.tools
$bacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
$sourceDatabaseName = "AxDB_Source_$(Get-Date -UFormat "%y%m%d%H%M")"
#Remove any old temp source DB
Remove-D365Database -DatabaseName $sourceDatabaseName -Verbose
#Stop local environment components
Stop-D365Environment -All
# Import the bacpac to local SQL Server
Import-D365Bacpac -ImportModeTier1 -BacpacFile $bacpacFile -NewDatabaseName $sourceDatabaseName -Verbose
#Remove any old AxDB backup (if exists)
Remove-D365Database -DatabaseName 'AxDB_original' -Verbose
#Switch AxDB with source DB
Switch-D365ActiveDatabase -DatabaseName 'AxDB' -NewDatabaseName $sourceDatabaseName -Verbose
Start-D365Environment -All
</pre>
<br />
The script above does several things, like importing the bacpac and replacing the existing AxDB with the imported database. The whole process may take quite some time, because the bacpac import is a slow process. Also, the actual mdf and ldf files for the AxDB will have a date and timestamp, making them unique for each time you import - if you need to do it more than once.<br />
<br />
When the database is imported, you will need to head back to LCS and apply the Software Deployable Package created by Microsoft specifically for doing the DataUpgrade. This process will also take some time, but at the end of it, you will have an upgraded database. The package is named DataUpgrade-8-1 and if you look at its description, it is one single package that upgrades the database from any previous 7.x version to 8.1.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-OJY9BqeN3rE/W9oXH66b4yI/AAAAAAACNWo/z-HWW_xAsfsM5Z2AOfbLy2bfPaJhdZv-wCLcBGAs/s1600/dataupgrade.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="667" data-original-width="1600" height="266" src="https://3.bp.blogspot.com/-OJY9BqeN3rE/W9oXH66b4yI/AAAAAAACNWo/z-HWW_xAsfsM5Z2AOfbLy2bfPaJhdZv-wCLcBGAs/s640/dataupgrade.png" width="640" /></a></div>
<br />
<br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-4.html">In the next post</a>, I will show one possible way to prepare your new build for 8+, which is a necessity before you can continue with updating your Sandbox and later your Production.<br />
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-18797596061501474462018-11-01T00:19:00.000+01:002019-03-22T14:05:13.550+01:00Upgrade from 7.x to 8.+ series | Post 2 | Deploy Dev and Grab source DB<h2>
Introduction</h2>
<b><i><span style="color: red;">[EDIT]: Changes since November 2018 has forced me to made some changes to these series. I will point out the changes.</span></i></b><b style="background-color: white; color: #333333; font-family: Arial, Tahoma, Helvetica, FreeSans, sans-serif; font-size: 13px;"><i><br /></i></b><br />
In this series of posts, I will run you through the process of completing the upgrade from 7.x to 8.+ of Dynamics 365 for Finance and Operations.<br />
<br />
Quick navigation:<br />
<a href="http://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-1.html">Upgrade from 7.x to 8.+ series | Post 1 | Start in LCS</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-2.html">Upgrade from 7.x to 8.+ series | Post 2 | Deploy Dev and Grab source DB</a> (you are here)<br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html">Upgrade from 7.x to 8.+ series | Post 3 | Validate Code and Data in Dev</a><br />
<div>
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-4.html">Upgrade from 7.x to 8.+ series | Post 4 | Setup a new Build</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-5.html">Upgrade from 7.x to 8.+ series | Post 4 | Upgrade Sandbox and finally Production</a><br />
<br /></div>
<h2>
Deploy 8.x environments</h2>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-7kAiv_7y2M0/W9icrhX3HZI/AAAAAAACNU0/d5InrKccNZkbRgJbq5m3iePEG08iCua4QCLcBGAs/s1600/dev_topology.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="271" data-original-width="776" height="138" src="https://4.bp.blogspot.com/-7kAiv_7y2M0/W9icrhX3HZI/AAAAAAACNU0/d5InrKccNZkbRgJbq5m3iePEG08iCua4QCLcBGAs/s400/dev_topology.png" width="400" /></a></div>
<br />
<div style="text-align: center;">
<i>Choose the version <b><u>closest</u></b> to the target version you are aiming for.</i></div>
<br />
You will need to deploy new environments. There is no "in-place" upgrade of your existing environments. The new environments will be on the version you are upgrading to. Fortunately, deploying new environments is easy to do through Lifecycle Services (LCS). You will need a decent Development VM to connect to the upgraded metadata, and check the code for any issues.<br />
<br />
I typically go for a DS13v2, which has local SSD. I normally give it 14 disks of 48GB each, which will all be striped for maximum throughput. This has served me well so far. I don't choose premium storage, but go for standard storage. There are probably lots of preferences out there, and I'm more than willing to learn from the community what they recommend.<br />
<br />
Make sure the VM is hosted on your own (or the customer's) Azure Subscription. This way you are guaranteed to get a local admin user. Also make sure the topology is Development. Pick an empty database, as you won't need that Contoso data for what we're about to do.<br />
<br />
<h2>
Prepare Database</h2>
<b><i><span style="color: red;">[EDIT]: This section has been edited.</span></i></b><br />
<span style="color: red;"><b><i><br /></i></b>
<b><i>You can simply <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/export-database" target="_blank">Export the database</a> through the LCS portal. LCS will create a bacpac of the sandbox database and save it in the project Asset Library. You should also consider <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/database-refresh" target="_blank">refreshing the sandbox with a fresh copy of the Production database</a> first. All of these steps are now easily done directly in the LCS portal.</i></b></span><br />
<br />
While the Development VM is deploying, here's another neat thing you can do, if you haven't already done so. Set up a cloud Storage Account in the Azure Subscription. It can be a cheap Standard Storage (general purpose v2) account with only local redundancy, on a Cool tier - nothing fancy. Create yourself a blob container where you can put the database which you will get from your source environment. If you haven't done this in the Azure Portal before, let this be your first time. One thing to consider: the Storage Account name must be globally unique. But you're a good citizen, and always use a good naming practice, right?<br />
<br />
You will need three things from this cloud storage:<br />
<ol>
<li>The Storage Account Name</li>
<li>The name of the blob storage</li>
<li>The Access Key (which is found on the Azure Blade - look for the yellow key icon).</li>
</ol>
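If you prefer to script this part as well, below is a minimal sketch using the AzureRM and Azure.Storage PowerShell modules. All the names, the resource group and the region are placeholders you must replace, and it assumes the resource group already exists.<br />
<br />
<pre class="brush:powershell;"># Minimal sketch - assumes the AzureRM module and an existing resource group
Login-AzureRmAccount
New-AzureRmStorageAccount -ResourceGroupName 'MY_RESOURCE_GROUP' -Name 'mystorageaccountname' -Location 'westeurope' -SkuName 'Standard_LRS' -Kind 'StorageV2' -AccessTier 'Cool'
# The three things you need: the account name, the container (blob) name and one of the access keys
$key = (Get-AzureRmStorageAccountKey -ResourceGroupName 'MY_RESOURCE_GROUP' -Name 'mystorageaccountname')[0].Value
$context = New-AzureStorageContext -StorageAccountName 'mystorageaccountname' -StorageAccountKey $key
New-AzureStorageContainer -Name 'backups' -Context $context
</pre>
<br />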
When you have the storage account ready, I bet the deployment of the Development VM is still spinning, so let's prepare a backup of the source database. We will use it to validate the upgrade. This is just a test, to make sure the upgrade experience will be smooth.<br />
<br />
Head over to your Sandbox (Tier2) AOS, and extract the database from there. If you want to test on a fresh copy from Production, you will have to <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/database-refresh">get Microsoft to do a Database Refresh</a> first. But let's assume the one on the Sandbox is fresh enough.<br />
<br />
At the point of writing, and while we are waiting for Microsoft to get the tooling in place in LCS, possibly the quickest and easiest way to get the database extracted is to use the <a href="https://www.powershellgallery.com/packages/d365fo.tools">community driven D365FO.tools PowerShell Library</a>.<br />
<br />
Install the library on the AOS server using the following command. You'll have to click "Yes" and "Yes to all" on any questions.<br />
<br />
<pre class="brush:powershell;">Install-Module d365fo.tools</pre>
<br />
Then run the following to extract the database. It basically prepares a bacpac where all the necessary preparation SQL is run, and saves it to the D-drive.<br />
<br />
<pre class="brush:powershell;">Import-Module d365fo.tools
$dbsettings = Get-D365DatabaseAccess
$baseParams = @{
DatabaseServer = $dbsettings.DbServer
SqlUser = 'sqladmin'
SqlPwd = 'SQLADMIN_PASSWORD_FROM_LCS'
Verbose = $true
}
$params = $baseParams + @{
ExportModeTier2 = $true
DatabaseName = $dbsettings.Database
NewDatabaseName = $($dbsettings.Database + '_adhoc')
BacpacFile = 'D:\Backup\sandbox_adhoc.bacpac'
}
Remove-D365Database @baseParams -DatabaseName $($params.NewDatabaseName)
New-D365Bacpac @params
</pre>
<br />
Then, using the cloud storage you've hopefully prepared, let's upload the bacpac to the cloud. We will later download it to the development VM.<br />
<br />
<pre class="brush:powershell;">Import-Module d365fo.tools
$params = @{
AccountID = 'STORAGE_ACCOUNT_NAME'
AccessToken = 'LONG_AND_SECRET_ACCESS_KEY_FOUND_ON_THE_STORAGE_ACCOUNT_IN_THE_AZURE_PORTAL'
Blobname = 'NAME_OF_THE_BLOB'
FilePath = 'D:\Backup\sandbox_adhoc.bacpac'
DeleteOnUpload = $false
}
Invoke-D365AzureStorageUpload @params
</pre>
<br />
The database extract in form of a bacpac now awaits in the cloud storage, and when the development VM is ready, you can use the same PowerShell Library to download it and install it on your development VM.<br />
<br />
But first, you need to make sure the application actually builds. I will address that <a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html">in the next post</a>.<br />
<br />
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-26696692053588632732018-11-01T00:16:00.000+01:002019-03-22T14:04:33.694+01:00Upgrade from 7.x to 8.+ series | Post 1 | Start in LCS<h2>
Introduction</h2>
<b><i><span style="color: red;">[EDIT]: Changes since November 2018 has forced me to made some changes to these series. I will point out the changes.</span></i></b><br />
<br />
In this series of posts, I will run you through the process of completing the upgrade from 7.x to 8.+ of Dynamics 365 for Finance and Operations.<br />
<div>
<br /></div>
Quick navigation:<br />
<a href="http://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-1.html">Upgrade from 7.x to 8.+ series | Post 1 | Start in LCS</a> (you are here)<br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-2.html">Upgrade from 7.x to 8.+ series | Post 2 | Deploy Dev and Grab source DB</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-3.html">Upgrade from 7.x to 8.+ series | Post 3 | Validate Code and Data in Dev</a><br />
<div>
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-4.html">Upgrade from 7.x to 8.+ series | Post 4 | Setup a new Build</a><br />
<a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-5.html">Upgrade from 7.x to 8.+ series | Post 4 | Upgrade Sandbox and finally Production</a></div>
<br />
<a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-latest-update">This upgrade is already well documented on docs</a>, so this article is just "another" way of taking you through the steps. I will focus on the simplest example possible. I know "your mileage may vary" if you have a more complex environment to upgrade: things like over-layered code, dependencies on other systems through integrations, or third party solutions added to your solution which may not be ready for upgrade. But let's assume, for the sake of simplicity, that you're on 7.x without any over-layering and you want to get on version 8 with as little fuss as possible.<br />
<br />
In fact, if you do not have any over-layered code, and all extensions are compatible with 8+, this part will take less time than it takes to read this post.<br />
<br />
<h2>
Code upgrade </h2>
Before you begin the code upgrade, you need to make sure a "magic" file exists in your repository in Azure DevOps. The file is not created by any other process (that I know of). The file holds the value of the version you are upgrading <b>from</b>, and it needs to be named "ax7.version" and must sit in the Trunk/Main folder. There are a couple of ways to get the file created, but a very simple and pragmatic way is to create the file in the repository through the browser. Open the repository, navigate to Trunk/Main and create a new file directly.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
</div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-txIwPkYHM4o/W9df3Q9zrCI/AAAAAAACNUE/n2TqcKuauswooGGwgEdd6PCGK8tTCGv8wCLcBGAs/s1600/upgrade_axversion1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="505" data-original-width="1071" height="186" src="https://1.bp.blogspot.com/-txIwPkYHM4o/W9df3Q9zrCI/AAAAAAACNUE/n2TqcKuauswooGGwgEdd6PCGK8tTCGv8wCLcBGAs/s400/upgrade_axversion1.png" width="400" /></a></div>
<br />
<br />
You need to fill the file with the version number of your source. The <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-latest-update#scenario-2-upgrade-your-custom-code-1">version number can be found in docs</a>, but allow me to list a few of them here:<br />
<br />
7.3.11971.56116<br />
7.2.11792.56024<br />
7.1.1541.3036<br />
<br />
In the example below I am upgrading from 7.3.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-s4YzCenliA0/W9df3T-ylfI/AAAAAAACNUA/bG3oyYzv5VwmTT2Z3Gx7PYwhN2QZAHKogCEwYBhgL/s1600/upgrade_axversion2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="226" data-original-width="952" height="148" src="https://2.bp.blogspot.com/-s4YzCenliA0/W9df3T-ylfI/AAAAAAACNUA/bG3oyYzv5VwmTT2Z3Gx7PYwhN2QZAHKogCEwYBhgL/s640/upgrade_axversion2.png" width="640" /></a></div>
<br />
The file should be placed on the same level as the Metadata-folder.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-bqgRUkc-ZjE/W9df3aK1ZPI/AAAAAAACNUI/z2YwJyZGhDIFMhbAozZ4gQOzGu-VO_LwwCEwYBhgL/s1600/upgrade_axversion3.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="379" data-original-width="608" height="199" src="https://1.bp.blogspot.com/-bqgRUkc-ZjE/W9df3aK1ZPI/AAAAAAACNUI/z2YwJyZGhDIFMhbAozZ4gQOzGu-VO_LwwCEwYBhgL/s320/upgrade_axversion3.png" width="320" /></a></div>
<br />
When the file is created, you can go ahead and run the code upgrade in LCS.<br />
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-g65fiNBJGKE/W9dm_HWRmFI/AAAAAAACNUY/RyQFiMxRIPQJCP_VQLVeJdUsyZvoGlpYgCLcBGAs/s1600/codeupgrade1.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="716" data-original-width="637" height="400" src="https://2.bp.blogspot.com/-g65fiNBJGKE/W9dm_HWRmFI/AAAAAAACNUY/RyQFiMxRIPQJCP_VQLVeJdUsyZvoGlpYgCLcBGAs/s400/codeupgrade1.png" width="355" /></a></div>
<br />
<br />
The code upgrade tool does a couple of things. You will get some reports and information about the work it discovers, based on whatever you have in your Trunk/Main. But the important thing it does is create a new folder in your repository, and this folder will contain the upgraded version of your code.<br />
That is right; it copies whatever is in Trunk/Main, puts it into another folder, and does a couple of additional things, like removing all the hotfixes you may have previously added to Trunk/Main.<br />
Why? Because you're going to a new version, and those old hotfixes either already exist in the target version, or you will need to reapply them using updates created specifically for the version you are going to. In fact, if you run the Code Upgrade process again and again, you will end up with as many copies as the number of times you run the tool. Don't worry, you're permitted to delete copies you do not want to keep.<br />
<br />
Oh, and the copy (or copies) are also marked as a branch from Trunk/Main. You can see this if you check the folder Properties from either the Releases folder or from Trunk/Main.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-z9AYUV_ZmE0/W9eGPLs8PPI/AAAAAAACNUo/ejQ6ufDsnvIAJreFrZtDCRnnNdIxjb21gCLcBGAs/s1600/codeupgrade2.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="373" data-original-width="1483" height="160" src="https://3.bp.blogspot.com/-z9AYUV_ZmE0/W9eGPLs8PPI/AAAAAAACNUo/ejQ6ufDsnvIAJreFrZtDCRnnNdIxjb21gCLcBGAs/s640/codeupgrade2.png" width="640" /></a></div>
<br />
<br />
Do you really need to run the code upgrade? Well, you could create an 8+ branch yourself, merge the modules you're keeping into your 8+ branch, and then try to build and resolve any issues. But consider that the code upgrade tool does that for you, and gives you some details on what it finds. You don't have to do much beyond the steps outlined above. Time saved, and cost saved.<br />
<br />
Actually, while LCS analyzes your code, you can start on the next step in the process, and I will talk about that <a href="https://yetanotherdynamicsaxblog.blogspot.com/2018/11/upgrade-from-7x-to-8-series-post-2.html">in the next post</a>.<br />
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-22227009461490268322018-08-06T13:07:00.001+02:002018-08-06T13:07:06.476+02:00Servicing fails on step 6 while updating AOSThere are some hotfixes that patch modules and packages which are only available on the "onebox" sandbox (Tier 1) environments. If you happen to add these hotfixes to your VSTS Main Branch, you will most likely end up trying to install these modules and packages on your Tier 2 (i.e. UAT) environments, and the deployment will most likely fail. The reason is that the package now has references to binaries which are not present on Tier 2.<br />
<br />
One example is the module DemoDataSuite. From the deployment log you will find the following statement:<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">Running command: C:\DynamicsTools\nuget.exe install -OutputDirectory "G:\AosService\PackagesLocalDirectory\InstallationRecords" dynamicsax-demodatasuite -Source G:\DeployablePackages\GUID\AOSService\Packages</span><br />
<br />
From the output, you would then find the following:<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">The running command stopped because the preference variable "ErrorActionPreference" or common parameter is set to Stop: Unable to resolve dependency 'dynamicsax-applicationfoundationformadaptor'.</span><br />
<div>
<br /></div>
<div>
It's true that DemoDataSuite depends on ApplicationFoundationFormAdaptor, and this FormAdaptor module is not present on the Tier 2 environment. </div>
<div>
<br /></div>
<div>
A simple solution is to change the default variables for the build definition and make sure the DemoDataSuite is excluded from package generation. </div>
<div>
<br /></div>
<div>
Instructions on how to exclude named packages from the build can be found here:</div>
<div>
<a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/dev-tools/exclude-test-packages">https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/dev-tools/exclude-test-packages</a></div>
<div>
<br /></div>
<div>
If you have made custom modules and packages, and are worried they might cause your servicing to fail in a similar way, you may want to check the references yourself. Have a look at this post for more information on how to do that:</div>
<div>
<a href="https://blogs.msdn.microsoft.com/axsupport/2017/04/10/resolving-a-missing-reference-during-package-deployment/">https://blogs.msdn.microsoft.com/axsupport/2017/04/10/resolving-a-missing-reference-during-package-deployment/</a></div>
tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-50760833651912665032018-06-15T22:36:00.002+02:002018-06-19T15:19:40.496+02:00How do you apply the latest updates on Dynamics 365 Finance and OperationsWhen you deploy an environment of Dynamics 365 for Finance and Operations, you are asked to pick a version of the application along with the platform version. At the point of writing, the latest application version is 8.0 and the platform version is 18. We know that application version 8.0 and onwards does not allow for any overlayering of standard code. We also know that version 7.3 (the version prior to 8.0) allows overlayering.<br />
<br />
If you choose to deploy 7.3, you will get the application as it was in December 2017, and you will have to go through the process of applying a fair number of hotfixes to get your application updated.<br />
<br />
In this post I will address this process. It is <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-latest-update">fairly well documented on docs</a>, but I suppose it helps to read it from various sources. This post focuses on the minimum effort needed to get updated. Depending on your scenario, it might be more complicated, for example if you have customizations, including extensions.<br />
<br />
<b>Before you leave this page, I should tell you there is a bonus part at the end of it.</b><br />
<br />
This process is in two parts:<br />
<ol>
<li>Binary Updates</li>
<li>Application Updates (or X++ Updates if you like)</li>
</ol>
<h2>
Binary Updates</h2>
This part can actually be done by a non-developer. It is fairly easy to complete, and should be safe.<br />
<br />
I start with updating a DEV environment, assuming it is aligned to the remaining environments (STAGE, PROD) when considering updates. From the environment page in LCS, I can tell there are lots of updates waiting to be installed.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-VYk0muWtiBk/WyQfrQFaVNI/AAAAAAACLo0/ViPDtOHEWzUZrZrt-wDuiAcUyWr6REkpACLcBGAs/s1600/update_tiles_pre.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="395" data-original-width="335" height="400" src="https://4.bp.blogspot.com/-VYk0muWtiBk/WyQfrQFaVNI/AAAAAAACLo0/ViPDtOHEWzUZrZrt-wDuiAcUyWr6REkpACLcBGAs/s400/update_tiles_pre.png" width="338" /></a></div>
<br />
<br />
I start by opening All Binary updates. Notice they are all already marked for download. You can't cherry pick these updates, you get them all. I could take only the platform updates instead, but I want everything updated, hence "All Binary updates".<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-glrPo-VVYmQ/WyQfzzdUzzI/AAAAAAACLo4/vLmAEqhC9MYRxgucKRekv0NL1brLt4icACLcBGAs/s1600/binary_updates_select.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="377" data-original-width="972" height="248" src="https://3.bp.blogspot.com/-glrPo-VVYmQ/WyQfzzdUzzI/AAAAAAACLo4/vLmAEqhC9MYRxgucKRekv0NL1brLt4icACLcBGAs/s640/binary_updates_select.png" width="640" /></a></div>
<br />
<br />
When you continue, notice that you do not actually download the update, rather it is saved back to asset library.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-714HLRVIWTU/WyQf_Epw6sI/AAAAAAACLpA/ygQNrmNDkf8ndQQHH0yE8AczvTINnD4swCLcBGAs/s1600/binary_create_package.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="448" data-original-width="354" height="400" src="https://3.bp.blogspot.com/-714HLRVIWTU/WyQf_Epw6sI/AAAAAAACLpA/ygQNrmNDkf8ndQQHH0yE8AczvTINnD4swCLcBGAs/s400/binary_create_package.png" width="315" /></a></div>
<br />
<br />
This may take a while, as the entire thing is a couple of GB in total. Allow the Asset Library to analyze the package before you continue.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-OPK5DsDx0lY/WyQgX-wd5FI/AAAAAAACLpQ/SoZspArOC04Cu4Pp-yuclIiMgfj8fsiQwCLcBGAs/s1600/binary_updates_assetlib.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="296" data-original-width="740" height="256" src="https://4.bp.blogspot.com/-OPK5DsDx0lY/WyQgX-wd5FI/AAAAAAACLpQ/SoZspArOC04Cu4Pp-yuclIiMgfj8fsiQwCLcBGAs/s640/binary_updates_assetlib.png" width="640" /></a></div>
<br />
<br />
When the package is ready, you can go ahead and run "Maintain" and "Apply Updates" from the environment page. Pick the Binary Update package and allow the Runbook to install it.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-h_Xh0btOnac/WyQivh4oupI/AAAAAAACLp0/updxr6DBSus6tzVrUauWSmQ8tnlSAMDLwCLcBGAs/s1600/apply_updates.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="106" data-original-width="458" height="147" src="https://1.bp.blogspot.com/-h_Xh0btOnac/WyQivh4oupI/AAAAAAACLp0/updxr6DBSus6tzVrUauWSmQ8tnlSAMDLwCLcBGAs/s640/apply_updates.png" width="640" /></a></div>
<br />
<br />
<i>NOTE! This process will seed the package to your environment. Make sure you have enough space available on your Service Volume. You also want to make sure nobody is running Visual Studio on the machine while it is serviced. If your VM runs with standard disks (HDD) instead of premium storage (SSD), then the copying of files may time out. If that happens, just press the "Resume" button on the environment page. You might also notice that the machine even reboots as part of the process.</i><br />
<br />
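Before you apply the package, a quick free-space check can save you a failed servicing run. A minimal sketch (the service volume letter varies between environments):<br />
<br />
<pre class="brush:powershell;"># List free space per drive in GB so you can verify the service volume has room for the package
Get-PSDrive -PSProvider FileSystem | Select-Object Name, @{Name='FreeGB'; Expression={[math]::Round($_.Free / 1GB, 1)}}
</pre>
<br />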
After the Binary Updates are installed, the tiles should hopefully report this.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-9zL2Y2seuwU/WyQgGvbHkJI/AAAAAAACLpI/uMQobNEP97IGNYb9NGCHJlXs7VUaw5K8ACLcBGAs/s1600/binary_updates_complete.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="191" data-original-width="305" height="250" src="https://2.bp.blogspot.com/-9zL2Y2seuwU/WyQgGvbHkJI/AAAAAAACLpI/uMQobNEP97IGNYb9NGCHJlXs7VUaw5K8ACLcBGAs/s400/binary_updates_complete.png" width="400" /></a></div>
<br />
<h2>
Application Updates</h2>
The next process is a bit more technical and needs the attention of someone with a developer role.<br />
<br />
Start by opening the Application Updates, not just the Critical Updates, but all of them. You will click "Select all" and press "Add". This will mark all of them for download.<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-tbmQlS6iGhk/WyQgjaPFpOI/AAAAAAACLpU/zTXt9TdH_3scnKyJ9Qi4LKP2ZWok88e8gCLcBGAs/s1600/app_updates_select.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="283" data-original-width="794" height="227" src="https://1.bp.blogspot.com/-tbmQlS6iGhk/WyQgjaPFpOI/AAAAAAACLpU/zTXt9TdH_3scnKyJ9Qi4LKP2ZWok88e8gCLcBGAs/s640/app_updates_select.png" width="640" /></a></div>
<br />
Since I had over one thousand KBs, it took several seconds for LCS to create the download, so I simply had to wait for it to be sent to the browser for download. It is not a big file. In my example all updates were around 80MB.<br />
<br />
The file is a Package.zip, so you will have to unblock it and unzip the file to get the actual HotfixPackageBundle.axscdppkg file. That is a nice and long file extension, which I can only assume means AX Source Code Deployable Package. ;-)<br />
<br />
<i>Tip! Did you know the file is actually a compressed file using zip? If you change the file extension to zip and unpack it, you will see all the packages and a manifest defining the dependencies between them. If you take one of the individual packages out and change its file extension to zip as well, you can get the details of that package: what files will be changed, how elements will be changed, and also what KB numbers are covered by that package. </i><br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-HMApEqrdE9I/WyQhIGXKKbI/AAAAAAACLpo/NYJ3rsxVdH4FfCUeC31V0OwX7YZiAwByQCLcBGAs/s1600/hotfixbundle.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="134" data-original-width="689" height="124" src="https://2.bp.blogspot.com/-HMApEqrdE9I/WyQhIGXKKbI/AAAAAAACLpo/NYJ3rsxVdH4FfCUeC31V0OwX7YZiAwByQCLcBGAs/s640/hotfixbundle.png" width="640" /></a></div>
<br />
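A small convenience sketch of that tip, if you want to script it (the paths are just examples matching the Bonus section further down):<br />
<br />
<pre class="brush:powershell;"># Copy the bundle with a .zip extension and unpack it to inspect the packages and the manifest
Copy-Item 'D:\Hotfix\HotfixPackageBundle.axscdppkg' 'D:\Hotfix\HotfixPackageBundle.zip'
Expand-Archive -Path 'D:\Hotfix\HotfixPackageBundle.zip' -DestinationPath 'D:\Hotfix\HotfixPackageBundle'
Get-ChildItem 'D:\Hotfix\HotfixPackageBundle'
</pre>
<br />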
<br />
Now, here comes the tricky part. While it is possible to apply this package through Visual Studio, I have found it safer to do the next part <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/install-metadata-hotfix-package">using the command line utility (SCDPBundleInstall.exe)</a>. Furthermore, I also ensure that there are no services or applications potentially locking any files in the Package Local Directory. That means no Visual Studio running, no IIS running and no Dynamics Ax Batch running. <b>Have a look at the script I have shared at the end of this article.</b><br />
<br />
The process is basically split in two steps:<br />
<br />
<ol>
<li><b>Prepare. </b>This process analyzes the content of the package and makes sure all files which will be changed in the Package Local Directory are put in source control (VSTS). That means add and edit commands, ensuring we can go back to how things were before installing the updates if we mess up. </li>
<li><b>Install</b>. This process analyzes the content of the package and actually changes the files in the Package Local Directory. Any files added or removed will also be put in the list of pending changes to source control (VSTS).</li>
</ol>
<br />
You cannot run them in one single operation; you need to run <b>Prepare </b>first, then commit the pending changes to VSTS. Then you run the tool again in <b>Install </b>mode to change the files.<br />
<br />
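In practice, with the helper function from the Bonus section at the end of this post, the flow looks like this (the file path and account name are examples):<br />
<br />
<pre class="brush:powershell;"># Step 1: prepare - adds the touched files to pending changes
InstallHotfixBundle -file 'D:\Hotfix\HotfixPackageBundle.axscdppkg' -vstsAccountName 'YOUR_VSTS_ACCOUNT'
# ...open Visual Studio, commit the pending changes, then close Visual Studio...
# Step 2: install - actually changes the files in the Package Local Directory
InstallHotfixBundle -file 'D:\Hotfix\HotfixPackageBundle.axscdppkg' -vstsAccountName 'YOUR_VSTS_ACCOUNT' -installMode $true
</pre>
<br />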
The prepare step takes less time, but it does take time. If you want to see what it is actually doing, the closest you get is having a look under your user's Temp folder. It will extract the packages and the dependency manifest under C:\Users\USERNAME\AppData\Local\Temp\SCDPBundleInstall. You will observe the tool extracting each package, looking at the manifest of the package, and checking what files the package will change. As part of this process, it also ensures the change is added to "pending changes". When the tool is done, it removes the folder.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-I9W2esaaMH4/WyQgwEjnANI/AAAAAAACLpc/_DOYCbuhQwoMO9h9qev8qkqjjEH9FtVSgCLcBGAs/s1600/bundleinstall_temp.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="156" data-original-width="698" height="142" src="https://4.bp.blogspot.com/-I9W2esaaMH4/WyQgwEjnANI/AAAAAAACLpc/_DOYCbuhQwoMO9h9qev8qkqjjEH9FtVSgCLcBGAs/s640/bundleinstall_temp.png" width="640" /></a></div>
<br />
<br />
When prepare is complete, you will have to open Visual Studio and commit the pending changes in order to "back up" the files to source control. Give this commit a good name, so you know it is related to preparing a hotfix bundle install. Starting Visual Studio will start IIS Express, and since you will close Visual Studio when your changes are committed, you will again have to ensure IIS is stopped before you run the install step.<br />
Remember, when Visual Studio is closed, IIS Express is normally stopped and IIS is started. This process can take a couple of seconds, so wait a few seconds before you continue with the install step.<br />
<br />
The install step takes the longest time, but also uses the same folder to extract and analyze the packages.<br />
<br />
Before you go ahead and commit all the updated standard modules to VSTS from DEV, you will need to make sure it builds. Your customizations may be broken, and your overlayering may have new conflicts that need to be resolved. All of this must be handled before the application updates are put in source control (VSTS).<br />
<br />
From there, you initiate the BUILD, take out the final artifact containing all the updated application modules (packages) and put it up in Asset Library.<br />
<br />
When you are ready to install in STAGE, start with the Binary Updates package in Asset Library, then install the Application Update package in Asset Library.<br />
<h2>
Some potential troubleshooting hints</h2>
I did run into some issues while doing this. All of which I had to resolve manually, and some were reported back to Microsoft Support.<br />
<br />
You might experience delays when applying the application updates due to limitations on how many transactions you are allowed to do against VSTS. These are just delays, so it should only mean things take longer.<br />
<br />
If you get errors while running the prepare or install, there might be something wrong with one of the packages, either due to an invalid manifest or dependencies. I've only seen this a few times, and it is not expected. If that happens, contact Microsoft Support.<br />
<br />
Also be aware that if one module fails to build for whatever reason, all modules depending on that module will most likely also throw errors. So don't fall off your chair if you get a high number of compilation errors. It might just be one error, creating a chain of other errors. Solve that one error, and the others go away.<br />
<h2>
Bonus</h2>
Using the SCDPBundleInstall tool is documented on docs, but for your convenience I will share a little PowerShell script that helps you run the prepare and install step. If you see any errors or improvements, I am grateful for all feedback.<br />
<br />
<pre class="brush:powershell;">function InstallHotfixBundle ([string] $file, [string]$vstsAccountName, [bool] $installMode = $false)
{
$VSTSURI = 'https://{0}.visualstudio.com' -f $vstsAccountName
# PLD is normally on C, J or K drive
$pldPath = "\AOSService\PackagesLocalDirectory"
$packageDirectory = "{0}:$pldPath" -f (('J','K')[$(Test-Path $("K:$pldPath"))],'C')[$(Test-Path $("C:$pldPath"))]
$command = ('prepare','install')[$installMode]
if ($installMode -eq $true)
{
Write-Host "INSTALL MODE!" -f Yellow
Get-Service w3svc | Stop-Service -Force
Get-Service DynamicsAxBatch | Stop-Service -Force
}
else
{
Write-Host "PREPAREMODE ONLY!" -f Yellow
}
if (Test-Path -Path $file)
{
Unblock-File -Path $file
$InstallUtility = '{0}\Bin\SCDPBundleInstall.exe' -f $packageDirectory
$params = @(
'-{0}' -f $command
'-packagepath={0}' -f $file
'-metadatastorepath={0}' -f $packageDirectory
'-tfsworkspacepath={0}' -f $packageDirectory
'-tfsprojecturi={0}/defaultcollection' -f $VSTSURI
)
& $InstallUtility $params 2>&1 | Out-String
if ($installMode -eq $true)
{
Write-Host "Hotfixes have been applied. Verify through build & sync, and commit the updates to VSTS!" -f Green
}
else
{
Write-Host "Touched elements ready for backup to VSTS. Commit changes before continue with install! Remember to close Visual Studio when you are done!" -f Green
}
}
else
{
throw 'No such file {0}' -f $file
}
}
# Remove the # to uncomment the line you want to run
#InstallHotfixBundle -file 'D:\Hotfix\HotfixPackageBundle.axscdppkg' -vstsAccountName 'YOUR_VSTS_ACCOUNT'
#InstallHotfixBundle -file 'D:\Hotfix\HotfixPackageBundle.axscdppkg' -vstsAccountName 'YOUR_VSTS_ACCOUNT' -installMode $true
</pre>
<br />
Enjoy!tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-72784506557957597642018-02-04T20:22:00.001+01:002018-02-04T20:46:08.254+01:00Installing a Software Deployable Package (SDP) using PowerShellNow the PowerShell involved here is miniscule, so don't expect much. But I'm going to post this either way.<br />
<br />
You will most likely install Software Deployable Packages using LCS, <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/deployment/apply-deployable-package-system" target="_blank">as outlined on the official docs</a>, so why would you need a PowerShell script for this? It so happens that you need to install the package manually if you for example need to upgrade from 7.2 to 7.3 of Operations.<br />
<br />
You download the package from LCS, unblock the zip-file, and extract it somewhere. I typically extract it to the Temporary Drive, the D-drive. Then you simply need to run this small script to initiate the installation locally.<br />
<br />
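But first, if you want to script the unblock-and-extract step as well, a minimal sketch could look like this (the file name is just an example):<br />
<br />
<pre class="brush:powershell;"># Unblock and extract the downloaded package to the D-drive
Unblock-File -Path 'D:\AXPlatformUpdate.zip'
Expand-Archive -Path 'D:\AXPlatformUpdate.zip' -DestinationPath 'D:\Update'
</pre>
<br />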
<pre class="brush: powershell;">#Requires -RunAsAdministrator
function InstallSDP()
{
$BinaryPackageLocation = 'D:\Update'
$Installer = $('{0}\AXUpdateInstaller.exe' -f $BinaryPackageLocation)
if (Test-Path -Path $Installer)
{
Set-Location $BinaryPackageLocation
& $Installer 'quickinstallall' 2>&1 | Out-String
}
else
{
Write-Output $("No update found in {0}" -f $BinaryPackageLocation)
}
}
InstallSDP
</pre>
<br />
Now, this will not work unless you have local admin rights. So yes, that means if you plan to run the 7.2 to 7.3 upgrade, you need to run it on a machine where you have local admin rights. This is pointed out in <a href="https://community.dynamics.com/ax/b/newdynamicsax/archive/2018/01/06/restricted-admin-access-on-development-vms-with-platform-update-12-what-you-need-to-know" target="_blank">question 14 of Robert Badawy's FAQ on the matter</a>.<br />
<br />
Notice I am using the "quickinstallall" command here, and this is only applicable for OneBox Developer VMs.<br />
<br />
So what about "devinstall"-command? You cannot use the devinstall for the upgrade package, <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/deployment/install-deployable-package">but you can use it in other scenarios where you install customization packages and hotfixes</a>. It was introduced in Platform Update 12, and is intended for use without the need for local admin privileges.<br />
<br />
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-68808503911379498332018-02-02T21:39:00.001+01:002018-02-02T21:43:53.365+01:00PowerShell script to toggle Maintenance modeIn order to change licence configurations on Operations, <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/sysadmin/maintenance-mode" target="_blank">you need to toggle maintenance mode on or off</a>. This can be done using a Setup tool, but on the development machines where we do not have local admin rights, the only solution would be to hack the database, like <a href="https://kurthatlevik.wordpress.com/2016/03/02/ax-rtw-hack-to-enable-configuration-mode/" target="_blank">Kurt Hatlevik shows us in this blog post</a>.<br />
<br />
In this post I will show how you can toggle maintenance mode on or off using PowerShell. The script is intended for OneBox environments. Just paste it into a new ps1 file for future use, or run it through PowerShell ISE.<br />
<br />
DISCLAIMER: Don't run this unless you are prepared to take the heat from restarting the entire web application. It stops and starts the web server.<br />
<br />
<pre class="brush:powershell;">function ToggleMaintenanceMode()
{
$parm = @{
ServerInstance = 'localhost'
Database = 'AxDB'
Query = "UPDATE SQLSYSTEMVARIABLES SET [VALUE] = IIF([VALUE]=1, 0, 1) WHERE PARM = 'CONFIGURATIONMODE'"
}
Get-Service "W3SVC" | Stop-Service -Force
Invoke-Sqlcmd @parm
Get-Service "W3SVC" | Start-Service
$parm.Query = "SELECT [VALUE] FROM SQLSYSTEMVARIABLES WHERE PARM = 'CONFIGURATIONMODE'"
$result = Invoke-Sqlcmd @parm
[int]$value = $result.Value
Write-Output "Configuration mode $(('disabled','enabled')[$value])"
}
ToggleMaintenanceMode
</pre>
<br />
The script shows you how you can easily run SQL commands, and even retrieve values back to your PowerShell script.<br />
<br />
Enjoy!<br />
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-72846621048915915512018-01-28T19:59:00.000+01:002018-01-29T19:32:56.457+01:00PowerShell script for synchronizing the database<b><span style="color: red;">UPDATE! Just a day after posting this article, I got some valuable feedback that made me rewrite the script. I kept the top part of the post as is, for historical reference, but the new script is below. Keep reading!</span></b><br />
<br />
In this post I want to share a neat way to use a PowerShell script for running the database synchronization. You probably already know you can run the database synchronization from within Visual Studio, and that is probably where most developers and consultants will do this operation, but sometimes you want the option to just run a script. Examples of this are <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/database/copy-database-from-sql-server-to-azure-sql" target="_blank">when you copy a database between environments</a>, or during <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-data-to-latest-update" target="_blank">upgrade operations</a>.<br />
<br />
Let's put the script out, and I'll discuss the parts below.<br />
<br />
<pre class="brush:powershell;">#Requires -RunAsAdministrator
Import-Module "$PSScriptRoot\AOSEnvironmentUtilities.psm1" -DisableNameChecking
Import-Module "$PSScriptRoot\CommonRollbackUtilities.psm1" -DisableNameChecking
function Run-DBSync()
{
$SyncToolExecutable = '{0}\bin\Microsoft.Dynamics.AX.Deployment.Setup.exe' -f $(Get-AosWebSitePhysicalPath)
$params = @(
'-bindir', $(Get-AOSPackageDirectory)
'-metadatadir' , $(Get-AOSPackageDirectory)
'-sqluser', $(Get-DataAccessSqlUsr)
'-sqlserver', $(Get-DataAccessDbServer)
'-sqldatabase', $(Get-DataAccessDatabase)
'-setupmode', 'sync'
'-syncmode', 'fullall'
'-isazuresql', 'false'
'-sqlpwd', $(Get-DataAccessSqlPwd)
)
& $SyncToolExecutable $params 2>&1 | Out-String
}
Run-DBSync
</pre>
<br />
Let's look at what this script does. The very first line is just a hint to the runtime that this script must be run in elevated mode. The reason is that it must get some information from the system that requires admin rights. Typically I also stop some services, like the Management Reporter Process Service, before I run the synchronization, and obviously a non-admin will struggle to do that.<br />
<br />
Notice that I have imported some modules, and you may be wondering where I got those. These PowerShell modules are part of the Software Deployable Packages, and either you can create one yourself, or simply download one of those made available by Microsoft in LCS. Extract the package and look under the following path, \AOSService\Scripts. Just grab the two files and make sure you save them alongside your script, like the example below:<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-bCuCg7biAgE/Wm4dV79d3KI/AAAAAAACIxE/FkT2xKq1CXY1Edpk0DOVctG1kHj203rLgCLcBGAs/s1600/Script_Files.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="214" data-original-width="538" height="158" src="https://1.bp.blogspot.com/-bCuCg7biAgE/Wm4dV79d3KI/AAAAAAACIxE/FkT2xKq1CXY1Edpk0DOVctG1kHj203rLgCLcBGAs/s400/Script_Files.jpg" width="400" /></a></div>
<br />
<br />
The rest is simply building the parameters for the synchronization operation, and running the tool that does the job. The output is sent to the host, so if you want to look at the result you may want to run this script in PowerShell ISE (Admin mode).<br />
<br />
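If you'd rather keep the output for later inspection, a tiny variation is to replace the final Run-DBSync call with a call that pipes the output to a log file (the path is just an example):<br />
<br />
<pre class="brush:powershell;">Run-DBSync | Out-File -FilePath 'C:\Temp\dbsync.log'
</pre>
<br />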
What is also neat, is that it will pick up the database credentials used for your environment, so you don't have to put those details in the script yourself.<br />
<br />
In any case, it is a neat study in how you can organize your script in such a way that you get the code all in one visible column. It's also a stepping stone to start building your own set of scripts to maintain your development environments.<br />
<br />
Finally, a small disclaimer: Microsoft may very well change how their PowerShell modules work in the future, so if that happens, the script above will have to change.<br />
<br />
<h3>
Updated script - no modules and works for non-admin</h3>
So here is a way to run the database synchronization without having to rely on the PowerShell modules and without having to have local admin rights. Remember this is limited to OneBox environment.<br />
<br />
<pre class="brush:powershell;">function Run-DBSync()
{
# Find the correct Package Local Directory (PLD)
$pldPath = "\AOSService\PackagesLocalDirectory"
$packageDirectory = "{0}:$pldPath" -f ('J','K')[$(Test-Path $("K:$pldPath"))]
$SyncToolExecutable = '{0}\bin\SyncEngine.exe' -f $packageDirectory
$connectionString = "Data Source=localhost; " +
"Integrated Security=True; " +
"Initial Catalog=AxDb"
$params = @(
"-syncmode=`"fullall`""
"-metadatabinaries=$packageDirectory"
"-connect=`"$connectionString`""
)
& $SyncToolExecutable $params 2>&1 | Out-String
}
Run-DBSync
</pre>
<br />
Notice how I feed the parameters to the executable here, in comparison to the Setup tool above. It is currently stated in the docs that you may want to use the Setup tool during <a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-data-to-latest-update" target="_blank">upgrade scenarios</a>.<br />
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-32646973029757218472018-01-13T18:09:00.001+01:002018-01-13T18:29:54.791+01:00List hotfixes using PowerShell in D365FO (AX7)You probably already know that you can open Visual Studio and from the "Dynamics 365" menu, under "Addins" and "Apply Hotfix", you will find a grid that lists all the hotfixes installed on your environment. The list can be copied and pasted into Excel if you need a better view and you need to filter and search the list. It works, but it could be a bit easier.<br />
<br />
In this post I will share a neat function you can use to list installed hotfixes using PowerShell. It is inspired by the <a href="https://blogs.msdn.microsoft.com/axsupport/2016/09/07/find-which-hot-fixes-kbs-you-have-installed-in-microsoft-dynamics-ax/" target="_blank">post from Microsoft Support (Thomas Treen)</a>, and I got some help from some of my fellow MVPs (shout out to <a href="http://dev.goshoom.net/en/" target="_blank">Martin Draab</a> and <a href="http://www.axdeveloperconnection.it/webapp/blog" target="_blank">Lane Swenka</a>).<br />
<br />
The function is as follows:<br />
<br />
<pre class="brush:powershell;">function Get-HotfixList()
{
# Find the correct Package Local Directory (PLD)
$pldPath = "\AOSService\PackagesLocalDirectory"
$packageDirectory = "{0}:$pldPath" -f ('J','K')[$(Test-Path $("K:$pldPath"))]
[array]$Updates = @()
# Get all updates XML
foreach ($packagefile in Get-ChildItem $packageDirectory\*\*\AxUpdate\*.xml)
{
[xml]$xml = Get-Content $packagefile
[string]$KBs = $xml.AxUpdate.KBNumbers.string
# One package may refer many KBs
foreach ($KB in $KBs -split " ")
{
[string]$package = $xml.AxUpdate.Name
$moduleFolder = $packagefile.Directory.Parent
$Updates += [PSCustomObject]@{
Module = $moduleFolder.Parent
Model = $moduleFolder
KB = $KB
Package = $package
Folder = $moduleFolder.FullName
}
}
}
return $Updates
}
</pre>
<br />
With this function, you can list out the hotfixes to a resizable, sortable and searchable grid like this:<br />
<br />
<pre class="brush:powershell;">Get-HotfixList | Out-GridView
</pre>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/--DP-KbjrTa4/WlpAtsCGTQI/AAAAAAACIcA/TSQ1Lsm6Hh4FotZVf6hbkOBEXzk1HHEUgCLcBGAs/s1600/OutGridView.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="672" data-original-width="1228" height="347" src="https://1.bp.blogspot.com/--DP-KbjrTa4/WlpAtsCGTQI/AAAAAAACIcA/TSQ1Lsm6Hh4FotZVf6hbkOBEXzk1HHEUgCLcBGAs/s640/OutGridView.jpg" width="640" /></a></div>
<br />
You can list out the hotfixes into a long string where each KB number is separated by a space. Then copy this string into LCS when searching for KBs you want to use in a Hotfix Bundle.<br />
<br />
<pre class="brush:powershell;">$list = Get-HotfixList | select KB | sort KB
$list = [string]::Join(" ", $list.KB)
$list
</pre>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-cDr0Y2cHkmw/WlpA4T2q-wI/AAAAAAACIcE/cyFAQNUVAloXfKsYFE7hj71iJfMFJVjcwCLcBGAs/s1600/ListHotfixes.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="321" data-original-width="1340" height="152" src="https://1.bp.blogspot.com/-cDr0Y2cHkmw/WlpA4T2q-wI/AAAAAAACIcE/cyFAQNUVAloXfKsYFE7hj71iJfMFJVjcwCLcBGAs/s640/ListHotfixes.jpg" width="640" /></a></div>
<br />
Obviously you can use the function to quickly search for a specific hotfix.<br />
<br />
<pre class="brush:powershell;">Get-HotfixList | where {$_.KB -eq "4055564"}
</pre>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-UoC1FsBHPFQ/WlpA-nXMscI/AAAAAAACIcI/jFV2TyqY7vQHwakmliBJ6NrQGVrwkfSdwCLcBGAs/s1600/ContentOfKB.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="330" data-original-width="1307" height="160" src="https://1.bp.blogspot.com/-UoC1FsBHPFQ/WlpA-nXMscI/AAAAAAACIcI/jFV2TyqY7vQHwakmliBJ6NrQGVrwkfSdwCLcBGAs/s640/ContentOfKB.jpg" width="640" /></a></div>
<br />
And one final example: when installing a hotfix bundle, one of the steps is to compile the patched modules, and while you can do a full compile of all modules in the application, you could also compile only the ones that were patched. To create a distinct list of modules, run the following statement.<br />
<br />
<pre class="brush:powershell;">Get-HotfixList | select module | sort module | Get-Unique -AsString
</pre>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-GXk8IB2JgJ8/WlpBF2Ax3oI/AAAAAAACIcM/O0RX0zfxSrEKwzQuqUDGNZJ9qEvebfj-QCLcBGAs/s1600/ListModules.jpg" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="609" data-original-width="455" height="320" src="https://1.bp.blogspot.com/-GXk8IB2JgJ8/WlpBF2Ax3oI/AAAAAAACIcM/O0RX0zfxSrEKwzQuqUDGNZJ9qEvebfj-QCLcBGAs/s320/ListModules.jpg" width="239" /></a></div>
<br />
A quick note on the Package Local Directory (PLD) path. In my script I shift between the K and J drives. I have only used this script on VMs in the cloud. If you need to run this where the PLD path is on some other drive, you will need to change that in the script.tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-81462441838750782612017-10-22T22:21:00.003+02:002017-10-23T17:08:48.028+02:00Use Azure Automation to start and stop your VMs on a scheduleThis post is long overdue; I have been meaning to post it for over a year. I did present an early version of this script at the AXUG event in Stuttgart, but since then the API around tags has changed, and it has also become very easy to solve authentication using Run as Accounts. The code I am sharing here works on the latest version of the modules, and I hope it will keep working for years to come.<br />
<br />
A few notes before I continue:<br />
<ul>
<li>I base this script on <a href="https://automys.com/library/asset/scheduled-virtual-machine-shutdown-startup-microsoft-azure" target="_blank">Automys' own code</a>, and it is heavily inspired by commits from other users in the community. I will refer to <a href="https://github.com/slapointe/Azure-Automation-Scheduled-VM-Shutdown" target="_blank">the project on GitHub</a>, where you will find the contributors and authors. </li>
<li>I've only tested and used the script for ARM Resources</li>
<li>I removed the references to credentials and certificates; the script relies on a "Run As Account" instead. Setting up a "Run As Account" in Azure is quick and easy.</li>
</ul>
<div>
You will find the Feature branch here:</div>
<div>
<a href="https://github.com/skaue/Azure-Automation-Scheduled-VM-Shutdown/tree/features/RunAsAccount">https://github.com/skaue/Azure-Automation-Scheduled-VM-Shutdown/tree/features/RunAsAccount</a></div>
<br />
<h3>
Setup</h3>
I recommend starting by creating a new Automation Account. Yes, you can probably reuse an existing one, but creating a new account does not incur additional costs, and you can get this up and running fairly quickly and easily just by following the steps in this blog post.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-kaoCuyxQ-NE/Wez8BliulPI/AAAAAAACGlY/TUU7iCXRtuYLlw4D0yWOKgdkBP02Q9sXACLcBGAs/s1600/AddAutomationAccount.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="1600" data-original-width="589" height="640" src="https://1.bp.blogspot.com/-kaoCuyxQ-NE/Wez8BliulPI/AAAAAAACGlY/TUU7iCXRtuYLlw4D0yWOKgdkBP02Q9sXACLcBGAs/s640/AddAutomationAccount.png" width="234" /></a></div>
<br />
<br />
Make sure you select "Yes" on the option of creating "Azure Run as Account". Let it create all the artifacts and while you wait you can read the rest of this post.<br />
<br />
When the Automation account is up and running, the next step is to create a new Runbook of type "PowerShell" - just straight up PowerShell, and no fancy stuff.<br />
<br />
Then you grab the script from my feature branch, which is based on the original trunk. You can either take the script from this post or take the latest from GitHub. I probably won't update this blog post with future versions of the script, but I might maintain the one on GitHub. I'll put a copy down below.<br />
<br />
With the script added and saved as a PowerShell Runbook, you now need to schedule it. This is where a small cost may be incurred, because the Runbook needs to run every hour. Yes - every hour. The free tier of Automation only allows a limited number of job minutes (500 minutes per month), and with the Runbook running every hour throughout the day, I believe it will stop running after roughly 20 days each month. The cost incurred when you exceed the free limit is, however, extremely low.<br />
<br />
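If you prefer scripting over clicking, the hourly schedule can also be created and linked to the runbook from PowerShell. Here is a minimal sketch, assuming the AzureRM.Automation module; the resource group, Automation account and runbook names are placeholders you should replace with your own:<br />
<br />
<pre class="brush:powershell;"># Placeholder names - replace with your own resource group, Automation account and runbook
$rg      = 'MyAutomationRG'
$account = 'MyAutomationAccount'
$runbook = 'AutoShutdownSchedule'

# Create an hourly schedule; the start time must be a few minutes in the future
$schedule = New-AzureRmAutomationSchedule -ResourceGroupName $rg -AutomationAccountName $account `
    -Name 'Hourly' -StartTime (Get-Date).AddMinutes(10) -HourInterval 1

# Link the schedule to the runbook
Register-AzureRmAutomationScheduledRunbook -ResourceGroupName $rg -AutomationAccountName $account `
    -RunbookName $runbook -ScheduleName $schedule.Name
</pre>
<br />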
With the script running every hour, you are ready to schedule "downtime", and this is easy: you simply tag either the VM or the Resource Group holding a collection of VMs.<br />
<br />
By tagging I mean you type the downtime you want for your resource into the VALUE of a specific TAG. The script looks for a tag named "AutoShutdownSchedule". An example value would be "20:00->06:00, Saturday, Sunday", and you can probably guess when the server will be shut down with that value... That is correct: every day between 8 pm in the evening and 6 am in the morning, plus all day Saturday and Sunday. You can imagine the flexibility this gives.<br />
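If you want to script the tagging too, here is a minimal sketch using the AzureRM cmdlets. The resource group and VM names are placeholders, and note that -Tag replaces the existing tag collection, so merge in any tags you want to keep:<br />
<br />
<pre class="brush:powershell;">$schedule = '20:00->06:00, Saturday, Sunday'

# Tag a whole resource group (the schedule then applies to the supported resources in it)
# NB: -Tag overwrites the existing tags on the group
Set-AzureRmResourceGroup -Name 'MyDevBoxes' -Tag @{ AutoShutdownSchedule = $schedule }

# ...or tag a single VM directly
Set-AzureRmResource -ResourceGroupName 'MyDevBoxes' -ResourceName 'MyDevVM' `
    -ResourceType 'Microsoft.Compute/virtualMachines' `
    -Tag @{ AutoShutdownSchedule = $schedule } -Force
</pre>
<br />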
<br />
<h3>
Added Features</h3>
In addition, the script is inspired by other nice ideas from the community, like providing a TimeZone for your schedule, to ensure your 8 pm is interpreted consistently when the script evaluates the value.<br />
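To verify that a schedule and time zone behave the way you expect before letting the runbook loose, you can start it manually in simulate mode. A minimal sketch, again with placeholder names:<br />
<br />
<pre class="brush:powershell;"># Placeholder names - replace with your own resource group, Automation account and runbook
Start-AzureRmAutomationRunbook -ResourceGroupName 'MyAutomationRG' `
    -AutomationAccountName 'MyAutomationAccount' -Name 'AutoShutdownSchedule' `
    -Parameters @{ Simulate = $true; Timezone = 'W. Europe Standard Time' }
</pre>
<br />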
<br />
Another added feature is the "NeverStart" value keyword, which ensures the resource is never started by the runbook. You can use this to schedule an automatic shutdown that does not trigger startup again after the schedule ends. An example is the value "20:00->21:00,NeverStart": this stops the resource at 8 pm, and when the Runbook runs again at 9 pm, the resource will not be started even though the schedule has ended.<br />
<br />
Finally, I want to comment on the added feature of disabling the schedule without removing it. If you provide an additional tag named "AutoShutdownDisabled" with a value of Yes/1/True, the resource is skipped. This means you can keep the schedule tag and temporarily disable the shutdown schedule altogether.<br />
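If you want to set the disable tag from PowerShell, here is a sketch that adds it to a single VM without dropping the tags already on it (placeholder names again):<br />
<br />
<pre class="brush:powershell;"># Read the current tags of the VM and add the disable tag (placeholder names)
$vm = Get-AzureRmResource -ResourceGroupName 'MyDevBoxes' -ResourceName 'MyDevVM' `
    -ResourceType 'Microsoft.Compute/virtualMachines'
$tags = $vm.Tags
if ($null -eq $tags) { $tags = @{} }
$tags['AutoShutdownDisabled'] = 'Yes'
Set-AzureRmResource -ResourceId $vm.ResourceId -Tag $tags -Force
</pre>
<br />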
<br />
<h3>
The script</h3>
<pre class="brush:powershell;"><#
.SYNOPSIS
This Azure Automation runbook automates the scheduled shutdown and startup of resources in an Azure subscription.
.DESCRIPTION
The runbook implements a solution for scheduled power management of Azure resources in combination with tags
on resources or resource groups which define a shutdown schedule. Each time it runs, the runbook looks for all
supported resources or resource groups with a tag named "AutoShutdownSchedule" having a value defining the schedule,
e.g. "10PM -> 6AM". It then checks the current time against each schedule entry, ensuring that resourcess with tags or in tagged groups
are deallocated/shut down or started to conform to the defined schedule.
This is a PowerShell runbook, as opposed to a PowerShell Workflow runbook.
This script requires the "AzureRM.Resources" modules which are present by default in Azure Automation accounts.
For detailed documentation and instructions, see:
CREDITS: Initial version credits goes to automys from which this script started :
https://automys.com/library/asset/scheduled-virtual-machine-shutdown-startup-microsoft-azure
.PARAMETER Simulate
If $true, the runbook will not perform any power actions and will only simulate evaluating the tagged schedules. Use this
to test your runbook to see what it will do when run normally (Simulate = $false).
.PARAMETER DefaultScheduleIfNotPresent
If provided, will set the default schedule to apply on all resources that don't have any scheduled tag value defined or inherited.
Description | Tag value
Shut down from 10PM to 6 AM UTC every day | 10pm -> 6am
Shut down from 10PM to 6 AM UTC every day (different format, same result as above) | 22:00 -> 06:00
Shut down from 8PM to 12AM and from 2AM to 7AM UTC every day (bringing online from 12-2AM for maintenance in between) | 8PM -> 12AM, 2AM -> 7AM
Shut down all day Saturday and Sunday (midnight to midnight) | Saturday, Sunday
Shut down from 2AM to 7AM UTC every day and all day on weekends | 2:00 -> 7:00, Saturday, Sunday
Shut down on Christmas Day and New Year's Day | December 25, January 1
Shut down from 2AM to 7AM UTC every day, and all day on weekends, and on Christmas Day | 2:00 -> 7:00, Saturday, Sunday, December 25
Shut down always - I don't want this VM online, ever | 0:00 -> 23:59:59
.PARAMETER TimeZone
Defines the Timezone used when running the runbook. "W. Europe Standard Time" by default.
Microsoft Time Zone Index Values:
https://msdn.microsoft.com/en-us/library/ms912391(v=winembedded.11).aspx
.EXAMPLE
For testing examples, see the documentation at:
https://automys.com/library/asset/scheduled-virtual-machine-shutdown-startup-microsoft-azure
.INPUTS
None.
.OUTPUTS
Human-readable informational and error messages produced during the job. Not intended to be consumed by another runbook.
#>
[CmdletBinding()]
param(
[parameter(Mandatory=$false)]
[bool]$Simulate = $false,
[parameter(Mandatory=$false)]
[string]$DefaultScheduleIfNotPresent,
[parameter(Mandatory=$false)]
[String] $Timezone = "W. Europe Standard Time"
)
$VERSION = '3.3.0'
$autoShutdownTagName = 'AutoShutdownSchedule'
$autoShutdownOrderTagName = 'ProcessingOrder'
$autoShutdownDisabledTagName = 'AutoShutdownDisabled'
$defaultOrder = 1000
$ResourceProcessors = @(
@{
ResourceType = 'Microsoft.ClassicCompute/virtualMachines'
PowerStateAction = { param([object]$Resource, [string]$DesiredState) (Get-AzureRmResource -ResourceId $Resource.ResourceId).Properties.InstanceView.PowerState }
StartAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'start' -Force }
DeallocateAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'shutdown' -Force }
},
@{
ResourceType = 'Microsoft.Compute/virtualMachines'
PowerStateAction = {
param([object]$Resource, [string]$DesiredState)
$vm = Get-AzureRmVM -ResourceGroupName $Resource.ResourceGroupName -Name $Resource.Name -Status
$currentStatus = $vm.Statuses | Where-Object Code -like 'PowerState*'
$currentStatus.Code -replace 'PowerState/',''
}
StartAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'start' -Force }
DeallocateAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'deallocate' -Force }
},
@{
ResourceType = 'Microsoft.Compute/virtualMachineScaleSets'
#since there is no way to get the status of a VMSS, we assume it is in the inverse state to force the action on the whole VMSS
PowerStateAction = { param([object]$Resource, [string]$DesiredState) if($DesiredState -eq 'StoppedDeallocated') { 'Started' } else { 'StoppedDeallocated' } }
StartAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'start' -Parameters @{ instanceIds = @('*') } -Force }
DeallocateAction = { param([string]$ResourceId) Invoke-AzureRmResourceAction -ResourceId $ResourceId -Action 'deallocate' -Parameters @{ instanceIds = @('*') } -Force }
}
)
# Define function to get current date using the TimeZone Parameter
function GetCurrentDate
{
return [system.timezoneinfo]::ConvertTime($(Get-Date),$([system.timezoneinfo]::GetSystemTimeZones() | ? id -eq $Timezone))
}
# Define function to check current time against specified range
function Test-ScheduleEntry ([string]$TimeRange)
{
# Initialize variables
$rangeStart, $rangeEnd, $parsedDay = $null
$currentTime = GetCurrentDate
$midnight = $currentTime.AddDays(1).Date
try
{
# Parse as range if contains '->'
if($TimeRange -like '*->*')
{
$timeRangeComponents = $TimeRange -split '->' | foreach {$_.Trim()}
if($timeRangeComponents.Count -eq 2)
{
$rangeStart = Get-Date $timeRangeComponents[0]
$rangeEnd = Get-Date $timeRangeComponents[1]
# Check for crossing midnight
if($rangeStart -gt $rangeEnd)
{
# If current time is between the start of range and midnight tonight, interpret start time as earlier today and end time as tomorrow
if($currentTime -ge $rangeStart -and $currentTime -lt $midnight)
{
$rangeEnd = $rangeEnd.AddDays(1)
}
# Otherwise interpret start time as yesterday and end time as today
else
{
$rangeStart = $rangeStart.AddDays(-1)
}
}
}
else
{
Write-Output "`tWARNING: Invalid time range format. Expects valid .Net DateTime-formatted start time and end time separated by '->'"
}
}
# Otherwise attempt to parse as a full day entry, e.g. 'Monday' or 'December 25'
else
{
# If specified as day of week, check if today
if([System.DayOfWeek].GetEnumValues() -contains $TimeRange)
{
if($TimeRange -eq (Get-Date).DayOfWeek)
{
$parsedDay = Get-Date '00:00'
}
else
{
# Skip detected day of week that isn't today
}
}
# Otherwise attempt to parse as a date, e.g. 'December 25'
else
{
$parsedDay = Get-Date $TimeRange
}
if($parsedDay -ne $null)
{
$rangeStart = $parsedDay # Defaults to midnight
$rangeEnd = $parsedDay.AddHours(23).AddMinutes(59).AddSeconds(59) # End of the same day
}
}
}
catch
{
# Record any errors and return false by default
Write-Output "`tWARNING: Exception encountered while parsing time range. Details: $($_.Exception.Message). Check the syntax of entry, e.g. '<starttime> -> <endtime>', or days/dates like 'Sunday' and 'December 25'"
return $false
}
# Check if current time falls within range
if($currentTime -ge $rangeStart -and $currentTime -le $rangeEnd)
{
return $true
}
else
{
return $false
}
} # End function Test-ScheduleEntry
# Function to handle power state assertion for resources
function Assert-ResourcePowerState
{
param(
[Parameter(Mandatory=$true)]
[object]$Resource,
[Parameter(Mandatory=$true)]
[string]$DesiredState,
[bool]$Simulate
)
$processor = $ResourceProcessors | Where-Object ResourceType -eq $Resource.ResourceType
if(-not $processor) {
throw ('Unable to find a resource processor for type ''{0}''. Resource: {1}' -f $Resource.ResourceType, ($Resource | ConvertTo-Json -Depth 5000))
}
# If should be started and isn't, start resource
$currentPowerState = & $processor.PowerStateAction -Resource $Resource -DesiredState $DesiredState
if($DesiredState -eq 'Started' -and $currentPowerState -notmatch 'Started|Starting|running')
{
if($Simulate)
{
Write-Output "`tSIMULATION -- Would have started resource. (No action taken)"
}
else
{
Write-Output "`tStarting resource"
& $processor.StartAction -ResourceId $Resource.ResourceId
}
}
# If should be stopped and isn't, stop resource
elseif($DesiredState -eq 'StoppedDeallocated' -and $currentPowerState -notmatch 'Stopped|deallocated')
{
if($Simulate)
{
Write-Output "`tSIMULATION -- Would have stopped resource. (No action taken)"
}
else
{
Write-Output "`tStopping resource"
& $processor.DeallocateAction -ResourceId $Resource.ResourceId
}
}
# Otherwise, current power state is correct
else
{
Write-Output "`tCurrent power state [$($currentPowerState)] is correct."
}
}
# Main runbook content
try
{
$currentTime = GetCurrentDate
Write-Output "Runbook started. Version: $VERSION"
if($Simulate)
{
Write-Output '*** Running in SIMULATE mode. No power actions will be taken. ***'
}
else
{
Write-Output '*** Running in LIVE mode. Schedules will be enforced. ***'
}
Write-Output "Current UTC/GMT time [$($currentTime.ToString('dddd, yyyy MMM dd HH:mm:ss'))] will be checked against schedules"
$Conn = Get-AutomationConnection -Name AzureRunAsConnection
$resourceManagerContext = Add-AzureRMAccount -ServicePrincipal -Tenant $Conn.TenantID -ApplicationId $Conn.ApplicationID -CertificateThumbprint $Conn.CertificateThumbprint
$resourceList = @()
# Get a list of all supported resources in subscription
$ResourceProcessors | % {
Write-Output ('Looking for resources of type {0}' -f $_.ResourceType)
$resourceList += @(Find-AzureRmResource -ResourceType $_.ResourceType)
}
$ResourceList | % {
if($_.Tags -and $_.Tags.ContainsKey($autoShutdownOrderTagName) ) {
$order = $_.Tags | % { if($_.ContainsKey($autoShutdownOrderTagName)) { $_.Item($autoShutdownOrderTagName) } }
} else {
$order = $defaultOrder
}
Add-Member -InputObject $_ -Name ProcessingOrder -MemberType NoteProperty -TypeName Integer -Value $order
}
$ResourceList | % {
if($_.Tags -and $_.Tags.ContainsKey($autoShutdownDisabledTagName) ) {
$disabled = $_.Tags | % { if($_.ContainsKey($autoShutdownDisabledTagName)) { $_.Item($autoShutdownDisabledTagName) } }
} else {
$disabled = '0'
}
Add-Member -InputObject $_ -Name ScheduleDisabled -MemberType NoteProperty -TypeName String -Value $disabled
}
# Get resource groups that are tagged for automatic shutdown of resources
$taggedResourceGroups = Find-AzureRmResourceGroup -Tag @{ "AutoShutdownSchedule" = $null }
$taggedResourceGroupNames = @($taggedResourceGroups | select -ExpandProperty Name)
Write-Output "Found [$($taggedResourceGroupNames.Count)] schedule-tagged resource groups in subscription"
if($DefaultScheduleIfNotPresent) {
Write-Output "Default schedule was specified, all non tagged resources will inherit this schedule: $DefaultScheduleIfNotPresent"
}
# For each resource, determine
# - Is it directly tagged for shutdown or member of a tagged resource group
# - Is the current time within the tagged schedule
# Then assert its correct power state based on the assigned schedule (if present)
Write-Output "Processing [$($resourceList.Count)] resources found in subscription"
foreach($resource in $resourceList)
{
$schedule = $null
if ($resource.ScheduleDisabled)
{
$disabledValue = $resource.ScheduleDisabled
if ($disabledValue -eq "1" -or $disabledValue -eq "Yes"-or $disabledValue -eq "True")
{
Write-Output "[$($resource.Name)]: `r`n`tIGNORED -- Found direct resource schedule with $autoShutdownDisabledTagName value: $disabledValue."
continue
}
}
# Check for direct tag or group-inherited tag
if($resource.Tags.Count -gt 0 -and $resource.Tags.ContainsKey($autoShutdownTagName) -eq $true)
{
# Resource has direct tag (possible for resource manager deployment model resources). Prefer this tag schedule.
$schedule = $resource.Tags.Item($autoShutdownTagName)
Write-Output "[$($resource.Name)]: `r`n`tADDING -- Found direct resource schedule tag with value: $schedule"
}
elseif($taggedResourceGroupNames -contains $resource.ResourceGroupName)
{
# resource belongs to a tagged resource group. Use the group tag
$parentGroup = $taggedResourceGroups | Where-Object Name -eq $resource.ResourceGroupName
$schedule = $parentGroup.Tags.Item($autoShutdownTagName)
Write-Output "[$($resource.Name)]: `r`n`tADDING -- Found parent resource group schedule tag with value: $schedule"
}
elseif($DefaultScheduleIfNotPresent)
{
$schedule = $DefaultScheduleIfNotPresent
Write-Output "[$($resource.Name)]: `r`n`tADDING -- Using default schedule: $schedule"
}
else
{
# No direct or inherited tag. Skip this resource.
Write-Output "[$($resource.Name)]: `r`n`tIGNORED -- Not tagged for shutdown directly or via membership in a tagged resource group. Skipping this resource."
continue
}
# Check that tag value was successfully obtained
if($schedule -eq $null)
{
Write-Output "[$($resource.Name) `- $($resource.ProcessingOrder)]: `r`n`tIGNORED -- Failed to get tagged schedule for resource. Skipping this resource."
continue
}
# Parse the ranges in the Tag value. Expects a string of comma-separated time ranges, or a single time range
$timeRangeList = @($schedule -split ',' | foreach {$_.Trim()})
# Check each range against the current time to see if any schedule is matched
$scheduleMatched = $false
$matchedSchedule = $null
$neverStart = $false #if NeverStart is specified in range, do not wake-up machine
foreach($entry in $timeRangeList)
{
if((Test-ScheduleEntry -TimeRange $entry) -eq $true)
{
$scheduleMatched = $true
$matchedSchedule = $entry
break
}
if ($entry -eq "NeverStart")
{
$neverStart = $true
}
}
Add-Member -InputObject $resource -Name ScheduleMatched -MemberType NoteProperty -TypeName Boolean -Value $scheduleMatched
Add-Member -InputObject $resource -Name MatchedSchedule -MemberType NoteProperty -TypeName Boolean -Value $matchedSchedule
Add-Member -InputObject $resource -Name NeverStart -MemberType NoteProperty -TypeName Boolean -Value $neverStart
}
foreach($resource in $resourceList | Group-Object ScheduleMatched) {
if($resource.Name -eq '') {continue}
$sortedResourceList = @()
if($resource.Name -eq $false) {
# meaning we start resources, lower to higher
$sortedResourceList += @($resource.Group | Sort ProcessingOrder)
} else {
$sortedResourceList += @($resource.Group | Sort ProcessingOrder -Descending)
}
foreach($resource in $sortedResourceList)
{
# Enforce desired state for group resources based on result.
if($resource.ScheduleMatched)
{
# Schedule is matched. Shut down the resource if it is running.
Write-Output "[$($resource.Name) `- P$($resource.ProcessingOrder)]: `r`n`tASSERT -- Current time [$currentTime] falls within the scheduled shutdown range [$($resource.MatchedSchedule)]"
Add-Member -InputObject $resource -Name DesiredState -MemberType NoteProperty -TypeName String -Value 'StoppedDeallocated'
}
else
{
if ($resource.NeverStart)
{
Write-Output "[$($resource.Name)]: `tIGNORED -- Resource marked with NeverStart. Keeping the resources stopped."
Add-Member -InputObject $resource -Name DesiredState -MemberType NoteProperty -TypeName String -Value 'StoppedDeallocated'
}
else
{
# Schedule not matched. Start resource if stopped.
Write-Output "[$($resource.Name) `- P$($resource.ProcessingOrder)]: `r`n`tASSERT -- Current time falls outside of all scheduled shutdown ranges. Start resource."
Add-Member -InputObject $resource -Name DesiredState -MemberType NoteProperty -TypeName String -Value 'Started'
}
}
Assert-ResourcePowerState -Resource $resource -DesiredState $resource.DesiredState -Simulate $Simulate
}
}
Write-Output 'Finished processing resource schedules'
}
catch
{
$errorMessage = $_.Exception.Message
throw "Unexpected exception: $errorMessage"
}
finally
{
Write-Output "Runbook finished (Duration: $(('{0:hh\:mm\:ss}' -f ((GetCurrentDate) - $currentTime))))"
}
</pre>
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-24633482621683286152017-10-02T18:45:00.000+02:002017-10-04T16:40:01.693+02:00Importing users in D365 Operations using ExcelLet me start off by admitting I was initially thinking about naming this post "Importing users in AX7 using Excel", so there, now this post suddenly became a little bit more "searcher friendly".<br />
<br />
In this post I will show how easily you can connect to your Dynamics 365 Operations instance using Excel. Before I begin, let me just remind you that importing users from Azure Active Directory is perhaps easier and quicker, so this post is simply to show that it is also possible to import users using Excel with the Dynamics Office Add-in.<br />
<br />
You may have seen the Data Entity "System User" (SystemUserEntity), you may have tried using it to add users, and furthermore you may also have seen the error "A row created in data set SystemUser was not published. Error message: Write failed for table row of type 'SystemUserEntity'. Infolog: Error: Error in getting SID."<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-sTXx0yIIkow/WdJp6qzJYsI/AAAAAAACGiU/UOhOHWq16yoN-2N5Sbxduvk0SzHH5OvAACLcBGAs/s1600/0-ErrorAdding.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="149" data-original-width="315" height="188" src="https://3.bp.blogspot.com/-sTXx0yIIkow/WdJp6qzJYsI/AAAAAAACGiU/UOhOHWq16yoN-2N5Sbxduvk0SzHH5OvAACLcBGAs/s400/0-ErrorAdding.png" width="400" /></a></div>
<br />
You will get the same error through Excel if you do not provide some additional columns and information while trying to create a user through that Data Entity.<br />
<br />
You can either start by opening Excel, installing the Dynamics Office Add-in and connecting it to the target instance, or you can open the list of users directly on the instance and open the list in Excel from there. Either way, you should end up with a view where you have the System User list in your spreadsheet.<br />
<br />
The next step is to modify the Design of the view. Click the Design link first.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-VYn71zFNPr0/WdJqC_wwwFI/AAAAAAACGiY/2PIfCDvqYrII62XxSU7ooWsBD2yxgJcCwCLcBGAs/s1600/1-Design.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="325" data-original-width="338" height="383" src="https://4.bp.blogspot.com/-VYn71zFNPr0/WdJqC_wwwFI/AAAAAAACGiY/2PIfCDvqYrII62XxSU7ooWsBD2yxgJcCwCLcBGAs/s400/1-Design.png" width="400" /></a></div>
<br />
<br />
Then edit the System User table.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-TNX4XERguYI/WdJqd8mz41I/AAAAAAACGic/44jWqGc3Xd8RAk8YLgRj8Cxoee8xI-u7gCLcBGAs/s1600/2-SystemUserTable.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="121" data-original-width="302" height="160" src="https://2.bp.blogspot.com/-TNX4XERguYI/WdJqd8mz41I/AAAAAAACGic/44jWqGc3Xd8RAk8YLgRj8Cxoee8xI-u7gCLcBGAs/s400/2-SystemUserTable.png" width="400" /></a></div>
<br />
<br />
Then add the following columns: Enabled, AccountType and Alias (Email).<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-yRy8jVCaGp8/WdJqkaL8kAI/AAAAAAACGig/fbtzRt-XJVAD9zY2u1Ri4ky3rAeaYdAngCLcBGAs/s1600/3-AddFields.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="514" data-original-width="338" height="400" src="https://3.bp.blogspot.com/-yRy8jVCaGp8/WdJqkaL8kAI/AAAAAAACGig/fbtzRt-XJVAD9zY2u1Ri4ky3rAeaYdAngCLcBGAs/s400/3-AddFields.png" width="262" /></a></div>
<br />
<br />
Save the design changes and ensure you update the view so the added columns are populated with data.<br />
<br />
You will notice the Type (AccountType) and Alias (Email) carry important information for how the user authenticates, in addition to the Provider column. With these columns properly populated, you should be able to add multiple rows and hit a single "Publish" from within Excel.<br />
<br />
Given this, you can have two Excel instances open, connected to two different environments, and then copy users over from a source to a target using Excel, as long as all the columns are available and in the same order, of course.<br />
<br />
This post should also give you some clue to how you can use Data Management to populate a system with users through a Data Package, if that is your preference.<br />
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-81253305425496835512017-09-25T15:23:00.002+02:002017-09-25T15:55:25.508+02:00Initial steps to troubleshoot failed environment servicingOn the topic of patching and updating an existing D365 Operations environment I will refer to the online documentation.<br />
<ul>
<li><a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-latest-platform-update" target="_blank">Upgrade Finance and Operations to the latest platform update</a></li>
<li><a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/migration-upgrade/upgrade-latest-update" target="_blank">Process for moving to the latest update of Finance and Operations</a></li>
<li><a href="https://docs.microsoft.com/en-us/dynamics365/unified-operations/dev-itpro/deployment/apply-deployable-package-system" target="_blank">Apply a deployable package to a Finance and Operations environment</a></li>
</ul>
<div>
There are also some great community posts that aim to help you, which you may want to check out.</div>
<div>
<ul>
<li><a href="http://daxmusings.codecrib.com/2017/06/upgrades-updates-and-hotfixes-in-ax7.html" target="_blank">Upgrades, Updates and Hotfixes in AX7</a></li>
<li><a href="http://dev.goshoom.net/en/2016/11/installing-deployable-packages-with-powershell/" target="_blank">Installing deployable packages with Powershell</a></li>
</ul>
<div>
I expect more posts to show up. As of writing this, installing updates can be a bit tedious and cumbersome. </div>
<div>
<br /></div>
<div>
I will use this post to share a recent Platform Update that failed. A Platform Update is expected to be a fairly straightforward and safe operation: you simply import the update to your assets in LCS and apply it to your environment (assuming you're running your environment in the cloud). I will not discuss On-Premise in this post. </div>
</div>
<div>
<br /></div>
<div>
I had an environment running application 1611 with Platform Update 7, and I was trying to install Platform Update 10. After it failed on several attempts, I started to investigate why. </div>
<div>
<br /></div>
<div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-ppMTZ2AahzI/Wcj7lGU9WpI/AAAAAAACGeg/BH0VbCyZXCooZWygpXEvEcMs3i6Lqe6awCLcBGAs/s1600/Email_From_LCS.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="179" data-original-width="484" height="147" src="https://4.bp.blogspot.com/-ppMTZ2AahzI/Wcj7lGU9WpI/AAAAAAACGeg/BH0VbCyZXCooZWygpXEvEcMs3i6Lqe6awCLcBGAs/s400/Email_From_LCS.png" width="400" /></a></div>
</div>
<div>
<br /></div>
<div>
Here are the steps I took.</div>
<div>
<br /></div>
<div>
1) Identify which step failed. In my case it was step 13. (Not exactly my lucky number)</div>
<div>
<br /></div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-IGqLjCLxlZc/Wcj7t_6e57I/AAAAAAACGek/7YvQX2bDLB8E5wPfy62KgL0-M7BFrveAgCLcBGAs/s1600/LCS_Failed.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="387" data-original-width="1018" height="242" src="https://3.bp.blogspot.com/-IGqLjCLxlZc/Wcj7t_6e57I/AAAAAAACGek/7YvQX2bDLB8E5wPfy62KgL0-M7BFrveAgCLcBGAs/s640/LCS_Failed.png" width="640" /></a></div>
<div>
<br /></div>
<div>
<br /></div>
<div>
2) Find the runbook output (normally under C:\RunbookOutput) and locate the PowerShell script that fails. I simply searched the log for "&lt;id&gt;13".</div>
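If the runbook output is large, you can let PowerShell do the searching for you. A minimal sketch; the folder is the default runbook output location and the step id is just the one from this example:<br />
<br />
<pre class="brush:powershell;"># Search all runbook output files for step 13 and show the matching lines
Get-ChildItem 'C:\RunbookOutput' -Recurse -File |
    Select-String -Pattern '&lt;id&gt;13' -SimpleMatch |
    Select-Object Filename, LineNumber, Line
</pre>
<br />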
<div>
<br /></div>
<div>
3) Open PowerShell ISE in Admin mode and open the PowerShell Script. You will find the script in the J:\DeployablePackages folder, and you can match the GUID from the log with the Runbook folder. The Scripts will be located in a standardized folder path.</div>
<div>
<br /></div>
<div>
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-3ln2vlfHLBw/Wcj70gJcKmI/AAAAAAACGeo/Ijz0sXOOnWEvtUkqx7g8J-oyspMlW-kRgCLcBGAs/s1600/Find_PS_Script.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="320" data-original-width="1177" height="174" src="https://2.bp.blogspot.com/-3ln2vlfHLBw/Wcj70gJcKmI/AAAAAAACGeo/Ijz0sXOOnWEvtUkqx7g8J-oyspMlW-kRgCLcBGAs/s640/Find_PS_Script.png" width="640" /></a></div>
</div>
<div>
<br /></div>
<div>
4) Run the script and let it fail. From there you can add breakpoints, run it again and step through it to see why it failed. Use whatever you find as information when you contact Microsoft Support. <b>Some updates fail when they should not, and it is important that anyone with a support agreement reports their findings back to Microsoft. </b></div>
<div>
<br /></div>
<div>
Now, in my particular case, the script did not fail when I ran it manually. It succeeded. I can only guess why that is, but after going back to LCS and letting the update "Resume", it eventually finished all the upgrade steps successfully. </div>
<div>
<br /></div>
<div>
In any case, the initial steps above can help you push through a failing update, and potentially lead you to the answer to why an update unexpectedly failed. </div>
tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-36620623097429807452017-08-23T15:26:00.001+02:002017-08-23T15:43:32.477+02:00Consider changing your password Pop UpCurrently, the machines deployed through LCS run with an Account Policy where passwords have a maximum age of 42 days. Interestingly, you should not change the password on these servers, <a href="https://blogs.msdn.microsoft.com/lcs/2017/06/30/guidelines-for-environments-in-microsoft-subscription/" target="_blank">according to this statement of guidelines</a>.<br />
<br />
So if you get annoyed by the reminder to change the password and do not plan to republish the box any time soon, why not go ahead and get rid of the pop up.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-eqJKdPC__F4/WZ2CAhLLMZI/AAAAAAACGa0/NkKhTzo7xtQ9oGxEYC9qlEEzCuCSvfMZwCLcBGAs/s1600/popup.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="121" data-original-width="337" src="https://4.bp.blogspot.com/-eqJKdPC__F4/WZ2CAhLLMZI/AAAAAAACGa0/NkKhTzo7xtQ9oGxEYC9qlEEzCuCSvfMZwCLcBGAs/s1600/popup.png" /></a></div>
<br />
Click the start button and type in "local". You should find the Local Security Policy Console.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-WVNwG0a4aaY/WZ2CEkUi-II/AAAAAAACGa4/90r01ohDFZgAZ3YiVvY-r1FSipY1ihEVwCLcBGAs/s1600/localsecpolicy.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="208" data-original-width="314" src="https://2.bp.blogspot.com/-WVNwG0a4aaY/WZ2CEkUi-II/AAAAAAACGa4/90r01ohDFZgAZ3YiVvY-r1FSipY1ihEVwCLcBGAs/s1600/localsecpolicy.png" /></a></div>
<br />
<br />
From there it is just a matter of changing the maximum password age to something other than 42, or simply setting it to 0 for "never expire".<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-fHt36NO07uo/WZ2CIFYQE0I/AAAAAAACGa8/oKNZFlIX7GkiCA3r6BCm_78LJOV5a9-0gCLcBGAs/s1600/ChangePolicyMaxPassAge.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="357" data-original-width="799" height="285" src="https://2.bp.blogspot.com/-fHt36NO07uo/WZ2CIFYQE0I/AAAAAAACGa8/oKNZFlIX7GkiCA3r6BCm_78LJOV5a9-0gCLcBGAs/s640/ChangePolicyMaxPassAge.png" width="640" /></a></div>
<br />
<br />
Quick and easy.<br />
<br />
Alternatively you can use a Command Prompt (Run as Admin) with the statement:<br />
<br />
<span style="font-family: "courier new" , "courier" , monospace;">net accounts /maxpwage:unlimited</span>tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-73922609960894110382017-08-09T11:37:00.000+02:002017-08-09T11:37:04.058+02:00Excel Applet not loading when working with D365 Office Add-inThis post is somewhat <a href="https://ievgensaxblog.wordpress.com/2017/03/21/d365o-issues-with-microsoft-dynamics-app-for-office-setup/" target="_blank">related to the post by Ievgen (AX MVP) on the Office Word integration not working with Dynamics 365 for Finance and Operations (Enterprise Edition)</a>.<br />
<br />
If you try to connect Excel to an existing environment using the Microsoft Dynamics Office Add-in and all you see is the text "Load applets" after signing in, then it might very well be because the applets need to be initialized from within the environment.<br />
<br />
If you click the little flag at the bottom right, you will be able to open the messages and see the error "No applet registrations found".<br />
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-F4OYUeuUZCU/WYrXBQuIQYI/AAAAAAACGaM/hAJOL1JzvDkl-Z3p2-HXi1S75GTcQ-mRgCLcBGAs/s1600/error_message.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="321" data-original-width="622" height="206" src="https://3.bp.blogspot.com/-F4OYUeuUZCU/WYrXBQuIQYI/AAAAAAACGaM/hAJOL1JzvDkl-Z3p2-HXi1S75GTcQ-mRgCLcBGAs/s400/error_message.png" width="400" /></a></div>
<br />
The solution is simple. Open the D365 environment in your favorite browser (assuming your favorite browser is on the list of compatible browsers - hehe) and either search for the form (i.e. type in "Office") or navigate directly through System Administration, Setup, Office app parameters.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-uQAWOTLNrWI/WYrXH-yZkRI/AAAAAAACGaQ/MdTqphttVQsLzwUp_bEkSw8LpjotwYM7wCLcBGAs/s1600/office_search.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="87" data-original-width="345" height="80" src="https://4.bp.blogspot.com/-uQAWOTLNrWI/WYrXH-yZkRI/AAAAAAACGaQ/MdTqphttVQsLzwUp_bEkSw8LpjotwYM7wCLcBGAs/s320/office_search.png" width="320" /></a></div>
<br />
If you see an empty grid, then the settings have not been initialized, and that is the problem. Most likely you are missing settings for Office apps in general, so go ahead and initialize the parameters for all the grids accordingly.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-z0qpO4rXGis/WYrXNY5VW6I/AAAAAAACGaU/kwL3SFv6PUQLNpGciH05UxlGVSO4IWKswCLcBGAs/s1600/Registered_applets.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="286" data-original-width="792" height="230" src="https://1.bp.blogspot.com/-z0qpO4rXGis/WYrXNY5VW6I/AAAAAAACGaU/kwL3SFv6PUQLNpGciH05UxlGVSO4IWKswCLcBGAs/s640/Registered_applets.png" width="640" /></a></div>
<br />
Head back to Excel and make it reload the applets (simply try adding a trailing slash to the URL). Hopefully you will now get the expected result.tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-86952023988573906712017-07-09T21:20:00.001+02:002017-07-09T21:20:14.792+02:00Error when installing Reporting Extensions for AX2012 R3 on SQL Server 2016<a href="http://yetanotherdynamicsaxblog.blogspot.com/2017/04/error-when-installing-reporting.html" target="_blank">In my previous post</a> I wrote about installing Reporting Extensions for AX2012 R3 on SQL Server 2014. In this post I want to emphasize that the same hotfix needed for SQL 2014 is also needed for SQL 2016.<br />
The error behaves slightly differently on SQL 2016 if you do not have the patch. The setup experience simply crashes during install, and while the components are ticked as "installed" the next time you run setup, the installation is only "half-baked". You need to start over, this time with the hotfix ready.<br />
<br />
Here is a screenshot of the installer crash with "AxSetup.exe has stopped working". Ignore that it is on the SSAS step, I simply chose to install both extensions at the same time. The error actually relates to Reporting Extensions.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-Kv520YL0qls/WWKAfp9533I/AAAAAAACGXk/8xU8CvBlFIw6efMZF6sh1E7i-gwsQki8ACLcBGAs/s1600/setup_crash_sql2016.png" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" data-original-height="580" data-original-width="746" height="310" src="https://4.bp.blogspot.com/-Kv520YL0qls/WWKAfp9533I/AAAAAAACGXk/8xU8CvBlFIw6efMZF6sh1E7i-gwsQki8ACLcBGAs/s400/setup_crash_sql2016.png" width="400" /></a></div>
<br />
<br />
And if you open the setup logs for further inspection, you will see it ends while trying to setup the SSRS bits. Here is an excerpt from the install log:<br />
<br />
<span style="font-family: Courier New, Courier, monospace;">2017-07-05 11:30:56Z<span style="white-space: pre;"> </span>Setting the SQL Server Reporting Services service account to the Microsoft Dynamics AX .Net Business Connector account.</span><br />
<span style="font-family: Courier New, Courier, monospace;">2017-07-05 11:30:56Z<span style="white-space: pre;"> </span>Generating database rights script.</span><br />
<span style="font-family: Courier New, Courier, monospace;">2017-07-05 11:30:56Z<span style="white-space: pre;"> </span>Opening connection to the database using the connection string server=SERVER\INSTANCE;Integrated Security=SSPI.</span><br />
<span style="font-family: Courier New, Courier, monospace;">2017-07-05 11:30:56Z<span style="white-space: pre;"> </span>Writing the database rights script to C:\Users\USERID\AppData\Local\Temp\3\tmpADC0.tmp.</span><br />
<span style="font-family: Courier New, Courier, monospace;">2017-07-05 11:30:56Z<span style="white-space: pre;"> </span>Executing database rights script.</span><br />
<div>
<br /></div>
<br />
I got this error even though the installation was slipstreamed with CU12, which is a later version than the hotfix.<br />
<br />
So if you're planning on installing these bits for SQL 2016 (or SQL 2014), do yourself the favor of downloading KB3216898 and slipstreaming your install by extracting it into your installation's Update folder.<br />
<br />
Here is the link, again: <a href="https://na01.safelinks.protection.outlook.com/?url=https%3A%2F%2Ffix.lcs.dynamics.com%2FIssue%2FResolved%3Fkb%3D3216898%26bugId%3D3800976&data=02%7C01%7Cv-prsg%40064d.mgd.microsoft.com%7C89ef1e7fe10c4232011408d46d356d71%7C72f988bf86f141af91ab2d7cd011db47%7C1%7C0%7C636253523632496506&sdata=HwH4EtVTI1eaEkEx3DLyoCw1ocOyYrpOtR4LQgj71%2FE%3D&reserved=0" style="background-color: white; color: #29aae1; font-family: Calibri, sans-serif; font-size: 11pt;">https://fix.lcs.dynamics.com/Issue/Resolved?kb=3216898&bugId=3800976</a>tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-30537424783539349122017-04-20T15:55:00.000+02:002017-04-20T15:55:06.318+02:00Error when installing Reporting Extensions for AX2012 R3 on SQL 2014 SSRSHi<br />
<br />
I could not really find any posts on this out there, so I decided to just share this here.<br />
<br />
You may experience installation errors when you try to install Reporting Extensions for AX2012 R3 on SQL 2014. The setup crashes internally, rolls back the installation and fails.<br />
In the installation log you will see the error "Version string portion was too short or too long".<br />
<br />
The solution is available on LCS as a downloadable hotfix KB3216898 (released 10th of January 2017) here:<br /><span style="font-family: "Times New Roman", serif; font-size: 12pt;"><a href="https://na01.safelinks.protection.outlook.com/?url=https%3A%2F%2Ffix.lcs.dynamics.com%2FIssue%2FResolved%3Fkb%3D3216898%26bugId%3D3800976&data=02%7C01%7Cv-prsg%40064d.mgd.microsoft.com%7C89ef1e7fe10c4232011408d46d356d71%7C72f988bf86f141af91ab2d7cd011db47%7C1%7C0%7C636253523632496506&sdata=HwH4EtVTI1eaEkEx3DLyoCw1ocOyYrpOtR4LQgj71%2FE%3D&reserved=0" style="font-family: Calibri, sans-serif; font-size: 11pt;">https://fix.lcs.dynamics.com/Issue/Resolved?kb=3216898&bugId=3800976</a></span><br />
<br />
Unpack the content of the hotfix and slipstream it as part of your AX2012 R3 installation and run the installation again. Now it will work.<br />
<br />
Just to make sure people find this post if they search for the errors, I'll add the full call stack below:<br />
<br />
<span style="font-family: Courier New, Courier, monospace; font-size: x-small;">An error occurred during setup of Reporting Services extensions.</span><br />
<span style="font-family: Courier New, Courier, monospace; font-size: x-small;">Reason: Version string portion was too short or too long.</span><br />
<span style="font-family: Courier New, Courier, monospace; font-size: x-small;">System.ArgumentException: Version string portion was too short or too long.</span><br />
<span style="font-family: Courier New, Courier, monospace; font-size: x-small;"><span class="Apple-tab-span" style="white-space: pre;"> </span> at System.Version..ctor(String version)</span><br />
<span style="font-family: Courier New, Courier, monospace; font-size: x-small;"><span class="Apple-tab-span" style="white-space: pre;"> </span> at Microsoft.Dynamics.AX.Framework.Reporting.Shared.SrsWmi.get_ReportManagerUrl()</span><br />
<span style="font-family: Courier New, Courier, monospace; font-size: x-small;"><span class="Apple-tab-span" style="white-space: pre;"> </span> at Microsoft.Dynamics.Setup.ReportsServerInstaller.GetOrCreateServerConfiguration(String instanceName, String sharePointServiceApplicationSite, Boolean& createdConfiguration)</span><br />
<span style="font-family: Courier New, Courier, monospace; font-size: x-small;"><span class="Apple-tab-span" style="white-space: pre;"> </span> at Microsoft.Dynamics.Setup.Components.ReportingServicesExtensions.InstallConfigurationFiles(String instanceName, String sharePointServiceApplicationSite, Boolean& createdConfiguration)</span><br />
<span style="font-family: Courier New, Courier, monospace; font-size: x-small;"><span class="Apple-tab-span" style="white-space: pre;"> </span> at Microsoft.Dynamics.Setup.Components.ReportingServicesExtensions.RunReportingSetupManagerDeploy()</span><br />
<div>
<br /></div>
<br />
<br />
<br />
<br />
<br />
<br />tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-82400638364205722872016-12-26T18:13:00.000+01:002016-12-26T18:13:33.254+01:00Managing your Azure Subscriptions created through CSP portal<i>Let me start off with a disclaimer: Microsoft may change this behavior, which would render this post obsolete. In that case I'll try to come back and make the necessary amendments. </i><br />
<br />
If you have worked with managing your Azure resources through PowerShell, you will notice that Azure Subscriptions created through the <a href="https://partner.microsoft.com/en-US/cloud-solution-provider" target="_blank">Cloud Solution Provider (CSP)</a> portal behave slightly differently. <a href="https://blogs.technet.microsoft.com/hybridcloudbp/2016/08/26/azure-subscription-migration-to-csp/" target="_blank">This post from August 2016</a> goes into detail on how to migrate from "traditional" Azure Subscriptions to "CSP" Subscriptions. <br />
<br />
In my post, I want to just quickly show you some key points.<br />
<br />
<b><u>Azure Portal navigation</u></b><br />
<b><u><br /></u></b>
One thing you will quickly notice is that if you access the CSP portal and open the Azure Portal from there, all of the classic resource types in Azure are completely hidden. You can only create and operate on Azure Resource Manager (ARM) types of resources. So basically, this prevents you from using Azure Service Management API and any interface that assumes ASM, or "Classic Azure" as it is also named.<br />
<br />
Another thing you'll notice is that if you try to navigate the Azure Portal directly (<a href="https://portal.azure.com/" target="_blank">portal.azure.com</a>) you do not necessarily see the Azure Subscriptions from your tenants in the list of Directories. I say "necessarily" because if your user has been explicitly granted the "owner" role on the tenant, that is a different story. One of the core features of the CSP program is that the partner already is "owner" through the Foreign Principal role; more specifically, all users who have AdminRights permissions within the CSP portal. <a href="https://blogs.technet.microsoft.com/hybridcloudbp/2016/06/08/identity-and-rights-management-in-csp-model/" target="_blank">You can read more about that here</a>.<br />
<br />
So in order to navigate to the customer's Azure resources, you need to explicitly go to the tenant through the URL. That will open the tenant's context and off you go. The URL will typically be something like this: https://portal.azure.com/<b>TENANTNAME.onmicrosoft.com</b> (or the customer's own domain, if it is fully set up).<br />
<br />
<b><u>Azure PowerShell management</u></b><br />
<br />
What about PowerShell? Is that any different? YES!<br />
<br />
If you run Login-AzureRmAccount without setting a context, you'll end up only seeing Azure Subscriptions you have access to explicitly. And even then, Azure Subscriptions created through CSP will behave differently.<br />
<br />
The solution is rather easy, even if you could argue it's a bit cumbersome. <br />You need to explicitly set the context.<br />
<br />
Here are some options available:<br />
<br />
<ul>
<li>You either log in explicitly to the tenant and subscription:<br /><span style="font-family: Courier New, Courier, monospace;">Login-AzureRmAccount -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID</span></li>
<li>Or login "normally" and then run select with tenant and subscription:<br /><span style="font-family: Courier New, Courier, monospace;">Select-AzureRmSubscription -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID</span></li>
<li>Or you could login and set context using the following command:<br /><span style="font-family: Courier New, Courier, monospace;">Get-AzureRmSubscription -TenantId TENANT_GUID -SubscriptionId SUBSCRIPTION-GUID | Set-AzureRmContext</span></li>
</ul>
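Putting the options above together, here is a minimal end-to-end sketch. The GUIDs are placeholders for the customer's tenant and the CSP subscription:<br />
<br />
<pre class="brush:powershell;"># Placeholder identifiers - replace with the customer's tenant and the CSP subscription
$tenantId       = '00000000-0000-0000-0000-000000000000'
$subscriptionId = '11111111-1111-1111-1111-111111111111'

# Log in directly against the customer tenant and subscription
Login-AzureRmAccount -TenantId $tenantId -SubscriptionId $subscriptionId

# Verify the context before operating on resources
Get-AzureRmContext

# Example: list the ARM virtual machines in the subscription
Get-AzureRmVM | Select-Object ResourceGroupName, Name
</pre>
<br />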
<br />
If you do not set the context explicitly, you will not be able to operate on the Azure resources.<br />
<br />
Now, some readers may have noticed that Azure Subscriptions created through CSP are inaccessible in the old Classic Azure Portal, which in turn prevents such Subscriptions from being available on Lifecycle Services (LCS). LCS does support ARM by now, so I believe the solution should be just around the corner. We're just missing one minor piece for all of this to work together properly.<br />
<br />
Have a nice Christmas holiday, everyone!tommy.skauehttp://www.blogger.com/profile/13189067870996112658noreply@blogger.com0tag:blogger.com,1999:blog-3369503819828617336.post-78688002961291695012016-10-23T20:44:00.000+02:002016-10-23T20:44:37.310+02:00Using the On-Premise Gateway to connect to your AX2012 data to Power BI PortalPowerBI has been around for a long time by now, so there are tons of information out there on how to connect your data sources to the powerful PowerBI Portal (<a href="http://www.powerbi.com/">www.powerbi.com</a>). Now, getting all the moving parts to connect properly might have been difficult at times, but I'm making this post to just reassure you it is currently <b>very easy to set up</b>.<br />
<br />
Before I begin, I just want to add a precaution: <br />
Consider the implications around security and performance when setting this up.<br />
<br />
I prefer to use a common service (or setup) account for this, and not my own consultant login. This makes it a little easier if someone else needs to step in and maintain the setup. Furthermore, it allows for the customer to lock down the credentials after I've completed the setup.<br />
As for performance, you should pay attention to how data refresh adds load to your servers, both the one hosting the gateway itself and the server hosting the data source (SQL Server and/or Analysis Services). You don't want to cause a full system overload while pulling data from your sources.<br />
<br />
I will use the standard Dynamics AX SSAS OLAP as an example, but the point here is less the data source, and more how easy it is to connect to the PowerBI Portal.<br />
<br />
Before we begin, I want to list some prerequisites, or at least how I would set it up:<br />
<br />
<ul>
<li>You are using a dedicated setup account and this account is a domain user</li>
<li>You are local admin on the server where you plan to setup the gateway. Basically, your setup account is listed in the Administrators Group (under Computer Management, Local Users and Groups, Groups, Administrators).</li>
<li>You have access to the SQL Server Analysis Services (SSAS) with your setup account. Check by right-click SSAS instance, choose Properties and look at the list of users under Security.</li>
<li>You have a user who is Global Admin in Azure AD. This could be the setup user, synced to Azure AD from the On-Premise domain, but it's not necessary. The point is this user will have access to setup things on PowerBI which currently requires Office 365 Global Admin rights. This may change in the near future, hopefully.</li>
</ul>
<div>
Given all of the above, you'll simply start by logging on to the PowerBI portal using the Office 365 Global Admin user, and download what's called the "Data Gateway". The download link is at the top and takes you to the <a href="https://powerbi.microsoft.com/en-us/gateway/" target="_blank">download page</a>. Press Download and get the latest and finest version.</div>
<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-s2JayVrU3lI/WAztCwoV8fI/AAAAAAACF4s/1OSPKfSAk2sNmmV65SeTtR_8Vw0cIZU9QCLcB/s1600/download_gateway.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://4.bp.blogspot.com/-s2JayVrU3lI/WAztCwoV8fI/AAAAAAACF4s/1OSPKfSAk2sNmmV65SeTtR_8Vw0cIZU9QCLcB/s320/download_gateway.gif" width="275" /></a></div>
<br />
<br />
When you run this installer, it will ask you to log in using the Office 365 Global Admin user (which will have access to register the gateway). Also, I am using the "Enterprise Gateway" option when installing. This allows me to schedule refreshes from data sources based on SSAS. <br />
The gateway has its own set of prerequisite software, <a href="https://powerbi.microsoft.com/en-us/documentation/powerbi-gateway-onprem/" target="_blank">so have a look at those before you begin</a>.<br />
<br />
When the gateway is installed successfully, it can be used to connect to <b>ANY </b>of the SSAS instances on the domain, provided the network traffic is allowed and you connect with a user who has access to the SSAS instance. So your LIVE, TEST, DEV, and so on. <b>How cool is that?</b><br />
<br />
Next you would use the PowerBI Admin Portal to configure the Gateway and add your data sources.<br />
Head over to the Manage gateways and click "Add Data Source".<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-oxlXgl0HajM/WAzziKrj00I/AAAAAAACF48/KSLvo0eTvM8ID80Po2J2Bh2YK7W7ZgpRwCLcB/s1600/manage_gateways.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://2.bp.blogspot.com/-oxlXgl0HajM/WAzziKrj00I/AAAAAAACF48/KSLvo0eTvM8ID80Po2J2Bh2YK7W7ZgpRwCLcB/s320/manage_gateways.gif" width="194" /></a></div>
<br />
<br />
Fill in the form. Notice I am using the name of the server where SSAS is running and the name of the SSAS instance. I also use the domain user who has access to the SSAS Server itself. I also put in the name of the OLAP, Dynamics AX Initial.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-kCqB4ILGH5w/WAzz6xIR0rI/AAAAAAACF5A/t4XdWJCWYws4Cumbnbj8argz8g6mgeu2wCLcB/s1600/Add_Data_Source.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="434" src="https://4.bp.blogspot.com/-kCqB4ILGH5w/WAzz6xIR0rI/AAAAAAACF5A/t4XdWJCWYws4Cumbnbj8argz8g6mgeu2wCLcB/s640/Add_Data_Source.gif" width="640" /></a></div>
<br />
<br />
The data source should connect and confirm everything looks good for you to connect the data source and whatever it contains. Great!<br />
A lot of people get this far without problems, but the next part is something that was added fairly recently, in the <a href="https://powerbi.microsoft.com/en-us/blog/power-bi-gateways-april-update/" target="_blank">April 2016 update</a>.<br />
<br />
<b>Why is this update important?</b><br />
<br />
Given the scenario where you're trying to connect some on-premise SSAS with PowerBI in the cloud, who's to say you're fully synchronizing on-premise Active Directory with Azure Active Directory? What if your local domain doesn't map the users perfectly to the usernames in Azure AD? This is where "Map User Names" comes into play. We can add string-replace rules to the usernames, so even if your users are not perfectly mapped between Azure AD and the On-Premise domain, you can still get this to work.<br />
<br />
So in this example, I will assume the On-Premise domain is using a different domain name compared to the one used by Office 365 and Azure AD. On-Premise I imagine CONTOSO is actually fully qualified as contoso.eu.local, while in the cloud users are using contoso.com.<br />
<br />
Click the Data Source you need to be mapped. Right now, these settings are not shared across data sources, but hopefully they will add further administrative improvements to this.<br />
Open the list of Users and look at the bottom for the Map User Names button.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-159CP6ArjpQ/WAz5GNsQQ2I/AAAAAAACF5Q/av9SB6jot2QUE-CMygbU9bQ9w182bUFTQCLcB/s1600/Map_user_names.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="434" src="https://3.bp.blogspot.com/-159CP6ArjpQ/WAz5GNsQQ2I/AAAAAAACF5Q/av9SB6jot2QUE-CMygbU9bQ9w182bUFTQCLcB/s640/Map_user_names.gif" width="640" /></a></div>
<br />
<br />
This will slide in the setup for mapping of user names.<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-C8Fdzh9lWUg/WAz5moPXuNI/AAAAAAACF5U/GTLuV5hnkMA9uQm64fmI0Lh-EIk0ATWeQCLcB/s1600/map_user.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="486" src="https://4.bp.blogspot.com/-C8Fdzh9lWUg/WAz5moPXuNI/AAAAAAACF5U/GTLuV5hnkMA9uQm64fmI0Lh-EIk0ATWeQCLcB/s640/map_user.gif" width="640" /></a></div>
<br />
<br />
Notice in my example I am replacing the username powerbiadmin@contoso.com with service-account-with-access-to-ssas@contoso.eu.local. So anytime I am logged in to the PowerBI portal with this powerbiadmin user and try to access the data sources through the gateway, the user principal name will be "washed" through the mapping, and "magically" the credentials for that user will work On-Premise, because the local domain sees a user it recognizes. Furthermore, I added another example of a user who locally is represented by u12345@contoso.eu.local, while in Azure AD the same user is actually tommy@contoso.com. So if this user also tries to update or refresh data sources, the credentials will work locally.<br />
<br />
<b>What next?</b><br />
<br />
Well, you can click "Get Data", select "Database", choose "SQL Server Analysis Services", pick your preferred cube from one of your data sources and click "Connect". With the new dataset in place, you can schedule a refresh outside regular business hours, like this (see the sketch after the screenshots if you would rather trigger a refresh programmatically):<br />
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://3.bp.blogspot.com/-gORjPZbaLu0/WAz8fpM3J0I/AAAAAAACF5s/kneqF9B6ok0n7pD9KTRN7B_l4uD9PihTQCLcB/s1600/get_data.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="66" src="https://3.bp.blogspot.com/-gORjPZbaLu0/WAz8fpM3J0I/AAAAAAACF5s/kneqF9B6ok0n7pD9KTRN7B_l4uD9PihTQCLcB/s320/get_data.gif" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://1.bp.blogspot.com/-yb17lvzcbYk/WAz8fXdyqsI/AAAAAAACF5k/zq80y16h6EwSuh-q3wydmpxbRlDonHkgwCLcB/s1600/Get_data_Databases.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="320" src="https://1.bp.blogspot.com/-yb17lvzcbYk/WAz8fXdyqsI/AAAAAAACF5k/zq80y16h6EwSuh-q3wydmpxbRlDonHkgwCLcB/s320/Get_data_Databases.gif" width="270" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://2.bp.blogspot.com/-jKtkPVcAPYw/WAz8fYLYfrI/AAAAAAACF5o/Vbhm2I90RhoutdMWlU_FTtJ88XXl-dGYgCLcB/s1600/Get_data_SSAS.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="316" src="https://2.bp.blogspot.com/-jKtkPVcAPYw/WAz8fYLYfrI/AAAAAAACF5o/Vbhm2I90RhoutdMWlU_FTtJ88XXl-dGYgCLcB/s320/Get_data_SSAS.gif" width="320" /></a></div>
<br />
<div class="separator" style="clear: both; text-align: center;">
<a href="https://4.bp.blogspot.com/-6M-IoTdF4Zk/WAz8fUdtlHI/AAAAAAACF5g/1jM6xg_LxM0gjQAB03l-V52utog-SIJBwCLcB/s1600/Get_Data_SSAS_Cubes.gif" imageanchor="1" style="margin-left: 1em; margin-right: 1em;"><img border="0" height="263" src="https://4.bp.blogspot.com/-6M-IoTdF4Zk/WAz8fUdtlHI/AAAAAAACF5g/1jM6xg_LxM0gjQAB03l-V52utog-SIJBwCLcB/s320/Get_Data_SSAS_Cubes.gif" width="320" /></a></div>
<br />
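The scheduled refresh in the portal is usually all you need, but if you want to kick off a refresh on demand, the Power BI REST API exposes a refresh endpoint for datasets. Below is a minimal Python sketch of calling it; the workspace and dataset IDs are placeholders, the access token must come from an Azure AD app registration with dataset permissions, and the exact endpoint may differ from what was available when this post was written.<br />
<pre>
import requests

# Placeholders - substitute your own values. Acquiring the Azure AD access token
# (for example with the adal/msal libraries) is outside the scope of this sketch.
ACCESS_TOKEN = "your-azure-ad-access-token"
GROUP_ID = "your-workspace-id"
DATASET_ID = "your-dataset-id"

# Queue an on-demand refresh of the dataset, equivalent to "Refresh now" in the portal.
response = requests.post(
    "https://api.powerbi.com/v1.0/myorg/groups/{0}/datasets/{1}/refreshes".format(GROUP_ID, DATASET_ID),
    headers={"Authorization": "Bearer " + ACCESS_TOKEN},
)
response.raise_for_status()
print("Refresh queued, HTTP status:", response.status_code)
</pre>
<br />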
A couple of follow-up questions:<br />
<br />
Q) <i>What happens if I map two completely different users who actually both exist in Azure AD and on-premise?</i><br />
A) You're the admin, and there is nothing stopping you from creating illogical mappings, so you can map yourself into complete chaos - to your own or someone else's despair.<br />
<br />
Q) <i>Do I need to map all users like this? </i><br />
A) Not necessarily. Since the mapping is a simple string replace, you can replace a common part of the username, like replacing "@contoso.com" with "@contoso.eu.local". If you're lucky, that alone will fix most usernames. Also consider that many users will only view the reports and never need to refresh the datasets with fresh data from the data sources; those users do not need to be mapped at all.<br />
<br />
Q) <i>How much time does it take to set this up?</i><br />
A) With some practice, and if the users are set up with permissions as described at the beginning of this post, I bet you can have this up, connected and working <b>within the hour</b>. The rest is waiting for data to come through so you can start filling your beautiful reports and dashboards with powerful data.<br />
<br />
Q) <i>What if it fails horribly and I get stuck? :'-(</i><br />
A) <a href="https://community.dynamics.com/ax/f/33?version=Microsoft%20Dynamics%20AX%202012&pi53287=0&category=Reporting%20and%20BI" target="_blank">Use the community forum</a> and make sure to tag your question with "BI".