Tuesday, June 10, 2014

Synchronization error when setting up AX2012 R3

This will be a quick post on an error I had when setting up a fresh AX2012 R3 installation. I was getting a bit ahead of myself and got stuck with an error when synchronizing the database.


And to help search engines find this post, the error in clear text:

Cannot create a record in Inheritance relation (SysInheritanceRelations). MainTableId: 12345.
The record already exists. 

The error was due to the fact that I didn't restart the AOS after compiling the application. The lesson is: don't rush, and do it the right way the first time.
Furthermore, if you DO get stuck, try doing a quick search on Lifecycle Services for a solution:
https://fix.lcs.dynamics.com/Issue/Results?q=SysInheritanceRelations

I restarted the AOS and continued the installation.
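If you prefer scripting the restart, here is a minimal PowerShell sketch. The service name is an assumption - AOS instances are typically registered as AOS60$01, AOS60$02 and so on, so verify yours first:

```powershell
# List the AOS services on this machine ('AOS60$*' is the typical
# naming pattern for AX2012 instances - an assumption, check yours)
Get-Service -Name 'AOS60$*'

# Restart the instance you are working with, from an elevated prompt
Restart-Service -Name 'AOS60$01' -Force -Verbose
```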

Thursday, June 5, 2014

Delete Company in AX 2009 using SQL

One of the potential tasks when upgrading to a new version of Dynamics AX (like from AX2009 to AX2012) is getting rid of obsolete companies. Microsoft shared a SQL script for this a few years back. I enhanced it a little and added some additional statements.

Just one important remark - DO NOT RUN THIS AGAINST AX2012!

In the interest of sharing, here it is:

/********************************************************
 REMOVE COMPANYID IN DYNAMICS AX 2009

 USE AT OWN RISK! 

 MAKE SURE YOUR TRANSACTION LOG IS PERMITTED TO GROW
 
 Inspired by: 
 http://blogs.msdn.com/b/emeadaxsupport/archive/2010/12/09/how-to-delete-orphaned-data-remained-from-deleted-company.aspx

 Tommy Skaue http://yetanotherdynamicsaxblog.blogspot.com/
*********************************************************/ 

DECLARE @_TABLENAME NVARCHAR(40)
DECLARE @_COMPANYID NVARCHAR(4)

SET @_COMPANYID = N'TST';  -- COMPANY TO DELETE

DECLARE CURSQLDICTIONARY CURSOR FOR
SELECT A.SQLNAME
 FROM SQLDICTIONARY A
  INNER JOIN SQLDICTIONARY X ON X.TABLEID = A.TABLEID AND X.FIELDID = 61448
   WHERE A.FIELDID = 0
    AND A.FLAGS = 0

OPEN CURSQLDICTIONARY

FETCH NEXT FROM CURSQLDICTIONARY INTO @_TABLENAME

WHILE @@FETCH_STATUS = 0
BEGIN
 DECLARE @_SQL NVARCHAR(4000)
 SET @_SQL = N'DELETE FROM ' + QUOTENAME(@_TABLENAME) + N' WHERE DATAAREAID = @_DATAAREAID'
 PRINT (CHAR(13) + 'Removing ' + @_COMPANYID + ' from ' + @_TABLENAME + '...')
 EXEC SP_EXECUTESQL @_SQL, N'@_DATAAREAID NVARCHAR(4)', @_DATAAREAID = @_COMPANYID  
 
 FETCH NEXT FROM CURSQLDICTIONARY INTO @_TABLENAME
END

CLOSE CURSQLDICTIONARY
DEALLOCATE CURSQLDICTIONARY

PRINT (CHAR(13) + 'Finalizing...')
DELETE FROM DATAAREA WHERE DATAAREA.ID = @_COMPANYID
DELETE FROM COMPANYDOMAINLIST WHERE COMPANYDOMAINLIST.COMPANYID = @_COMPANYID
DELETE FROM VIRTUALDATAAREALIST WHERE VIRTUALDATAAREALIST.ID = @_COMPANYID
PRINT (CHAR(13) + 'Done!')

Use at own risk (of course), and let me know if you find any issues with it.
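If you want to preview exactly which tables the cursor will iterate over before running the deletes, the same SQLDICTIONARY join can be run as a plain SELECT (field id 61448 is the one the script uses to identify tables carrying a DATAAREAID column):

```sql
-- Preview the company-specific tables the delete script will touch.
SELECT A.SQLNAME
FROM SQLDICTIONARY A
 INNER JOIN SQLDICTIONARY X ON X.TABLEID = A.TABLEID AND X.FIELDID = 61448
WHERE A.FIELDID = 0
  AND A.FLAGS = 0
ORDER BY A.SQLNAME
```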

Saturday, May 31, 2014

Start and Stop your AX2012 R3 Azure Demos using PowerShell

There are tons of blog posts covering how to use PowerShell to control your Azure components, services, machines, etc. Once you download and install PowerShell for Azure and try it yourself, you'll see how easily you can manage Azure by running a few simple commands. This post is here to help you AX'ers who have already started playing with Lifecycle Services and cloud-hosted AX2012 R3 demos. Be aware that each demo you create costs money while it is running, so here is how you can easily start and stop them using simple scripts instead of going through the Azure management portal.

I am going to assume your user is admin or co-admin on Azure, or you at least have credentials for such a user. I'm also going to assume you've downloaded PowerShell for Azure and installed it.

Start by opening PowerShell for Azure.


Run this command to authenticate for the next 12 hours:

Add-AzureAccount

You will be prompted for credentials; provide the same credentials you would use to sign in to the Azure management portal.
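You can then verify the account was registered and see which subscription the commands will run against. The cmdlets below are from the same classic Azure Service Management module used throughout this post:

```powershell
# List the accounts added with Add-AzureAccount
Get-AzureAccount

# Show the subscriptions available to those accounts
Get-AzureSubscription | Select-Object SubscriptionName, IsDefault
```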

Use this command to Start demos:

Get-AzureService | Where-Object {$_.Label -match "AX2012R3"} | `
Foreach-Object { Start-AzureVM -ServiceName $_.ServiceName -Name "*" -Verbose }

If you change the filter part ("AX2012R3"), you could control a subset of your services based on some naming pattern or convention.

Use this command to Stop the demos:

Get-AzureService | Where-Object {$_.Label -match "AX2012R3"} | `
Foreach-Object { Stop-AzureVM -ServiceName $_.ServiceName -Name "*" -Force -Verbose }

Isn't that cool?

Now for the bonus part. When you create these demos, you access them through RDP-files. Here is a PowerShell command to download the RDP-files and save them to disk:

Get-AzureService | Where-Object {$_.Label -match "AX2012R3"} | `
Get-AzureRole -InstanceDetails | Where-Object {$_.RoleName -Match "DEMO" } | `
Foreach-Object { Get-AzureRemoteDesktopFile -ServiceName $_.ServiceName -Name $_.InstanceName `
-LocalPath (Join-Path "C:\Temp" ($_.InstanceName + ".rdp")) }

Pretty neat, ey? :-)

Monday, April 28, 2014

Upgrading AX2012 R2 to AX2012 R3

Dynamics AX2012 R3 is just a few days away, so I am thrilled to share some information around the upgrade story from AX2012 R2 to AX2012 R3. I realize most people will probably upgrade from RTM or pre-AX2012, but there might be a few of you who plan to upgrade from R2.

The upgrade story from AX2012 RTM (R0/R1) will be more or less the same as when upgrading to R2. The significant part was the fact that the entire Dynamics AX database was split in two: one database for business data and one for the application (aka the modelstore).

What is interesting with the R2 to R3 upgrade is that we now have two databases to begin with. Microsoft has solved this by doing the following:

1) Create an R3 upgrade baseline database named YOURDB_UpgradeR3Baseline
2) Export the R2 modelstore to disk
3) Import the R2 modelstore into the R3 upgrade baseline
4) Replace all the Microsoft models in the actual modelstore with shiny new R3 models!



From there you have more or less the same upgrade story as you would when upgrading RTM to R2 or R3. The story is divided into two stages:

1) Code upgrade
2) Data upgrade

If you've done R2 upgrades, you will remember that it is recommended to do the code upgrade in an isolated environment. It doesn't have to be on a dedicated server, as long as you know how to keep the environments separated. The code upgrade should be done on a copy of the newly configured R3 modelstore, and the whole idea is to make sure you end up with an upgraded R3 modelstore that keeps the element IDs. Otherwise, you'll end up having to start over.

The key thing to understand here is that setup will install and replace the Microsoft models in SYS and SYP (plus some other MS layers if applicable). You will need to have R3-compatible ISV models before you start. Your job will be handling the rest of the layers.



The code upgrade is done in a cyclic manner, and the upgrade guide explains it beautifully:

1) Import the modelstore
2) Delete the topmost layers
3) Upgrade the current layer
4) Export the current layer's models
5) Go back to step 1 and start working on the next layer.

You will do this from the VAR-layer, through CUS and finally USR.
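One cycle of the loop above might look roughly like this from the AX 2012 Management Shell. This is only a sketch under assumptions: the database and file names are made up, and you should check the exact axutil and cmdlet syntax in the upgrade guide for your build:

```powershell
# 1) Import the exported R2 modelstore into the code upgrade database
Import-AXModelStore -File 'C:\Upgrade\R2Modelstore.axmodelstore' -Database 'AX_CodeUpgrade'

# 2) Delete the layers above the one you are upgrading (here: VAR)
axutil delete /layer:USR /db:AX_CodeUpgrade
axutil delete /layer:CUS /db:AX_CodeUpgrade

# 3) Upgrade the VAR layer in the client, then
# 4) export the upgraded model for the next cycle
Export-AXModel -Model 'MyVarModel' -File 'C:\Upgrade\MyVarModel.axmodel'
```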

One of the new features in R3 is the improved Code Automerge feature, and for this you will need to install the Team Explorer for Visual Studio 2012. It is not part of the prerequisites as of this writing.

Just to give you an idea on what you can expect when installing R3 during an in-place upgrade, I will provide some screenshots.

I will assume you've duplicated your R2 databases and you've prepared a new server. Obviously you won't be able to mix 6.2 dlls with 6.3 dlls, so you need a new server in order to do the in-place upgrade.

First you choose a Database upgrade:


You then choose to Configure an existing database:


You point it to your prepared, but not yet upgraded, R2 database. Do not fill in the Baseline field; Setup will create one for you.


You will need to provide a location for where the exported modelstore from R2 will be saved.

When this part is completed, you will install the AOS, a Client, Debugger and Management Utilities - all tools necessary for compiling. The debugger will be used for the code upgrade part. Remember I chose to do both the code and the data upgrade on the same server.


Now, I did select the newly created R3Baseline for the first AOS installed, but only to show you the naming convention Microsoft uses for the database. The baseline database comes into play when you are in the code upgrade phase, so I eventually reconfigured my second AOS to use the baseline when I upgraded my layers to R3.


Finally, you define the Instance and ports.


The upgrade for AX2012 is pretty intense, with a lot of steps and details to remember. Luckily for us, the current upgrade guide is very well written and guides you through the process. I expect Microsoft to publish a comprehensive whitepaper for the R3 upgrade story, covering both source-to-target upgrades and RTM/R2 to R3 upgrades. Take the time to read it and understand it. And if you have questions, don't be afraid to ask on the community site.

Finally, I just want to add that there are many neat new things in R3 from a technical perspective, so get excited. :-)

Wednesday, April 23, 2014

Mark Compile application step complete in checklists

Since AX2012 R2 CU7, most of us have started to use axbuild for a complete application compilation. While this is fine and dandy, there are still checklists within AX that demand this step be run in the client itself. Until we get an option to "Mark as complete", you can mark the step yourself using the following job:

static void CompleteCompile(Args _args)
{
    SysCheckList::finished(classnum(SysCheckListItem_Compile));
    SysCheckList::finished(classnum(SysCheckListItem_CompileUpgrade));
    SysCheckList::finished(className2Id(classStr(SysCheckListItem_CompileServ)));
    SysCheckList::finished(classnum(SysCheckListItem_SysUpdateCodeCompilInit));
}

Run the job and observe the step is marked as completed. No sweat!

Wednesday, April 16, 2014

Error when importing Model downloaded from Internet

If you were to download my compressed AX model, YetAnotherDynamicsAXModel.zip, to your machine, Windows will most likely tag the downloaded file as unsafe and "blocked".

This is perfectly normal and expected behavior. However, it causes problems if you try to install the model. After unzipping the content and attempting to install it, you would get this error:

CategoryInfo : OperationStopped: (:) [Install-AXModel], PipelineStoppedException
FullyQualifiedErrorId : Exception has been thrown by the target of an invocation.,Microsoft.Dynamics.AX.Framework.Tools.ModelManagement.PowerShell.InstallAXModelCommand




If you were to attempt to peek inside the model using Get-AXModel, you would get this error:

CategoryInfo : OperationStopped: (:) [Get-AXModel], PipelineStoppedException
FullyQualifiedErrorId : An attempt was made to load an assembly from a network location which would have caused the assembly to be sandboxed in previous versions of the .NET Framework. This release of the .NET Framework does not enable CAS policy by default, so this load may be dangerous. If this load is not intended to sandbox the assembly, please enable the loadFromRemoteSources switch. See http://go.microsoft.com/fwlink/?LinkId=155569 for more information. ,Microsoft.Dynamics.AX.Framework.Tools.ModelManagement.PowerShell.GetAXModelCommand

While the first error isn't really intuitive, the second error gives us an additional clue: "assembly from a network location".

The solution is simple - unblock!

You have the choice to either unblock the downloaded zip before extracting it.



Or you can unblock the AX model file itself.



After the file is marked as safe and "unblocked", you are good to go!
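If you prefer doing this from PowerShell (3.0 or later), the Unblock-File cmdlet does the same thing as the Properties dialog. The paths below are just examples:

```powershell
# Unblock the downloaded zip before extracting it...
Unblock-File -Path 'C:\Temp\YetAnotherDynamicsAXModel.zip'

# ...or unblock everything you already extracted
Get-ChildItem 'C:\Temp\Extracted' -Recurse | Unblock-File
```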

Thursday, April 10, 2014

Warm up Reporting Services for quicker reporting

You might already know that AX2012 R2 CU7 brought a new helper class for warming up Reporting Services so it doesn't "fall asleep" during long periods of inactivity. The new class has already been blogged about, but in this post I will show how to set up the batch and also give you some hints on how to find potential reports to target if you want to extend the class.

Setting up the batch

Now, I'm going to set this up for every 10 minutes, and not every minute as some might suggest.

The class depends on an SSRS report that needs to be deployed, so if you haven't done so already, find the report and deploy it.
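Deployment can also be done from the AX 2012 Management Shell. The report name below is my assumption based on the CU7 warmup class; verify the exact name in your AOT before running:

```powershell
# Deploy the warmup report to Reporting Services
# (report name assumed - confirm it in the AOT first)
Publish-AXReport -ReportName SRSReportServerWarmup
```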


After making sure the report is available, you need to locate the class in the AOT. Open the class to run it. Choose batch and define recurrence and alerts.



When the report runs, it saves the result as a PDF to the temporary folder on the AOS. You may open the report and view it if you're really interested. The point is not the report itself so much as keeping the service "warm", and to provide a pattern for warming up other potentially slow-running reports.



You may want to make sure you don't get a log entry for each successful batch run. I prefer to keep Errors Only for these types of frequent jobs.



Finding potential reports for extending the warmup

Reporting Services logs the execution of reports, and there are a lot of good statistics you can use to investigate potential performance issues. For this example I will run a SQL query that gives me frequently run reports and some load metrics. I'm looking for reports run during the last month, with more than 25 runs and more than zero rows returned. These could be reports I would want to keep warm.

SELECT 
  COUNT(REPORTPATH) AS RUNS, 
  REPORTPATH, 
  MAX(TIMEDATARETRIEVAL) AS MAX_DATA ,
  MAX(TIMEPROCESSING) AS MAX_PROCESSING,
  MAX(TIMERENDERING) AS MAX_RENDER,
  MIN(TIMEDATARETRIEVAL) AS MIN_DATA ,
  MIN(TIMEPROCESSING) AS MIN_PROCESSING,
  MIN(TIMERENDERING) AS MIN_RENDER
FROM EXECUTIONLOG2 
  WHERE 
     STATUS IN ('RSSUCCESS') 
  AND REPORTPATH NOT IN ('UNKNOWN')
  AND TIMESTART > DATEADD(m,-1,GETDATE())
  AND BYTECOUNT > 0 AND [ROWCOUNT] > 0
GROUP BY REPORTPATH
HAVING COUNT(REPORTPATH) > 25
ORDER BY COUNT(REPORTPATH) DESC

Here is a snippet of the results. You can clearly tell that even the warmup report has different metrics for MAX and MIN. These are milliseconds, but if you see reports spending several thousand milliseconds processing or rendering just a few rows, you may want to investigate why.



Coming from pretty quick and performant MorphX reports to SSRS might be painful for both us AX consultants and AX end users, but we also see some reports perform stunningly if they were run just minutes ago. You would think they would run just as slowly each time. They don't, so maybe we should figure out why that is. Keeping Reporting Services warm is part of the solution.