Monday, December 26, 2016

Managing your Azure Subscriptions created through the CSP portal

Let me start off with a disclaimer: Microsoft may change this behavior, which would render the post obsolete. In that case, I'll try to come back and make the necessary amendments.

If you have worked with managing your Azure resources through PowerShell, you will notice that Azure Subscriptions created through the Cloud Solution Provider (CSP) portal behave slightly differently. This post from August 2016 goes into detail on how to migrate from "traditional" Azure Subscriptions to "CSP" Subscriptions.

In this post, I just want to quickly show you some key points.

Azure Portal navigation

One thing you will quickly notice is that if you access the CSP portal and open the Azure Portal from there, all of the classic resource types in Azure are completely hidden. You can only create and operate on Azure Resource Manager (ARM) types of resources. So basically, this prevents you from using the Azure Service Management (ASM) API and any interface that assumes ASM, or "Classic Azure" as it is also called.

Another thing you'll notice is that if you navigate to the Azure Portal directly (portal.azure.com), you do not necessarily see the Azure Subscriptions from your tenants in the list of Directories. I say "necessarily" because if your user has been explicitly granted the "owner" role on the tenant, that is a different story. One of the core features of the CSP program is that the partner already is "owner" through the Foreign Principal role - more specifically, all users who have the AdminRights permissions within the CSP portal. You can read more about that here.

So in order to navigate to the customer's Azure resources, you need to explicitly go to the tenant through the URL. That will open the tenant's context and off you go. The URL will typically be something like this: https://portal.azure.com/TENANTNAME.onmicrosoft.com (or the customer's own domain, if it is fully set up).

Azure PowerShell management

What about PowerShell? Is that any different? YES!

If you run Login-AzureRmAccount without setting a context, you'll end up only seeing the Azure Subscriptions you have explicit access to - and Azure Subscriptions created through CSP behave differently here, so they will not show up.

The solution is rather easy, even if you could argue it's a bit cumbersome.
You need to explicitly set the context.

Here are some options available:

  • You can either log in explicitly to the tenant and subscription:
    Login-AzureRmAccount -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID
  • Or log in "normally" and then select the tenant and subscription:
    Select-AzureRmSubscription -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID
  • Or you could log in and set the context using the following command:
    Get-AzureRmSubscription -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID | Set-AzureRmContext

If you do not set the context explicitly, you will not be able to operate on the Azure resources.
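
To tie the options together, here is a minimal end-to-end sketch. The GUIDs are placeholders, and the resource group listing at the end is just an assumed way to verify that the context works:

# Log in directly against the customer tenant and subscription
Login-AzureRmAccount -TenantId TENANT-GUID -SubscriptionId SUBSCRIPTION-GUID

# Verify which tenant and subscription the session now operates on
Get-AzureRmContext

# Any ARM cmdlet should now work, for example listing resource groups
Get-AzureRmResourceGroup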

Now, some readers may have noticed that Azure Subscriptions created through CSP are inaccessible in the old Classic Azure Portal, which in turn disconnects such Subscriptions from being available in Lifecycle Services (LCS). LCS does support ARM by now, so I believe the solution is just around the corner. We're just missing one minor piece for all of this to work together properly.

Have a nice Christmas holiday, everyone!

Sunday, October 23, 2016

Using the On-Premise Gateway to connect your AX2012 data to the Power BI Portal

PowerBI has been around for a long time now, so there is tons of information out there on how to connect your data sources to the powerful PowerBI Portal (www.powerbi.com). Getting all the moving parts to connect properly might have been difficult at times, but I'm writing this post to reassure you that it is currently very easy to set up.

Before I begin, I just want to add a precaution:
Consider the implications around security and performance when setting this up.

I prefer to use a common service (or setup) account for this, and not my own consultant login. This makes it a little easier if someone else needs to step in and maintain the setup. Furthermore, it allows the customer to lock down the credentials after I've completed the setup.
As for performance, you should pay attention to how data refresh adds load to your servers - both the one hosting the gateway itself and the server hosting the data source (SQL Server and/or Analysis Services). You don't want to cause a full system overload while pulling data from your sources.

I will use the standard Dynamics AX SSAS OLAP as an example, but the point here is less the data source, and more how easy it is to connect to the PowerBI Portal.

Before we begin, I want to list some prerequisites, or at least how I would set it up:

  • You are using a dedicated setup account and this account is a domain user
  • You are a local admin on the server where you plan to set up the gateway. Basically, your setup account is listed in the Administrators group (under Computer Management, Local Users and Groups, Groups, Administrators).
  • You have access to the SQL Server Analysis Services (SSAS) instance with your setup account. Check by right-clicking the SSAS instance, choosing Properties and looking at the list of users under Security. (A scripted check of these two points is sketched right after this list.)
  • You have a user who is a Global Admin in Azure AD. This could be the setup user, synced to Azure AD from the On-Premise domain, but it's not necessary. The point is that this user will have access to set things up in PowerBI, which currently requires Office 365 Global Admin rights. This may change in the near future, hopefully.
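
As a quick sanity check of the first two prerequisites, you could run something like this in PowerShell on the gateway server, logged in as the setup account. The server and instance names are just examples, and the AMO check assumes the Analysis Services client libraries are installed along with the SQL tools:

# List members of the local Administrators group - the setup account should appear here
net localgroup Administrators

# Connect to the SSAS instance via AMO to verify access
[Reflection.Assembly]::LoadWithPartialName("Microsoft.AnalysisServices") | Out-Null
$ssas = New-Object Microsoft.AnalysisServices.Server
$ssas.Connect("SERVERNAME\AXSSAS")
$ssas.Databases | Select-Object Name
$ssas.Disconnect()
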
Given all of the above, you simply start by logging on to the PowerBI portal using the Office 365 Global Admin user and downloading what's called the "Data Gateway". The download link is at the top and takes you to the download page. Press Download and get the latest and finest version.

When you run this installer, it will ask you to log in using the Office 365 Global Admin user (which has access to register the gateway). Also, I am using the "Enterprise Gateway" option when installing; this allows me to schedule refreshes from data sources based on SSAS.
The gateway has its own set of prerequisite software, so have a look at those before you begin.

When the gateway is installed successfully, it can be used to connect to ANY of the SSAS instances on the domain, given that the network traffic is allowed and you connect with a user who has access to the SSAS instance. So your LIVE, TEST, DEV, and so on. How cool is that?

Next you would use the PowerBI Admin Portal to configure the Gateway and add your data sources.
Head over to the Manage gateways and click "Add Data Source".

Fill in the form. Notice I am using the name of the server where SSAS is running and the name of the SSAS instance. I also use the domain user who has access to the SSAS server itself, and I put in the name of the OLAP database, Dynamics AX Initial.

The data source should connect and confirm everything looks good for you to connect to the data source and whatever it contains. Great!
A lot of people get this far just fine, but the next part is something that was added fairly recently - in the April 2016 update.

Why is this update important?

Given the scenario where you're trying to connect some on-premise SSAS with PowerBI in the cloud, who's to say you're fully synchronizing your on-premise Active Directory with Azure Active Directory? What if your local domain doesn't map the users perfectly to the usernames in Azure AD? This is where "Map User Names" comes into play. We can add string-replace rules for the usernames, so even if your users are not perfectly mapped between Azure AD and the On-Premise domain, you can still get this to work.

So in this example, I will assume the On-Premise domain is using a different domain name than the one used by Office 365 and Azure AD. On-Premise, I imagine CONTOSO is actually fully qualified as contoso.eu.local, while in the cloud users are using contoso.com.

Click the Data Source that needs the mapping. Right now, these settings are not shared across data sources, but hopefully further administrative improvements will be added here.
Open the list of Users and look at the bottom for the Map User Names button.

This will slide in the pane for setting up the mapping of user names.

Notice in my example I am replacing the long username powerbiadmin@contoso.com with service-account-with-access-to-ssas@contoso.eu.local. So anytime I am logged in at the PowerBI portal with this powerbiadmin user and try to access the data sources through the gateway, the user principal name will be "washed" through the mapping, and "magically" the credentials for that user will work On-Premise, because the local domain sees a user it recognizes. Furthermore, I added another example of a user who locally is represented as u12345@contoso.eu.local, while in Azure AD it is actually tommy@contoso.com. So if this user also tries to update or refresh data sources, the credentials will work locally.
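
Conceptually, each rule behaves like a simple string replace on the user principal name before it is passed on to the on-premise data source. The rules themselves live in the gateway configuration; this little PowerShell illustration is only meant to show the effect:

# Full-name rule: one cloud user mapped to a specific on-premise service account
'powerbiadmin@contoso.com' -replace 'powerbiadmin@contoso\.com', 'service-account-with-access-to-ssas@contoso.eu.local'

# Domain-suffix rule (see the Q&A below): rewrite just the domain part for all users at once
'tommy@contoso.com' -replace '@contoso\.com$', '@contoso.eu.local'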

What next?

Well, you can click "Get Data", select "Database", choose "SQL Server Analysis Services", then simply pick your preferred cube from one of your data sources and click "Connect". With the new dataset in place, you can schedule a refresh outside regular business hours. Like this:

A couple of follow-up questions:

Q) What happens if I map two completely different users, who actually exist both in Azure and On-Premise?
A) You're the admin, and there are no features to prevent potentially illogical mappings, so you can map yourself into complete chaos - to your own or someone else's despair.

Q) Do I need to map all users like this?
A) Since the mapping is a simple string replace, you can replace matching parts of the username, like replacing "@contoso.com" with "@contoso.eu.local". If you're lucky, that alone will fix most usernames. Also consider that a number of users may only load the Reports, and do not need access to actually reload the datasets with fresh data from the data sources. Those users do not need to be mapped.

Q) How much time does it take to set this up?
A) With some practice, and if the users are set up with permissions as described in the beginning of this post, I bet you can get this up, connected and working within the hour. The rest is waiting for data to come through so you can start filling your beautiful reports and dashboards with powerful data.

Q) What if it fails horribly and I get stuck? :'-(
A) Use the community forum and make sure to tag your question with "BI".

Monday, October 10, 2016

Using PowerShell to list the 10 most recent KBs installed in AX2012 R3

I was asked to get an overview of the most recent KBs installed in an AX2012 R3 environment.
One way to do this is using PowerShell.
We know each KB is deployed as a single model, and we know each model installed has a unique ID. This ID is incremented automatically by the system, so why not just sort on it in a descending fashion.

Look at the following PowerShell command:

axmodel | 
where {$_.Name -match 'KB'} | 
sort modelid -Descending | 
select modelid, displayname, version, description, details -first 10 | 
out-gridview

I assume the default environment picked up by the command "axmodel" (an alias for Get-AXModel) is the one we want to query. You can add "-Verbose" after the command if you'd like to see some verbose information about which ModelStore database the command operates on.

I then add a filter to the list, making sure I only get models with a match on "KB" in the Name property. I could also have looked for the word "hotfix" in the Description property. The match is not case sensitive.

I then pipe the result to a sorting where I want the list sorted descending on the ModelId.

I select out only some of the columns, and I also pick out the first 10 of the result.

Finally, I send the result to the GridView window, just because I like to see the result in a nice window (which allows for resizing, column resizing, filtering, etc.).

Notice also that I can spread this entire command over multiple lines; the trick is that PowerShell allows this when a line ends with a pipe (|), like above. Otherwise, a line break is interpreted as execution of the command, so be aware of that when running a "multi-line" PowerShell command.

You can also adapt the example above to do other things, like looking for specific KBs and checking if they are already installed.
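
For example, a quick check for one specific KB could look like this (the KB number is just a made-up placeholder):

# Returns the model if the hotfix is installed, nothing otherwise
axmodel | where {$_.Name -match 'KB3086988'} | select modelid, displayname, version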

Enjoy!

Friday, October 7, 2016

Kicking off a dedicated podcast with my friend, colleague and fellow MVP Fredrik

It is with great excitement I am happy to announce the Dynamics AX Podcast!

It is a common effort with MVP Fredrik Sætre to create another channel where all you Dynamics AX geeks, both techies and funkies (yes, functional consultants), can tap into thoughts, ideas, tips, insights, fun AX stories - and the list goes on and on.

Bear in mind, we invite anyone who is willing to share their insights, and while we don't necessarily have to agree on everything, the primary aim is to share thoughts and have fun!

I really hope you enjoy it, and please feel free to comment, either here on this blog, or on the Youtube channel videos. Feel free to Tweet questions and topics you'd like to be discussed.

The first Podcast is available here:

Thursday, July 7, 2016

Stored Procedure for listing who is running AX queries on the SQL Server

I want to share a small nugget I've created for listing who is currently running queries against AX at any point in time. It builds on two prerequisites: one is from one of my suggested answers on the Dynamics AX Community Forum, and the other is a solution provided by Microsoft on their own blog.

For the sake of completeness I will include the "sp_lock3" stored procedure in this post, but it is only fair to mention that I got it from the LearnSQLForum, posted by Scott Wigham here:
http://forums.learnsqlserver.com/SqlServerTopic40.aspx

The sp_lock3 looks like this:

/*******************************************************
 Source:
 http://forums.learnsqlserver.com/SqlServerTopic40.aspx
 ******************************************************/

USE master
GO
IF ( SELECT OBJECTPROPERTY(OBJECT_ID('sp_lock3'), 'IsProcedure')) = 1
    DROP PROC dbo.sp_lock3
GO
CREATE PROC dbo.sp_lock3 (
    @spid1 INT = NULL /* Check only this spid; if this is NULL then all spids will be checked */ 
    , @spid2 INT = NULL /* and this spid; if this is not null, @spid1 must be not null as well */ 
)
AS

CREATE TABLE #locktable (
    spid SMALLINT
    , loginname NVARCHAR(128) 
    , hostname NVARCHAR(128)
    , dbid INT
    , dbname NVARCHAR(128) 
    , objId INT
    , ObjName  NVARCHAR(128)
    , IndId INT
    , Type NVARCHAR(4)
    , Resource NVARCHAR(16)
    , Mode NVARCHAR(8)
    , Status NVARCHAR(5)
)

SET NOCOUNT ON

IF @spid2 IS NOT NULL AND @spid1 IS NULL
    SET @spid1 = @spid2

DECLARE @object_id INT,
  @dbid INT,
  @DynamicSql NVARCHAR(255)

/***** @spid1 is  provided so show only the locks for @spid1 and @spid2 *****/ 
IF @spid1 IS NOT NULL
    INSERT #locktable ( spid, loginname, hostname, dbid, dbname, objId, ObjName, IndId, Type, Resource, Mode, Status )
    SELECT CONVERT (SMALLINT, l.req_spid) 
        , COALESCE(SUBSTRING (s.loginame,  1,  128), '')
        , COALESCE(SUBSTRING (s.hostname,  1,  128), '')
        , l.rsc_dbid
        , SUBSTRING (DB_NAME(l.rsc_dbid),  1,  128)
        , l.rsc_objid
        , ''
        , l.rsc_indid
        , SUBSTRING (v.name,  1,  4)
        , SUBSTRING (l.rsc_text,  1,  16)
        , SUBSTRING (u.name,  1,  8)
        , SUBSTRING (x.name,  1,  5)
    FROM master.dbo.syslockinfo l JOIN master.dbo.spt_values v
        ON l.rsc_type = v.number
    JOIN master.dbo.spt_values x
        ON l.req_status = x.number
    JOIN master.dbo.spt_values u
        ON l.req_mode + 1 = u.number
    JOIN master.dbo.sysprocesses s
        ON l.req_spid = s.spid
    WHERE v.type = 'LR' AND x.type = 'LS' AND u.type = 'L' AND l.req_spid IN (@spid1, @spid2) AND l.rsc_dbid NOT IN (32767)

ELSE /***** @spid1 is not provided so show all the locks *****/ 
    INSERT #locktable ( spid, loginname, hostname, dbid, dbname, objId, ObjName, IndId, Type, Resource, Mode, Status )
    SELECT CONVERT (SMALLINT, l.req_spid) 
        , COALESCE(SUBSTRING (s.loginame,  1,  128), '')
        , COALESCE(SUBSTRING (s.hostname,  1,  128), '')
        , l.rsc_dbid
        , SUBSTRING (DB_NAME(l.rsc_dbid),  1,  128)
        , l.rsc_objid
        , ''
        , l.rsc_indid
        , SUBSTRING (v.name,  1,  4)
        , SUBSTRING (l.rsc_text,  1,  16)
        , SUBSTRING (u.name,  1,  8)
        , SUBSTRING (x.name,  1,  5)
    FROM master.dbo.syslockinfo l JOIN master.dbo.spt_values v
        ON l.rsc_type = v.number
    JOIN master.dbo.spt_values x
        ON l.req_status = x.number
    JOIN master.dbo.spt_values u
        ON l.req_mode + 1 = u.number
    JOIN master.dbo.sysprocesses s
        ON l.req_spid = s.spid
    WHERE v.type = 'LR' AND x.type = 'LS' AND u.type = 'L' and l.rsc_dbid not in (32767)

/**********************************************************************************************
    Because the locks can exist in any database, you must USE the relevant database before running OBJECT_NAME

    We use a dynamic SQL loop to loop through each row from #locktable

    A temp table is required here since SQL Server 2000 cannot access a table variable when issuing dynamic sql
**********************************************************************************************/
-- Initialize the loop
SELECT TOP 1 @dbid = dbid, @object_id = ObjId FROM #locktable WHERE Type ='TAB' AND ObjName = ''

WHILE @dbid IS NOT NULL
    BEGIN
        SELECT @DynamicSql = 
         'USE ' + DB_NAME(@dbid) + char(13) 
         + 'UPDATE #locktable SET ObjName = OBJECT_NAME(' 
            + CONVERT(VARCHAR, @object_id) + ') WHERE dbid = ' + CONVERT(VARCHAR, @dbId) 
         + ' AND objid = ' + CONVERT(VARCHAR, @object_id) 

        EXEC sp_executesql @DynamicSql

        SET @dbid = NULL -- TSQL preserves the "old" value unless you initialize it to NULL
        SELECT @dbid = dbid, @object_id = ObjId FROM #locktable WHERE Type ='TAB' AND ObjName = ''
    END

SELECT * FROM #locktable 
    WHERE objname NOT LIKE '#locktable_____%' -- don't return this temp table 
        AND objid > 100 -- do not return system table locks
        AND objname <> 'spt_values'
GO

Install this procedure simply by running the script on the SQL Server instance; it will then be globally available on that instance.

The next step is to run the script below to install my "sp_whoInAx":

/************************************************************************************
sp_whoInAx    This script lists out current users running queries against 
      any Dynamics AX database on this SQL Server Engine instance.
      Please report any issues and improvements back to the author.
      
Written By:    Tommy Skaue (email: add the @-sign between first and last name, and end with .com)
      Microsoft Dynamics AX MVP
      yetanotherdynamicsaxblog.blogspot.com
      
Version:    1.0
      
Comments:    This procedure requires sp_lock3 in order to work.
      
This script is presented "AS IS" and has no warranties expressed or implied!!!
**********************************************************************************/

USE [master]
GO
IF ( SELECT OBJECTPROPERTY(OBJECT_ID('sp_whoInAx'), 'IsProcedure')) = 1
    DROP PROC [dbo].[sp_whoInAx] 
GO

CREATE PROCEDURE [dbo].[sp_whoInAx] 
AS

 CREATE TABLE #LOCKTABLE (
  SPID SMALLINT
  , LOGINNAME NVARCHAR(128) 
  , HOSTNAME NVARCHAR(128)
  , DBID INT
  , DBNAME NVARCHAR(128) 
  , OBJID INT
  , OBJNAME  NVARCHAR(128)
  , INDID INT
  , TYPE NVARCHAR(4)
  , RESOURCE NVARCHAR(16)
  , MODE NVARCHAR(8)
  , STATUS NVARCHAR(5)
 )

 INSERT INTO #locktable
 EXEC sp_lock3

 CREATE TABLE #CurrentAxSessions (
  CI VARCHAR(128) 
  , HOST_NAME VARCHAR(128)
  , SESSION_ID smallINT
  , DATABASE_ID smallINT
  , LOGIN_TIME datetime
  , STATUS VARCHAR(30)    
 )

 INSERT INTO #CurrentAxSessions
 SELECT 
  CAST(CONTEXT_INFO AS VARCHAR(128)) AS CI
  , HOST_NAME
  , SESSION_ID
  , DATABASE_ID
  , LOGIN_TIME
  , STATUS
  --, STATUS,  CPU_TIME,MEMORY_USAGE, TOTAL_SCHEDULED_TIME, TOTAL_ELAPSED_TIME
  --, LAST_REQUEST_START_TIME, LAST_REQUEST_END_TIME, READS, WRITES, LOGICAL_READS,  OPEN_TRANSACTION_COUNT
 FROM SYS.DM_EXEC_SESSIONS   
 WHERE 1=1
  AND PROGRAM_NAME LIKE '%DYNAMICS%'
  AND CAST(CONTEXT_INFO AS VARCHAR(128)) <> ''
  
 SELECT 
  AX.CI
  ,AX.LOGIN_TIME
  ,AX.SESSION_ID
  ,AX.STATUS
  ,LT.HOSTNAME
  ,LT.DBNAME
  ,LT.OBJNAME 
 FROM #CurrentAxSessions AX
  INNER JOIN #locktable LT ON LT.DBID = AX.DATABASE_ID AND LT.SPID = AX.SESSION_ID

GO

It takes the result of sp_lock3 and combines it with the "AX sessions". Again, it does require the registry change mentioned in Microsoft's blog, in addition to a restart of the AOS service.

To reiterate the steps for including the user in the session context, they are:

  1. Navigate to HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\services\Dynamics Server\6.0\01\Original (installed configuration). The last key, Original (installed configuration), is the key name for the current server configuration. If your system uses a different configuration than the original installed configuration, navigate to the currently active configuration.
  2. Create a string registry value called ‘connectioncontext’ and set the value to 1.
  3. Restart the AOS. (A scripted version of these steps is sketched below.)
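
If you prefer to script the registry change, a PowerShell sketch could look like this. The instance number 01 and the AOS service name are assumptions that vary per installation:

# Steps 1 and 2: create the 'connectioncontext' string value on the active configuration
$key = 'HKLM:\SYSTEM\CurrentControlSet\services\Dynamics Server\6.0\01\Original (installed configuration)'
New-ItemProperty -Path $key -Name 'connectioncontext' -Value '1' -PropertyType String

# Step 3: restart the AOS service (typically named AOS60$01 for instance 01)
Restart-Service 'AOS60$01'
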
Now you can run sp_whoInAx from any query window on that SQL Server instance, see the user id from the session context, and also see what database and tables are involved.
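
You can also call it from PowerShell if you prefer, for example via Invoke-Sqlcmd from the SQL Server PowerShell module (the server name is a placeholder):

# Run the procedure and show the current AX sessions holding locks
Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Database 'master' -Query 'EXEC sp_whoInAx' | Format-Table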

Friday, March 11, 2016

Reporting Server Subscription does not run on Schedule

I've set up an SSRS report that pulls data from two SQL Server databases across two different SQL Server instances. It runs under a SQL Server user and works like a charm when run manually. However, when I try to set up a Subscription for this report, the report never runs according to the schedule. I don't even get an error in the Reporting Services log.

In order to understand what is going on, I need to go back to the SQL Server Agent and check the logs there. When you create a Report Server Subscription on a Schedule, it creates a new Job for this Schedule, and this Job initiates the Event that triggers the scheduled report(s) to run. You can read more about it here.

First I need to identify which Job is behind the schedule I want to investigate. The Jobs are created using a unique ID (GUID) as the job name, and in order to link the actual report to its schedule ID, I run this SQL on the Reporting Server database (normally called "ReportServer_INSTANCENAME"):

select s.Description, us.UserName as OwnerName, s.LastStatus, s.LastRunTime, c.Path, uc.UserName as ModifiedByName, rs.ScheduleId
from ReportServer.dbo.Subscriptions s
join ReportServer.dbo.Catalog c on c.ItemID = s.Report_OID
join ReportServer.dbo.ReportSchedule rs on rs.SubscriptionID = s.SubscriptionID
join ReportServer.dbo.Users uc on uc.UserID = c.ModifiedByID
join ReportServer.dbo.Users us on us.UserID = s.OwnerId

Going back to the Job, I can see from the Job History that it doesn't even get to the first step of the Job. Since this is AX, the Job runs under the same account as the Business Proxy Account - that is how SSRS is normally configured in relation to AX. The error from the history log actually says "the owner of the job does not have server access". The job is actually set to run in the context of a user that does have server access, but somehow (a bug, maybe) this is ignored by the SQL Server Agent.

The solution is:
  1. Change the owner of the job to the SQL Server User that has the necessary permissions to run the report (see the sketch after this list)
  2. Grant the SQL Server User permissions to operate on the Report Server Database (it does need permissions to insert data; however, I gave it full db_owner permissions)
  3. Test the job by doing a manual run
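
For step 1, the ownership change can also be scripted with the documented msdb procedure sp_update_job. The server name, job name and login below are placeholders - the job name is the ScheduleId GUID returned by the query above:

# Change the owner of the schedule's SQL Server Agent job
Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Database 'msdb' -Query "EXEC dbo.sp_update_job @job_name = 'SCHEDULE-GUID', @owner_login_name = 'ReportRunnerLogin'"
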
If the job still fails, you can investigate the Job history for any errors. 

Friday, March 4, 2016

Login failed error while processing OLAP cubes

I was asked to help solve a processing error from SQL Server Analysis Services today.

While processing, the engine fails to retrieve data from the Dynamics AX transaction database and throws the following error:
Login failed for user "DOMAIN\SERVERNAME$". Reason: Could not find a login matching the name provided. (CLIENT: )
(DOMAIN refers to the actual Active Directory Domain Name and SERVERNAME$ refers to the name of the server.)

From the error, it seems like the machine account is unable to log in. Now, in my scenario the SSAS service runs on the same server as the SQL Server Engine and the instance with the Dynamics AX database. In addition, the SSAS service is NOT running under a dedicated Domain Service Account, but rather a local service account. The error then becomes somewhat misleading if you read just the account name, because it is actually referring to the machine account name.

The solution is simple, though.

Open the SQL Server Configuration Manager and find the Analysis Services instance you are using when processing the OLAP.

Open the properties and copy out the "Account name" which the service runs under. Normally this would be something like "NT Service\MSOLAP$INSTANCE_NAME" (INSTANCE_NAME refers to the name of the SSAS instance).

Now open SQL Server Management Studio, open Security and Logins. Add a new Login and paste the Account name from the previous step as the Login name.

Before you save this new login, open "User Mappings" and find the Dynamics AX database you are trying to use as the source for your OLAP. Tick the checkbox in front of the database name and grant the login the "db_datareader" role membership.
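
If you prefer to script this instead of clicking through Management Studio, the equivalent could look roughly like this. The server name, the database name MicrosoftDynamicsAX and the instance name are assumptions for your environment:

# Create the login for the SSAS service account and grant it read access to the AX database
Invoke-Sqlcmd -ServerInstance 'SQLSERVER01' -Query @"
CREATE LOGIN [NT Service\MSOLAP`$INSTANCE_NAME] FROM WINDOWS;
USE MicrosoftDynamicsAX;
CREATE USER [NT Service\MSOLAP`$INSTANCE_NAME] FOR LOGIN [NT Service\MSOLAP`$INSTANCE_NAME];
ALTER ROLE db_datareader ADD MEMBER [NT Service\MSOLAP`$INSTANCE_NAME];
"@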

Now save the Login, jump back to the processing step and try running it again.