Exporting multiple entities through DMF fails randomly

We’re exporting different portions of data from our Dynamics 365 F&O environment to a BYOD database at different frequencies.

On several occasions – one might even call it frequently – we’ve seen the export fail on some entities (which entities it hits changes in waves), with very limited error messages to troubleshoot from:

Error text:

Export execution failed for entity <Entity name>: Export failed while copying data from staging to destination

The result was the target table containing each changed record twice: a before and an after version.

The solution appears to be splitting the export job into a 1:1 setup with only one entity in each export.

I wish I had a more detailed explanation for this, but in this case it’s an I-don’t-see-the-bug-any-more-so-it-might-be-gone kind of thing.

Unpacking bacpac file throws error

When moving data from a sandbox environment to your dev-box, you need SqlPackage to unpack the bacpac file into a database. This can give you an error message stating that the file contains corrupted data:

Corrupted data error

To get yourself past this, you can download the .NET Core edition of SqlPackage from here:


Unzip the downloaded file to a folder of your choice. In that folder there’s another version of sqlpackage.exe, and that should allow you to continue:

sqlpackage.exe in the new folder


After this was posted a new bug started appearing:

The error text is:

*** Error importing database: Could not import package.

Warning SQL0: The source contains users that rely on an external authentication provider that is not supported by the target. These users will be treated as users without logins.

Error SQL72014: Core Microsoft SqlClient Data Provider: Msg 102, Level 16, State 30, Line 1 Incorrect syntax near ‘type’.

Error SQL72045: Script execution error. 

The solution is the same as the one mentioned above, just with a newer version of the SqlPackage software:

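For reference, the bacpac import itself can be scripted. Below is a sketch of the SqlPackage call with made-up paths and database names; it is echoed rather than executed so you can adjust it first:

```shell
# Hypothetical paths -- point SQLPACKAGE at the extracted .NET Core edition.
SQLPACKAGE="C:/Tools/sqlpackage-netcore/sqlpackage.exe"
BACPAC="C:/Temp/MySandbox.bacpac"

# Echoed as a sketch; drop the 'echo' to actually run the import.
echo "$SQLPACKAGE" /Action:Import \
  /SourceFile:"$BACPAC" \
  /TargetServerName:localhost \
  /TargetDatabaseName:AxDB_FromSandbox
```

Running it from a script also makes it easy to keep pointing at the newer SqlPackage version when the bug described below shows up.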

Visual Studio – run as administrator

This is something that I generally forget whenever I create a new dev-box. Visual Studio needs to be running as administrator; otherwise you’ll get messages such as “Access to the path ‘K:\AosService\PackagesLocalDirectory\Bin\InstalledVersion.json’ is denied”:

Access denied error

and “Visual Studio is not running as administrator. Finance and Operations (Dynamics 365) requires Visual Studio to be running as administrator. Please restart Visual Studio as administrator”:

Run as administrator error

You can always right-click your VS shortcut and select “Run as administrator”, but if you’re even remotely like me, you’ll forget that approximately 82% of the time.

So what you need to do is modify the shortcut you’re using. In my case I modify the shortcut in my taskbar. Right-click the Visual Studio shortcut and, instead of clicking “Run as administrator”, click Properties:

Shortcut context menu

In the properties window you click the Advanced button:

Shortcut properties

In the window that opens up, tick “Run as administrator”:

Run as administrator setting

Click OK twice and you’re done. The next time you use the shortcut, it runs VS as administrator.

Azure SQL firewall access

Some things were easier in the good old days before Dyn365 and Azure, and some are better now. Often the things that were better before tend to be the things we need to relearn.

Allowing firewall access to Azure SQL (for example for BI) is easily scripted, and here are three useful scripts for that purpose:

List current firewall rules:

SELECT * FROM sys.firewall_rules;

If you need to grant access to an IP range (start and end IP as the second and third parameters):

EXEC sp_set_firewall_rule N'The_Name_That_Will_Be_Shown_On_The_List', '<start IP>', '<end IP>';

Getting rid of the above rule is just as simple:

EXEC sp_delete_firewall_rule N'The_Name_That_Will_Be_Shown_On_The_List';

The three above are all server-level rules, so you need to run them against the master database.

You can do the same tricks at database level:

SELECT * FROM sys.database_firewall_rules;

EXEC sp_set_database_firewall_rule N'The_Name_That_Will_Be_Shown_On_The_List', '<start IP>', '<end IP>';

EXEC sp_delete_database_firewall_rule N'The_Name_That_Will_Be_Shown_On_The_List';

Remember to run them against the desired database.


Please keep in mind that you are messing with security, and I – as usual – don’t take any responsibility for any damage you might cause by using the above.
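If you prefer the command line, the same server-level calls can be fired with sqlcmd. The server name, login and IP range below are made-up examples, and the command is echoed as a sketch rather than executed:

```shell
# Hypothetical server and rule values -- adjust before removing 'echo'.
SERVER="myserver.database.windows.net"
ADMIN_USER="sqladmin"
RULE_NAME="BI_Access"
START_IP="203.0.113.10"
END_IP="203.0.113.20"

# Server-level rules live in master, so target that database explicitly.
# sqlcmd will prompt for the password since -P is omitted.
echo sqlcmd -S "$SERVER" -d master -U "$ADMIN_USER" \
  -Q "EXEC sp_set_firewall_rule N'$RULE_NAME', '$START_IP', '$END_IP';"
```

Swap in sp_delete_firewall_rule the same way when the rule has served its purpose.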

Prompting for values in SQL script

This is something that I forget 2 seconds after I’ve used it so now I’ll put it here as a note to self.

I have a series of T-SQL scripts that I use now and then in different circumstances, and often it’s a while between each go. In some of them I need to add environment-specific information, and instead of having to scroll down through the text, the query editor offers a way of prompting for parameters through tagging text like this:

select <Parameter name, Data type, Default value>

Before executing the script you press CTRL+SHIFT+M (this is the part I usually forget), and that makes the editor prompt you like this:

Parameter prompt

Replace the Default value with the appropriate text and click OK. This will update your SQL script to this:

select 1234

Not the most mind-blowing thing in the world, but a great help.
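If you run the same scripts from sqlcmd instead of the query editor, scripting variables give you a similar parameter mechanism. This is a sqlcmd feature, not the SSMS one above, and the file path is purely illustrative:

```shell
# Build a small script that uses a sqlcmd scripting variable; $(RecId)
# is replaced at run time by the value passed with -v.
printf 'select $(RecId)\n' > /tmp/param_demo.sql
cat /tmp/param_demo.sql

# Running it would look like this (echoed as a sketch):
echo sqlcmd -i /tmp/param_demo.sql -v RecId=1234
```

The upside is that the value lives on the command line, so there is nothing to forget inside the editor.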


Enable developer mode in the MPOS without rebuilding

In the Dynamics 365 POS it’s possible to enable a developer mode that gives you some additional options, like seeing the IDs of strings, grid views, coloring aids and so on. To enable these features you need to rebuild the POS with the value config.isDebugMode set to true in the pos.js file. That’s easy in the CPOS but requires some build/deploy in the MPOS.

Our POS developer gave me a tip to shortcut that … a lot. It’s a few easy steps and they go like this:

Enable Developer mode in Windows. That’s done in Settings on the box running the POS.

Developer mode Windows

Start the POS and hit F12 to enable the developer aid

In the bottom of the console type in Commerce.Config.isDebugMode=true and press Enter

Set debug mode

Go to your POS settings to enable the Developer mode flag in here.

Developer mode in POS

This will make some things way easier. Note that when done like this, it only applies to the current session. Doing it the pos.js way is permanent.

POS not showing offline data jobs

In Dynamics 365 the offline story for the MPOS has been significantly improved since AX 2012. Now it’s more or less: click on the register in Dyn365, install the POS and distribute data. A somewhat simplified approach to life, but anyway.

Here’s something I’ve seen a couple of times now. There are just no jobs and nothing to process …

No offline jobs

The event log gives it away a bit:

Failed to get offline sync data in offline database due to Exception. Error Details: Data Source=localhost\SQLEXPRESS;Initial Catalog=RetailOfflineDatabase;Integrated Security=True;Persist Security Info=False;Pooling=True;Encrypt=True;TrustServerCertificate=True

offline error 1

And in the details it’s clear that we’re facing a rights issue. But with SQL Express and no management tools on the POS machine we don’t have many configuration options, and even fewer when we’re in a setup with a large number of registers.

All you need to do is add the user logged in to Windows to these two groups on the local machine:

User groups

Log off and on again to activate the changes. Start your POS and check the database connection status. Hopefully, you should see a lot of jobs now:

Offline jobs

RDP access to Dyn365 servers denied

You know that the server is running, but your RDP connection request is denied.

Connection waiting

Connection denied

This might be due to the IP access check introduced not so long ago on production environment servers. To get around it you need to add your IP address to the whitelist for the environment.

To do this, click Maintain:

Click maintain

Select Enable access:

Enable access

Click + to add a new rule:


Fill in the form with a relevant name and the IP address to whitelist:

Add form

To get your IP you can ask Google or use one of the many web sites offering that service. For example www.whatsmyip.org:


As soon as you have created the new rule you’re good to go (and connect):


RetailTenantUpdateTool.ps1 misses prerequisites

After moving data from one server to another you need to run the RetailTenantUpdateTool PowerShell script. Doing so might present you with this error:

Please download and install below prerequisites:
Microsoft Online Services Sign-In Assistant for IT Professionals, download link: http://go.microsfot.com/fwlink/?LinkID=286152
Azure Active Directory Module for Windows PowerShell (64-bit version), download link: http://go.microsoft.com/fwlink/p/?linkid=236297

PowerShell error

One of them actually links to a proper download, but the other one forwards you to a site telling you that what you’re looking for has retired …

To get moving, install this module before executing the script:

Install-Module -Name MSOnline

Install MSOnline

After completing this install you can run the script without any issues.

Login prompt


Data import random error – Method not found

Moving data between different instances of Dynamics 365 should be relatively easy with the Data Management framework. Microsoft offers a long list of entities and default templates nicely grouped into relevant packages.

In 7.2 we experienced some issues with loading the default templates, but that seemed fixed in 7.3.2. The real problems showed up when trying to import the packages in projects created from the export file.

It seemed like after importing file number one or two the Data Management forms got corrupted with an error stating: “Method not found: ‘Dynamics.AX.Application.FormBuildControl Dynamics.AX.Application.FormBuildControl.parentControl()’.”

Method not found

All the other forms we checked worked. After a full compile we were back on track, but that really wasn’t a workable solution with multiple packages to import.

Debugging the forms didn’t add anything valuable to the troubleshooting. Going back to the start and breaking the import up into smaller chunks, it appeared too random to be data-driven, as first suspected.

The solution – don’t ask how I ended there – is like this:

After each package I cleared the browser cache and refreshed the tab running my Dynamics 365….

Clear browsing data

And here’s a little pitch for Google Chrome, which allows you to clear only the most recent part of the data:

Clear browsing data - last hour

It adds a bit of overhead to the data import having to do this step between each package, but nothing compared to the alternatives.