Changing the way you run Business Central in docker…

The future is here – good bye to images, hello to artifacts!

Freddys blog

Just as you thought you were getting the hang of running Business Central in Docker, you see this title! Your first thought is probably how big the changes to your pipelines and scripts will be, and how much time it will take to make them.



Keeping AL Reports and RDL in sync

I have had some trouble keeping the dataset and the RDL layout of my AL reports in sync. This may seem really obvious if you know the trick, but I could not find clear documentation of the process.

When designing AL reports, if you make a change to the dataset in the report object, such as adding or removing dataitems or columns, you need to make sure your RDL file stays in sync. Adding the columns or dataitems to the RDL by manually editing the XML is a sure-fire recipe for breaking your report.

The hint toward the solution is in the “Creating an RDL Layout Report” article.

When you build your initial report object, use the Build action in Visual Studio Code (CTRL+SHIFT+B) to automatically generate the RDL file from the report.

For this to work, you must specify the following properties on the report:

    DefaultLayout = RDLC;
    RDLCLayout = 'MyRDLCReport.rdl';

Note that the RDLCLayout property is a path within the current project. Your path may actually look something like this:

    RDLCLayout = './src/ReportLayout/MyRDLCReport.rdl';

Once you save your report and build it, the RDL file will be generated automatically in the appropriate folder.
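As a sketch, a minimal report object tying these properties together might look like this (the object ID, names, and the Customer dataitem are my own placeholders, not from the original article):

```
report 50100 "My RDLC Report"
{
    UsageCategory = ReportsAndAnalysis;
    ApplicationArea = All;
    DefaultLayout = RDLC;
    RDLCLayout = './src/ReportLayout/MyRDLCReport.rdl';

    dataset
    {
        dataitem(Customer; Customer)
        {
            // Add or remove columns here, then run Build (CTRL+SHIFT+B)
            // before touching the RDL in Report Builder
            column(No_; "No.") { }
            column(Name; Name) { }
        }
    }
}
```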

This leads us to the solution: if you make any changes to dataitems/columns in your report file, save the report and use Build (CTRL+SHIFT+B), and only then edit the RDL file using SQL Server Report Builder. The Build action will automatically update your RDL file with the new dataitems/columns.


Partners raising tech support tickets for Dynamics 365 Business Central

You may have noticed that, since mid-March 2019, partners can no longer raise technical support tickets on PartnerSource. If you check the banner link on PartnerSource, you will be sent to a page which advises an alternative that is not actually the way to go, as it requires a credit card or prepaid incidents.

An alternative approach, if you are a partner and a member of the “Ready To Go!” programme, is to raise tickets via Collaborate Feedback. However, the tickets I personally raised through the Collaborate/“Ready To Go!” programme took a very long time to get any kind of feedback, and were mostly closed without resolution, with Microsoft leaving the message “you should raise a tech support ticket”.

Microsoft’s feedback on this is that they are in the process (!) of setting up proper support for D365BC. Until this is sorted, the process seems to be as described below. It should result in a tech support ticket that does not require a credit card or subscription – which is what we need when reporting bugs in the base product.

To start, log in to Partner Center and access Customers list.

Customer list on Partner Center

Select one of the customers to open the Service Management page. Now scroll down to the bottom of the page and select New Request.

Service Management

From the available options pick Office 365/Dynamics 365:

Selecting request type – intuitive UI FTW!

This will open the Office support portal and will, by default, show the “Need help?” dialog on the right (1). You can also open the same dialog by clicking the Support section on the left (2):

Need help? Why, what made you think that?

Type the details of your issue in the box and your support request will be created. This is very important: make sure you include the words “Dynamics 365 Business Central” in the text, so the request gets routed to the appropriate team at Microsoft!

Was this article useful to you? Let me know in the comments!


Versioning Per-Tenant Extensions using GitHub Releases

I was reminded (via comments on this issue) that there is still some confusion about the limitations and the process of deploying per-tenant extensions to more than one tenant.

Dmitry Katson wrote about this back in January, as part of his tips and tricks post, but I would like to share how I have approached it from a process point of view. For this purpose, I have created a simple demo app, which you can find on my GitHub.

Much of the following will be quite obvious if you are experienced with GitHub, but if you are coming from the Dynamics NAV world like I did, I hope it will be of some interest.


If we deploy a per-tenant app (i.e. an extension built in the 50000–99999 object range) to a PROD tenant, we can’t deploy an app with the same name and version to any other tenant in the same region unless the Package ID (a GUID value) is exactly the same.
Every compile of the app from VS Code creates a new Package ID – this happens when using F5 to deploy from VS Code, or when building using CTRL+SHIFT+F5.

Because the Package ID was not visible from BC, and it was too easy to overwrite the APP file by recompiling, developers would often increment the app version to deal with this. With the Spring release we can now use the Inspect functionality (CTRL+ALT+F1) on the Extensions page to see the Package ID, but it is still too easy to overwrite the app file.

Package ID is not visible from BC

We can use Get-NAVAppInfo with -Path switch to read Package ID of the initial compiled app:

Retrieving Package ID from App file
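For reference, the call is just (the path here is illustrative):

```
# Read app metadata, including the Package ID, from the compiled file
Get-NAVAppInfo -Path 'C:\Temp\MyPublisher_MyApp_1.0.0.0.app'
```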

After using CTRL+SHIFT+B to recompile the app:

Building app using CTRL+SHIFT+B

Get-NAVAppInfo now shows a different Package ID:

Build generates new Package ID

Unless we are very careful and always commit and push the changes after each compile, we have now lost the version of the file we had deployed to the client’s tenant.

This is, of course, not much of an issue if we only develop our per-tenant app for a single tenant, but if we intend to install the same solution on multiple tenants in the region, we will have to increase the version for each one.

Or will we?


We can use the Release functionality on GitHub to keep track of our releases and the matching source code. It allows us to upload a ZIP file containing the APP file to GitHub, create a Release, and associate it with the state of the source code at the time of the release.

Here is the repository after initialization and the first commit:


On the main page of the repository, I will click on Releases. Because this is the first time I am doing this in this repository, I will click Create New Release.

Releases – first time

In the page that opens, we will fill in the details as highlighted below, attach the zipped APP file, then click “Publish release”. Note that the APP file must be zipped, as GitHub does not recognize the .APP format.

Creating new release

After clicking Publish Release, the following page will open. A few points of interest:

  1. Here we can see the Commit ID. Clicking it will take us to the Commits page, where we can see that the Release is now linked to our last commit, “1st release”.
  2. We can always see whether we are looking at the latest release.
  3. A Release always has the Source Code (zipped and tar/gzipped) from the related commit attached.
Release created

Remember to delete the ZIP file from the project folder – as we have uploaded it to GitHub, there is no reason to keep it in the project too.

Remove ZIP file
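One way to stop the ZIP (or a stray APP file) from sneaking back into the repository is a .gitignore entry – a small sketch, assuming build artifacts end up in the project folder:

```
# .gitignore – keep build artifacts out of source control
*.app
*.zip
```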

The final step is to communicate the link to the latest release to other developers, consultants (and possibly clients), to use when deploying to PROD, or for testing in local Docker.

What next?

As we continue working on our app, we will always be able to retrieve this specific release to deploy to clients’ tenants.

Once changes in app are ready for PROD deployment, we will:

  1. Build and do a final test of our app
  2. Increase version in app.json and turn off ShowMyCode
  3. Create final APP file
  4. Commit and push all of the changes to GitHub
  5. Zip the final APP file
  6. Create new release, and attach zipped file

I hope you find this article useful, and that it saves you at least some of the headaches I had faced dealing with per-tenant extensions.


How to find source of INCONSISTENT error in Dynamics NAV or on-premise Dynamics 365 Business Central

Every few years I run into the INCONSISTENT error at one client or another. Usually it is a sign that some “hacky” coding has been done that messes with the standard and produces an unbalanced journal when posting.

A few days ago I ran into the same issue, but this time it was a combination of a long-used add-on (a banking solution, in use for 10+ years) and localisation functionality (withholding tax) that users had only now started using.

As the INCONSISTENT error only pops up once posting is complete, this was quite difficult to debug and get to the bottom of.

You could, for example, turn on the debugger, go through the entire process line by line, and take note of the G/L Entries posted along the way.

Or you could set a debugger breakpoint just before the INCONSISTENT error, and use it to get the data you need from SQL Server.

This is exactly what I did: I placed a breakpoint prior to the INCONSISTENT error, started the debugger, and then posted the journal.

I then hopped over to SQL Server and wrote a simple query, shown below. I have used a similar solution before to check on the status of long-running updates, such as the Adjust Cost – Item Entries job, which has historically been known to go into infinite loops.

Transaction isolation level Read Uncommitted lets you see all entries in the database, even uncommitted ones. While the breakpoint holds the transaction open, you can grab the data as it would look once posted and analyze it to your heart’s content.


  • Never use this in production systems. The table in question stays locked while you are playing on the SQL Server, and no posting will be possible!
  • In most cases, the Posting Preview function will give you the same results without the risk of locking. Unfortunately, this was on a version of NAV before Posting Preview was introduced.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT TOP (1000) [timestamp]
      ,[Entry No_]
      ,[G_L Account No_]
      ,[Posting Date]
      ,[Document Type]
      ,[Document No_]
      ,[Bal_ Account No_]
      ,[Global Dimension 1 Code]
      ,[Global Dimension 2 Code]
      ,[User ID]
      ,[Source Code]
      ,[System-Created Entry]
  FROM [DatabaseName].[dbo].[CompanyName$G_L Entry]
  ORDER BY [Entry No_] DESC


Text fields on Dynamics NAV 2016 cues

It has been a long time since the last post – not because I was lazy, but because there was too much to do 🙂

I had some fun trying to get a Text FlowField to show on a NAV 2016 Activity page.

Per the NAV documentation, Cues are supposed to support Integer, Decimal, and Text fields as of NAV 2015. The funny thing is, there are no examples of Text fields on the standard NAV cues in 2016. Not a single one.

So when I tried to add one, all I got was a rather useless stack image with a count of 0 and the text field’s title.

Fast forward an hour or so of trying various things.

The trick is to add the Text field OUTSIDE of the CueGroup. It needs to be at the same level as the CueGroup; otherwise NAV assumes it needs to be shown as a Stack, which is a bit useless.
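In the exported page text format this looks roughly like the sketch below (the IDs, field names, and trailing annotations are my own illustration, not from the original post):

```
{ 1 ;0 ;Container ;ContainerType=ContentArea }
{ 2 ;1 ;Group     ;GroupType=CueGroup }             // cues rendered as tiles
{ 3 ;2 ;Field     ;SourceExpr="Open Documents" }    // Integer cue - inside the CueGroup
{ 4 ;1 ;Field     ;SourceExpr="Status Text" }       // Text FlowField - OUTSIDE, same level as the CueGroup
```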



NB: This is a rewritten version of an earlier article.

You could write:

Record.SETFILTER(FieldName, '''''');

You should write:

Record.SETFILTER(FieldName, '%1', '');

Or even better:

Record.SETRANGE(FieldName, '');

The above is what I wrote before I went on leave.

Funnily enough, after getting back from my leave I ran into an issue that directly relates and which made me rewrite this blog post.

SETFILTER will, in most cases, be able to replicate SETRANGE functionality.

However, if there are reserved characters within the filter text, these will be interpreted differently if you use:

SETFILTER(FieldName, txtFilter);


or:

SETFILTER(FieldName, '%1', txtFilter);

In the first case, txtFilter is parsed as a filter expression. This will cause runtime errors if txtFilter contains something like the “(” character.

In the second case the filter is applied without parsing it. This replicates SETRANGE functionality, or quoting the value ('txtFilter') in a filter dialog in the user interface.
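A small illustration of the difference (the table, field, and value are my own example):

```
txtFilter := 'ACME (HQ)';

// Parsed as a filter expression: "(" is a grouping token in filter
// syntax, so this raises a runtime error
Customer.SETFILTER(Name, txtFilter);

// Substituted via %1: the value is taken literally, equivalent to
// Customer.SETRANGE(Name, txtFilter)
Customer.SETFILTER(Name, '%1', txtFilter);
```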


Viewing NAV code from RTC

Remember the last post? I needed a way to easily view and edit large chunks of text, and used Memo BLOBs for that.

What I was actually after was a way to display NAV objects in text format in RTC.
While the Memo trick worked up to a point, it didn’t play nicely with the long lines of code found in NAV objects, and a multi-line textbox doesn’t let you scroll horizontally.

While trying to come up with a way to solve this, I had to do some debugging for something unrelated.

If you ever used debugger in NAV 2013 or later, I am sure you know where this is going…

Enter: Code Viewer control

I looked at the way Code Viewer control was used in standard NAV debugger. For reference, take a look at pages 9500 and 9504 in NAV 2015.

I noticed that all the control was doing was displaying large chunks of text streamed from a BLOB. I played with it for a bit and came up with this:

For this recipe, you will need:
– 1 table with BLOB field of type Memo (to hold the object in text format)
– 1 page with Code Viewer control – set Control Add-In Type = Microsoft.Dynamics.Nav.Client.CodeViewer;PublicKeyToken=31bf3856ad364e35
This code snippet:

CodeStream@1007 : InStream;
BreakpointCollection@1006 : DotNet "'Microsoft.Dynamics.Nav.Client.CodeViewerTypes, Version=, Culture=neutral, PublicKeyToken=31bf3856ad364e35'.Microsoft.Dynamics.Nav.Client.CodeViewerTypes.BreakpointCollection";
Code@1005 : BigText;
LineNo@1000 : Integer;
CodeText@1001 : Text;

  CALCFIELDS("BLOB Reference");
  "BLOB Reference".CREATEINSTREAM(CodeStream);
  LineNo := 1;
  // Stream the BLOB line by line into the BigText bound to the control
  WHILE NOT CodeStream.EOS DO BEGIN
    CodeStream.READTEXT(CodeText);
    Code.ADDTEXT(CodeText);
    LineNo += 1;
  END;
  IF Code.LENGTH = 0 THEN
    Code.ADDTEXT('Nothing to see, move along');

Now this gives us a nice & shiny, user-friendly display of the code. It even highlights the keywords in the code, making it easier to read. It is not, unfortunately, editable – that would have been a nice coup 🙂

Having done this, I had a thought – where exactly does NAV debugger get NAV code from when it needs to display it?
The answer was pretty simple, actually – there is a BLOB field in Object Metadata table called “User AL Code”.

Just exchange the above code with this, where Type and ID are fields from Object table:

IF ObjectMetadata.GET(Type,ID) THEN BEGIN
  ObjectMetadata.CALCFIELDS("User AL Code");
  ObjectMetadata."User AL Code".CREATEINSTREAM(CodeStream);
  LineNo := 1;
END;

You will get this nice view of NAV code. There are two added benefits:

  1. You don’t actually have to have the object in question in your license to see the code: this helps nicely when you need a quick glance at such objects.
  2. You can see what code is actually executed, if you are unsure whether an old version of the code may still be held by the server.


And now for an added bonus – you can use the field “User Code” instead of “User AL Code” in the last snippet to see the C# code:



Local and Global variables with same name – a cautionary tale

There is a function GetReport in Codeunits 82 & 92 (it is still there in NAV 2015, and has been there since at least NAV 2009), and it decides which report gets printed. You would think it is intended to be callable from anywhere in NAV, but you would be wrong.

The catch is in the SalesHeader/PurchHeader variable, where the same name was used both for the parameter of the function and for a Global variable.

When the function is called from CU82/92, it works fine (as it references the Global variable).

Try calling it from elsewhere, and you get <Uninitialized> on the variable in the All pane of the debugger. Look in the Locals pane, and there it is again, but this time it contains data.

Unfortunately, the Global variable takes precedence and the parameter gets ignored…
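Sketched out, the collision looks like this (simplified; the real function also handles PurchHeader):

```
// Codeunit 82 - Globals include: SalesHeader : Record "Sales Header"

PROCEDURE GetReport(VAR SalesHeader : Record "Sales Header");
BEGIN
  // References to SalesHeader here resolve to the Global, which is
  // populated when CU82 calls GetReport internally - but empty when
  // an external caller passes its own record in the parameter.
END;
```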
