C/AL to AL Data Upgrade Automation PowerShell Script

Eric Wauters (Waldo) shared a great tool built on top of the existing Microsoft.Dynamics.Nav.Model.Tools.dll, the library behind the PowerShell merge cmdlets provided by Microsoft, which allows us to run all kinds of analysis on NAV/BC objects (man, you rock!).

More info here: https://www.waldo.be/2019/02/15/c-al-source-code-analysis-with-powershell/

And here: https://github.com/waldo1001/Waldo.Model.Tools

I have created a script that uses this great tool to automate part of the object creation needed in the data migration process.

The script is available on GitHub and a pull request will be made. Hopefully Waldo will accept it and include it in the original repo.

https://github.com/ricardopaiva/Waldo.Model.Tools/tree/feature/data_migration_demo

Branch: feature/data_migration_demo

Added files: DataMigrationModule.psm1 and DataMigrationScript.ps1, in the Data Migration folder.

UPDATE 2020/03/29: I’ve made a few updates to the script. Table filtering is now possible as well. Waldo accepted and merged both pull requests, so check the most recent version on his GitHub page: https://github.com/waldo1001/Waldo.Model.Tools

To give you context, I’ve done a data upgrade from C/AL to AL (NAV 2009 to Business Central v14). If you want to know how I handled the data migration from C/AL tables to AL table extensions, please take a look at the following article on my blog: https://ricardomoinhos.com/c-side-to-al-table-data-migration/

The script I’ve created automates the following steps:

  • Creation of the temp tables (used to migrate the data in custom fields from the actual tables to the temp tables);
  • Creation of the C/AL data migration codeunit (used to migrate the data in custom fields from the actual tables to the temp tables created earlier);
  • Creation of the AL data migration codeunit (used to migrate the data in custom fields from the temp tables to the final fields in the table extensions);
  • There is also a function available to create a CSV with all the objects/fields processed, in case you need, for example, to list all custom fields.

How to use

  1. Make sure that Waldo’s tool is installed and working;
  2. Edit _LoadObjects.ps1 and update the objects file path in the first line;
$ObjectsFile = "Objects Folder\Objects.txt" #point to a file that exists for you (it is advised to create a complete text export of a BC/NAV database)
$ModuleToolAPIPath = Join-Path $PSScriptRoot "The Magic\NavModelToolsAPI.dll"
import-module $ModuleToolAPIPath -WarningAction SilentlyContinue -Verbose
$Model = Get-NAVObjectModel -NavObjectsTextFile $ObjectsFile -TimeExecution
Write-Host $Model.NAVObjects.Count 
  3. Run the _LoadObjects.ps1 script to load the object model into memory before running DataMigrationScript.ps1. This script file is part of Waldo’s tool; you can run it as-is, or copy its content into your own script file (for example, into DataMigrationScript.ps1);
  4. Update and run the DataMigrationScript.ps1 file according to your needs;
$DataMigrationModulePath = Join-Path $PSScriptRoot "DataMigrationModule.psm1"
Import-Module $DataMigrationModulePath -WarningAction SilentlyContinue -Verbose -Force
if ($Model.NAVObjects.Count -eq 0) {
   Write-Host -ForegroundColor Red "No objects were found. Load objects first using _LoadObjects.ps1 script"
   return
}

CreateCALFieldsCSV 'C:\Temp\ObjectsAnalysis.csv' `
                   -fieldFilter '50000..90000|31000000..31099999'
CreateCALTempTables 'C:\Temp\CALTempTables.txt' `
                    -fieldFilter '50000..90000|31000000..31099999'
CreateCALTempDataUpgradeCodeunit 'C:\Temp\TempDataMigrationCodeunit.txt' `
                                 -fieldFilter '50000..90000|31000000..31099999'
CreateALDataUpgradeCodeunit 'C:\Temp\ALDataUpgradeCodeunit.al' `
                            -fieldFilter '50000..90000|31000000..31099999' 

This is the demo script file that exemplifies how to use the functions I’ve created. In this example I’m filtering the field ranges 50000 to 90000 (custom fields – just for demo purposes, because I’ve got none in the objects used) and 31000000 to 31099999 (the Portuguese localization field range).
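The field filter is a string of pipe-separated ranges. As a minimal sketch of how such a filter string could be interpreted (Test-FieldInFilter is a hypothetical helper written for illustration; it is not part of the module):

```powershell
# Hypothetical helper (not part of the module): checks whether a field id
# matches a filter string such as '50000..90000|31000000..31099999'.
function Test-FieldInFilter {
    param(
        [int]$FieldId,
        [string]$FieldFilter
    )
    # split on '|' (regex, so the pipe must be escaped)
    foreach ($range in $FieldFilter -split '\|') {
        if ($range -match '^(\d+)\.\.(\d+)$') {
            # 'low..high' range: inclusive bounds check
            if ($FieldId -ge [int]$Matches[1] -and $FieldId -le [int]$Matches[2]) {
                return $true
            }
        } elseif ([int]$range -eq $FieldId) {
            # a single field id is also accepted
            return $true
        }
    }
    return $false
}

Test-FieldInFilter -FieldId 31022890 -FieldFilter '50000..90000|31000000..31099999' # True
Test-FieldInFilter -FieldId 10 -FieldFilter '50000..90000|31000000..31099999'       # False
```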

Output

For demo purposes I’ve exported six objects as text and joined them in a single file.
In VSCode I’ve opened and run the _LoadObjects.ps1 and the DataMigrationScript.ps1 files.

I’ve used the same output path in every function, so the script will create the following:

  • Creation of the CSV file with all tables and fields processed:
    • A single CSV file is created with all the objects and fields processed.
  • Creation of the temp tables:
    • A single text file is created with each table processed.
    • In the example I only want to export data from custom fields, so I’ve entered a field range filter; the text file contains only the table key fields and the custom fields within the range.
  • Creation of the C/AL data migration codeunit:
    • A single text file is created with the data migration codeunit, referring to all the tables processed.
  • Creation of the AL data migration codeunit:
    • A single .al file is created with the data migration codeunit.

Output files

The Objects.txt file contains the following objects:

  • Table 3 Payment Terms
  • Table 4 Currency
  • Table 5 Finance Charge Terms
  • Table 8 Language
  • Table 15 G/L Account
  • Table 37 Sales Line

The files generated in the output path are the following:

  • Function CreateCALFieldsCSV – File output: ObjectsAnalysis.csv
 Object Type,Object Id,Object Name,Field Id,Field Name,Field Type
   Table,3,Payment Terms,31022914,VAT Distribution,Option
   Table,4,Currency,31022890,Decimal Currency Text,Text30
   Table,4,Currency,31022891,Decimal places decimal Curr.,Integer
   Table,4,Currency,31022913,Bill Groups - Collection,Boolean
   Table,4,Currency,31022914,Bill Groups - Discount,Boolean
   Table,4,Currency,31022915,Payment Orders,Boolean
   Table,5,Finance Charge Terms,31022890,Sign on Issuing,Boolean
   Table,15,G/L Account,31022890,Income Stmt. Bal. Acc.,Code20
   Table,15,G/L Account,31022891,Ignore Discounts,Boolean
   Table,15,G/L Account,31022892,Associate DRF Code,Boolean
   Table,15,G/L Account,31022895,Cash-Flow Code,Code10
   Table,15,G/L Account,31022896,Cash-flow Code Assoc.,Boolean
   Table,15,G/L Account,31022897,DRF Code,Code10
   Table,15,G/L Account,31022970,Transfer Account Type,Option
   Table,15,G/L Account,31022971,Transfer Account No.,Code20
   Table,15,G/L Account,31022975,Taxonomy Code,Integer
   Table,37,Sales Line,31022890,ND %,Decimal
   Table,37,Sales Line,31022891,ND Difference,Decimal
   Table,37,Sales Line,31022892,Not in Vat Report,Boolean
   Table,37,Sales Line,31022893,Pmt. Disc. Given Amount,Decimal
   Table,37,Sales Line,31022896,Stamp Duty Code,Code10
   Table,37,Sales Line,31022897,DRF Code,Code10
   Table,37,Sales Line,31022898,Credit-to Doc. No.,Code20
   Table,37,Sales Line,31022899,Credit-to Doc. Line No.,Integer
   Table,37,Sales Line,31022970,Withholding Tax Code,Code20
   Table,37,Sales Line,31022971,Withholding Tax %,Decimal
   Table,37,Sales Line,31022972,Withholding Tax Account,Code20
   Table,37,Sales Line,31022973,Withholding Tax Amount,Decimal
   Table,37,Sales Line,31022974,Orig. Withholding Tax Amount,Decimal
   Table,37,Sales Line,31022975,Max. Correction Amount,Decimal
   Table,37,Sales Line,31022976,Prepayment ND %,Decimal 
  • Function CreateCALTempTables – File output: CALTempTables.txt
 OBJECT Table 50000 Temp Payment Terms
 {
   FIELDS
   {
     { 1;;Code;Code10 }
     { 31022914;;VAT Distribution;Option; OptionString=First Installment,Last Installment,Proportional }
   }
   KEYS
   {
     {    ;Code; }
   }
 }
 OBJECT Table 50001 Temp Finance Charge Terms
 {
   FIELDS
   {
     { 1;;Code;Code10 }
     { 31022890;;Sign on Issuing;Boolean }
   }
   KEYS
   {
     {    ;Code; }
   }
 }
 OBJECT Table 50002 Temp Currency
 {
   FIELDS
   {
     { 1;;Code;Code10 }
     { 31022890;;Decimal Currency Text;Text30 }
     { 31022891;;Decimal places decimal Curr.;Integer }
     { 31022913;;Bill Groups - Collection;Boolean }
     { 31022914;;Bill Groups - Discount;Boolean }
     { 31022915;;Payment Orders;Boolean }
   }
   KEYS
   {
     {    ;Code; }
   }
 }
 OBJECT Table 50003 Temp G/L Account
 {
   FIELDS
   {
     { 1;;No.;Code20 }
     { 31022890;;Income Stmt. Bal. Acc.;Code20 }
     { 31022891;;Ignore Discounts;Boolean }
     { 31022892;;Associate DRF Code;Boolean }
     { 31022895;;Cash-Flow Code;Code10 }
     { 31022896;;Cash-flow Code Assoc.;Boolean }
     { 31022897;;DRF Code;Code10 }
     { 31022970;;Transfer Account Type;Option; OptionString= ,G/L Account,Customer,Vendor }
     { 31022971;;Transfer Account No.;Code20 }
     { 31022975;;Taxonomy Code;Integer }
   }
   KEYS
   {
     {    ;No.; }
   }
 }
 OBJECT Table 50004 Temp Sales Line
 {
   FIELDS
   {
     { 1;;Document Type;Option; OptionString=Quote,Order,Invoice,Credit Memo,Blanket Order,Return Order }
     { 3;;Document No.;Code20 }
     { 4;;Line No.;Integer }
     { 31022890;;ND %;Decimal }
     { 31022891;;ND Difference;Decimal }
     { 31022892;;Not in Vat Report;Boolean }
     { 31022893;;Pmt. Disc. Given Amount;Decimal }
     { 31022896;;Stamp Duty Code;Code10 }
     { 31022897;;DRF Code;Code10 }
     { 31022898;;Credit-to Doc. No.;Code20 }
     { 31022899;;Credit-to Doc. Line No.;Integer }
     { 31022970;;Withholding Tax Code;Code20 }
     { 31022971;;Withholding Tax %;Decimal }
     { 31022972;;Withholding Tax Account;Code20 }
     { 31022973;;Withholding Tax Amount;Decimal }
     { 31022974;;Orig. Withholding Tax Amount;Decimal }
     { 31022975;;Max. Correction Amount;Decimal }
     { 31022976;;Prepayment ND %;Decimal }
   }
   KEYS
   {
     {    ;Document Type,Document No.,Line No.; }
   }
 }
  • Function CreateCALTempDataUpgradeCodeunit – File output: TempDataMigrationCodeunit.txt
OBJECT Codeunit 50000 To Temp Tables Data Upgrade
 {
   PROPERTIES
   {
     Subtype=Upgrade;
   }
   CODE
   {
     VAR
       DataUpgradeMgt@1000 : Codeunit 9900;

    [TableSyncSetup]
    PROCEDURE GetTableSyncSetupCustom@50000(VAR TableSynchSetup@1000 : Record 2000000135);
    BEGIN
      DataUpgradeMgt.SetTableSyncSetup(DATABASE::"Payment Terms",DATABASE::"Temp Payment Terms",TableSynchSetup.Mode::Copy);
      DataUpgradeMgt.SetTableSyncSetup(DATABASE::"Finance Charge Terms",DATABASE::"Temp Finance Charge Terms",TableSynchSetup.Mode::Copy);
      DataUpgradeMgt.SetTableSyncSetup(DATABASE::"Currency",DATABASE::"Temp Currency",TableSynchSetup.Mode::Copy);
      DataUpgradeMgt.SetTableSyncSetup(DATABASE::"G/L Account",DATABASE::"Temp G/L Account",TableSynchSetup.Mode::Copy);
      DataUpgradeMgt.SetTableSyncSetup(DATABASE::"Sales Line",DATABASE::"Temp Sales Line",TableSynchSetup.Mode::Copy);
    END;

    BEGIN
    END.
  }
}
  • Function CreateALDataUpgradeCodeunit – File output: ALDataUpgradeCodeunit.al
codeunit 50100 "Data Upgrade Codeunit"
 {
   Subtype = Upgrade;
 trigger OnRun()
   begin
   end;
 local procedure Payment_Terms()
   var
     DestinyTable: Record "Payment Terms";
     OriginTable: Record "Temp Payment Terms";
   begin
     With OriginTable do begin
       if FindSet() then
       repeat
         DestinyTable.Init();
         DestinyTable."VAT Distribution" := OriginTable."VAT Distribution";
         DestinyTable.Insert;
       until next = 0;
       //DeleteAll();
     end;
   end;
 local procedure Finance_Charge_Terms()
   var
     DestinyTable: Record "Finance Charge Terms";
     OriginTable: Record "Temp Finance Charge Terms";
   begin
     With OriginTable do begin
       if FindSet() then
       repeat
         DestinyTable.Init();
         DestinyTable."Sign on Issuing" := OriginTable."Sign on Issuing";
         DestinyTable.Insert;
       until next = 0;
       //DeleteAll();
     end;
   end;
 local procedure Currency()
   var
     DestinyTable: Record "Currency";
     OriginTable: Record "Temp Currency";
   begin
     With OriginTable do begin
       if FindSet() then
       repeat
         DestinyTable.Init();
         DestinyTable."Decimal Currency Text" := OriginTable."Decimal Currency Text";
         DestinyTable."Decimal places decimal Curr." := OriginTable."Decimal places decimal Curr.";
         DestinyTable."Bill Groups - Collection" := OriginTable."Bill Groups - Collection";
         DestinyTable."Bill Groups - Discount" := OriginTable."Bill Groups - Discount";
         DestinyTable."Payment Orders" := OriginTable."Payment Orders";
         DestinyTable.Insert;
       until next = 0;
       //DeleteAll();
     end;
   end;
 local procedure GL_Account()
   var
     DestinyTable: Record "G/L Account";
     OriginTable: Record "Temp G/L Account";
   begin
     With OriginTable do begin
       if FindSet() then
       repeat
         DestinyTable.Init();
         DestinyTable."Income Stmt. Bal. Acc." := OriginTable."Income Stmt. Bal. Acc.";
         DestinyTable."Ignore Discounts" := OriginTable."Ignore Discounts";
         DestinyTable."Associate DRF Code" := OriginTable."Associate DRF Code";
         DestinyTable."Cash-Flow Code" := OriginTable."Cash-Flow Code";
         DestinyTable."Cash-flow Code Assoc." := OriginTable."Cash-flow Code Assoc.";
         DestinyTable."DRF Code" := OriginTable."DRF Code";
         DestinyTable."Transfer Account Type" := OriginTable."Transfer Account Type";
         DestinyTable."Transfer Account No." := OriginTable."Transfer Account No.";
         DestinyTable."Taxonomy Code" := OriginTable."Taxonomy Code";
         DestinyTable.Insert;
       until next = 0;
       //DeleteAll();
     end;
   end;
 local procedure Sales_Line()
   var
     DestinyTable: Record "Sales Line";
     OriginTable: Record "Temp Sales Line";
   begin
     With OriginTable do begin
       if FindSet() then
       repeat
         DestinyTable.Init();
         DestinyTable."ND %" := OriginTable."ND %";
         DestinyTable."ND Difference" := OriginTable."ND Difference";
         DestinyTable."Not in Vat Report" := OriginTable."Not in Vat Report";
         DestinyTable."Pmt. Disc. Given Amount" := OriginTable."Pmt. Disc. Given Amount";
         DestinyTable."Stamp Duty Code" := OriginTable."Stamp Duty Code";
         DestinyTable."DRF Code" := OriginTable."DRF Code";
         DestinyTable."Credit-to Doc. No." := OriginTable."Credit-to Doc. No.";
         DestinyTable."Credit-to Doc. Line No." := OriginTable."Credit-to Doc. Line No.";
         DestinyTable."Withholding Tax Code" := OriginTable."Withholding Tax Code";
         DestinyTable."Withholding Tax %" := OriginTable."Withholding Tax %";
         DestinyTable."Withholding Tax Account" := OriginTable."Withholding Tax Account";
         DestinyTable."Withholding Tax Amount" := OriginTable."Withholding Tax Amount";
         DestinyTable."Orig. Withholding Tax Amount" := OriginTable."Orig. Withholding Tax Amount";
         DestinyTable."Max. Correction Amount" := OriginTable."Max. Correction Amount";
         DestinyTable."Prepayment ND %" := OriginTable."Prepayment ND %";
         DestinyTable.Insert;
       until next = 0;
       //DeleteAll();
     end;
   end;
 trigger OnUpgradePerCompany()
   begin
     Payment_Terms;
     Finance_Charge_Terms;
     Currency;
      GL_Account;
     Sales_Line;
   end;
 }

Considerations

  • Currently, filtering only works on fields. For example, if you apply the filter ‘50000..59999’ and you have a custom table with id 50000 whose fields are numbered 1, 2, 3, etc., the script won’t process it, because those fields are not within the range.
  • Adding a parameter to filter tables instead is straightforward, but I didn’t do that in this version;
  • UPDATE 2020/03/29: It is now possible to filter tables and/or fields;
  • Processing a full NAV 2017 objects file took around 20 minutes (5 minutes per function). This could be optimized by combining all the functionality into a single function, so that tables and fields are enumerated only once.
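The single-pass optimization mentioned in the last point could be sketched roughly as below. The object shape here ($mockModel with Type, Id, Name and Fields properties) is purely illustrative and may differ from the properties Waldo's model actually exposes:

```powershell
# Single-pass sketch: enumerate tables/fields once and feed every output
# (CSV lines, temp-table field definitions, ...) from the same loop.
# The $mockModel shape below is illustrative only, not Waldo's real model.
$mockModel = @(
    [pscustomobject]@{ Type = 'Table'; Id = 4; Name = 'Currency'; Fields = @(
        [pscustomobject]@{ Id = 1;        Name = 'Code';                   FieldType = 'Code10' },
        [pscustomobject]@{ Id = 31022890; Name = 'Decimal Currency Text'; FieldType = 'Text30' }
    )}
)

$csvLines   = New-Object 'System.Collections.Generic.List[string]'
$tempFields = New-Object 'System.Collections.Generic.List[string]'

foreach ($obj in $mockModel | Where-Object { $_.Type -eq 'Table' }) {
    foreach ($field in $obj.Fields) {
        # apply the field filter once; every writer consumes the same hit
        if ($field.Id -ge 31000000 -and $field.Id -le 31099999) {
            $csvLines.Add("Table,$($obj.Id),$($obj.Name),$($field.Id),$($field.Name),$($field.FieldType)")
            $tempFields.Add("{ $($field.Id);;$($field.Name);$($field.FieldType) }")
        }
    }
}
```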

Final Words

I hope you find this script useful. Please leave your comments and let me know if there’s something that can be improved. If you find any error, please let me know as well.
