Microsoft Flow for AX7 integration – Part 2 – importing data from a local file

In Part 1 of the series, I shared an example that shows how Microsoft Flow works in general and how it connects to Microsoft Dynamics 365 for Operations (also known as AX7). This part explains the usage of the on-premises data gateway to work with files on the local computer or a local network share. It also shows how to loop through the source data and run Microsoft Flow actions for every data record.

Scenario

  1. Whenever a JSON file appears in a source folder (trigger),
  2. Read the content of the file (action),
  3. Loop through all the data records to find customer records (loop),
  4. For every customer record, call AX via OData to create the customer in AX (action),
  5. After going through all the records, move the source file into an archive folder (action).

On-premises data gateway installation and configuration

Moving AX to the cloud meant that the previous file-based scenarios (DIXF, custom-coded, or using ISV solutions) no longer work, because the AOS that runs in Azure cannot read or write files on the local network. This has been a big concern, as for many systems, file-based integration is the only available approach. To get files processed and the data passed to AX via OData calls, custom services, or the file API, one needs an AX-capable ESB solution or a custom piece of software. Microsoft has published some C# code examples on GitHub, under https://github.com/Microsoft/Dynamics-AX-Integration, that address this problem, but they are little more than sample code.

Microsoft Flow can provide an alternative way to get the local data to the cloud and back. It has a service called the “on-premises data gateway”.

https://flow.microsoft.com/en-us/documentation/gateway-reference/

In short, this is a Windows service that runs on a local machine (probably a server) and connects to Microsoft Flow using Azure Service Bus. The connection is outbound-only so there is no need to open any inbound ports. The gateway can handle files from the local file system, SQL queries against a local SQL Server, and more.

Therefore, before proceeding with the example, download and install the service on a local computer, according to the instructions at the link above (there is a download link within the instructions).

Next, create a folder structure on your local machine to hold the files to exchange. I have created the following structure for my examples:

Folder structure

You need to create a separate folder for every input file type, as Microsoft Flow cannot search for files by type or wildcard name.

After the gateway has been installed, it’s time to go back to Flow and define a connection that uses the gateway.

  1. Click the gear icon next to your username at the top right and choose Connections.
  2. Click “Create connection” at the top.
  3. From the list of available connections, choose File System.
  4. The parameters form opens for the file system connection. Choose the proper values.
    File system gateway setup
  5. Click on Create connection to complete the process. The local folder can now be used in your flows.

Defining the flow

First of all, you need an example input file to define the structure. I am using the following file with customer account numbers and names in JSON format:

{"NewCustomers": [
	{"Id": "FT2", "Name": "Flow Test 2"},
	{"Id": "FT3", "Name": "Flow Test 3"}
]}
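As a reference point for the Parse JSON step used later, the “Use sample payload” option generates a JSON schema from a file like this. The schema below is an illustration of the expected shape, not verbatim Flow output:

```json
{
  "type": "object",
  "properties": {
    "NewCustomers": {
      "type": "array",
      "items": {
        "type": "object",
        "properties": {
          "Id": { "type": "string" },
          "Name": { "type": "string" }
        },
        "required": [ "Id", "Name" ]
      }
    }
  }
}
```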

 

  1. Create a new flow
  2. For the trigger, choose “File System – When one or more files are created”
    Select the folder that the trigger will be monitoring.
    File create trigger
  3. First action: “File System – Get file content” to load the file into Flow as an object
    Use the dynamic field “Path” from the trigger to specify the file to read.
  4. Next step: “Data Operations – Parse JSON” to convert the file content into a JSON object for processing.
    Specify the file content from the previous step as the content to parse.
    For the JSON schema, the easiest way is to click on “Use sample payload” and paste the example JSON to generate the schema.
  5. As the next step, add a loop, “Add an apply to each”, and specify the array of customers as the input.
  6. Add an action within the loop, “Dynamics 365 for Operations – Create record”
    Use dynamic values for the 2 fields from the input file and fixed values for the rest.
  7. Add the next step OUTSIDE the loop to move the input file after processing.
    As there is no move action available, we need 2 steps: Copy and Delete.
    Note: the destination file path in the copy action must be specified as relative to the connection’s main folder. This is different from other actions and parameters and will hopefully be changed by Microsoft in the future, to be more consistent.
  8. All done – save the flow.
  9. Test the flow by creating a file in the input folder. Then go back to the list of flows and click on the icon next to the flow to view its run history.
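Under the hood, the “Create record” action in step 6 boils down to an OData POST against AX. The sketch below shows roughly what that request looks like in Python; the entity set name (“Customers”), the field names, and the authentication token are assumptions and will vary with the AX version and the data entity you target:

```python
import json
import urllib.request

def build_create_customer_request(base_url: str, token: str,
                                  cust_id: str, name: str) -> urllib.request.Request:
    """Build the OData POST that the 'Create record' action issues.

    Entity set and field names are illustrative; check the real data
    entity in your AX environment before using them.
    """
    body = json.dumps({
        "CustomerAccount": cust_id,   # dynamic value from the input file
        "Name": name,                 # dynamic value from the input file
        "CustomerGroupId": "10",      # a fixed value, as in the Flow action
    }).encode("utf-8")
    return urllib.request.Request(
        base_url + "/data/Customers",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + token,  # Azure AD token in practice
        },
        method="POST",
    )
```

Flow handles the Azure AD authentication and retry logic for you, which is much of the appeal over hand-rolled code like this.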

Notes:

  • At the time of writing this, the “Parse JSON” action does not work properly if the input file has a .json extension – it appears the file content type is recognised based on the extension. Use a .txt extension instead, for example “newcustomers.json.txt”.
  • Under the “free” license, the files are polled every 5 minutes, and sometimes it takes 2 polls (10 minutes) to recognise a new file. Also, you need to create the file after saving the flow; files already present in the folder will not be recognised.

Looking at the history:

Flow run history

Looking at the file folders, I can see that the file has been moved to the archive.

Looking into AX:

The new customers in AX

Conclusion

Although the file system service would still need some improvements, it works well as a proof of concept.

Suggestions to Microsoft:

  • Recognise file content regardless of the file extension (a .json extension, for example, does not produce a text value)
  • Parse other file formats, such as XML or CSV. I have not found a solution for this yet.
  • Provide more details on how new-file recognition works and what the differences are between the two triggers, “files are created” and “files are added or modified”. Sometimes I have seen new files being skipped by the gateway.

Final note: a similar scenario could be built with files on an SFTP server or in cloud storage, e.g. OneDrive, without using the on-premises data gateway – corresponding actions are available in Flow. The file storage location will depend on the implementation requirements.
