In this post, we will review the integration between D365F&O and Azure Data Lake.
We will see how to configure and enable the Data Lake feature in F&O and how to activate the sync for tables from the UI.
Then we will see how the data is stored in Data Lake Gen2 storage and how to access this data from Microsoft Power BI.
Note: at this moment the Data Lake feature is in preview in certain regions and is only available in Tier-2 or higher sandbox environments.
Azure Data Lake Storage Gen2 is a set of capabilities dedicated to big data analytics, built on Azure Blob Storage.
Data lakes provide cloud storage that is less expensive than the cloud storage that relational databases provide. Therefore, large amounts of data can be stored in the cloud.
This data includes both business data that is traditionally stored in business systems and data warehouses, and device and sensor data, such as signals from IoT devices. In addition, Data Lake supports a range of tools and programming languages that enable large amounts of data to be reported on, queried, and transformed.
Data stored in the Data Lake complies with the Common Data Model folder standard; Data Lake Storage Gen2 adds a hierarchical namespace on top of Blob storage.
The hierarchical namespace organizes objects/files into a hierarchy of directories for efficient data access. A common object store naming convention uses slashes in the name to mimic a hierarchical directory structure.
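To make the naming convention concrete, here is a small sketch (the blob names below are made up for illustration) of how slash-separated keys in a flat object store imply a directory hierarchy:

```python
# Illustrative sketch: object stores keep a flat list of keys; slashes in
# the key mimic a hierarchical directory structure. Names are hypothetical.
blob_names = [
    "Tables/Common/CustTable/CUSTTABLE_00001.csv",
    "Tables/Common/CustTable/CUSTTABLE_00002.csv",
    "Tables/Common/VendTable/VENDTABLE_00001.csv",
]

def virtual_directories(names):
    """Collect the 'directories' implied by slash-separated key prefixes."""
    dirs = set()
    for name in names:
        parts = name.split("/")[:-1]          # drop the file name itself
        for i in range(1, len(parts) + 1):
            dirs.add("/".join(parts[:i]))
    return sorted(dirs)

print(virtual_directories(blob_names))
# ['Tables', 'Tables/Common', 'Tables/Common/CustTable', 'Tables/Common/VendTable']
```

With a hierarchical namespace enabled, these directories become real filesystem objects rather than just a naming convention, which is what makes directory-level operations efficient.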
The following illustration shows the Common Data Model folder structure from Finance and Operations apps.
Integration with Azure Data Lake for D365F&O is delivered as an add-in, installable from the Power Platform integration section in LCS.
The Export to Azure Data Lake feature is based on change data capture (CDC) and a microservice that exports Finance and Operations apps data to Azure Data Lake and keeps the data fresh.
It is currently only available in cloud-based Tier-2 or higher sandbox environments.
See more details below:
Before installing the feature, we need to configure a set of Azure resources that are required when we install the add-in in LCS.
Below are the required components that we need to set up.
I will not review these steps in detail, as they are well explained in the link below.
Follow the steps described in the link above to configure the Azure resources required to install the feature.
In order to install the feature add-in, we need to enable Power Platform integration for our F&O environment in LCS.
In a Tier-2 or higher sandbox environment, navigate to the environment page in LCS and locate the Power Platform integration section.
Use the Setup option to start the deployment of a Power Apps environment.
This process can take up to one hour to complete.
Once the process is completed, we will see the below:
As mentioned previously, the Data Lake feature is installed in the form of an add-in.
Select the option Install a new add-in.
Among the different add-ins available we will find Export to Data Lake; select this option.
The screen below will pop up, requesting the details we gathered while performing the prerequisite steps.
Fill in the required details and install
If we added all the required configurations, we should see the pop-up below.
Once the add-in is installed we will see it in the installed add-ins section
Enable the feature in D365F&O by following the instructions below.
If you can’t find the feature in the features list, or don’t see the option in the menu, try directly accessing it by appending the below to your environment’s URL:
In this form we can select tables to publish to the data lake, either directly by table or as tables that are part of an entity.
Select the tables to publish and use the Activate option.
Entities will change status to Initializing and then to Running or Error.
Once the entities are in Running status, we can check the storage account.
We will see the folder structure for the Common Data Model
and the CSV files containing the data.
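The exported CSV files typically arrive without header rows; the column metadata lives in the accompanying Common Data Model metadata (a model.json-style file). Here is a minimal sketch of stitching the two together — the file contents, entity name, and attribute names below are all hypothetical:

```python
import csv
import io
import json

# Hypothetical slice of CDM metadata (model.json-style) describing one entity.
model_json = json.loads("""
{
  "entities": [
    {
      "name": "CustTable",
      "attributes": [
        {"name": "ACCOUNTNUM"},
        {"name": "CUSTGROUP"},
        {"name": "CURRENCY"}
      ]
    }
  ]
}
""")

# Headerless CSV content as exported to the lake (sample rows, made up).
csv_data = "C0001,10,USD\\nC0002,20,EUR\\n".replace("\\n", "\n")

def read_entity(model, entity_name, text):
    """Pair headerless CSV rows with column names from the CDM metadata."""
    entity = next(e for e in model["entities"] if e["name"] == entity_name)
    columns = [a["name"] for a in entity["attributes"]]
    return [dict(zip(columns, row)) for row in csv.reader(io.StringIO(text))]

rows = read_entity(model_json, "CustTable", csv_data)
print(rows[0])  # {'ACCOUNTNUM': 'C0001', 'CUSTGROUP': '10', 'CURRENCY': 'USD'}
```

Tools that understand CDM folders (such as Power BI dataflows or Synapse) do this pairing for us; the sketch just shows what is happening under the hood.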
Once the data is published, we can leverage multiple tools to explore and analyze the data, from power-user tools like Power BI or Power Apps to Azure components like Synapse, Databricks, and others.
Below we will see the steps to access this data from Microsoft Power BI.
We can build Power BI reports directly consuming the data from Data Lake.
From Power BI Desktop, we can select Azure Data Lake Storage Gen2 as the data source.
Connect using URL as below:
The connection can be established using credentials or an Azure storage account key.
Once connected, we will see the available data depending on the subfolder provided.
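For reference, the ADLS Gen2 endpoint that the connector expects generally follows the pattern `https://<account>.dfs.core.windows.net/<container>/<path>`. A small helper to build it (the account and container names below are placeholders, not values from this walkthrough):

```python
def adls_gen2_url(account: str, container: str, path: str = "") -> str:
    """Build an ADLS Gen2 (DFS) endpoint URL from its parts."""
    url = f"https://{account}.dfs.core.windows.net/{container}"
    return f"{url}/{path.lstrip('/')}" if path else url

# Placeholder names; substitute your own storage account and container.
print(adls_gen2_url("mydatalake", "my-container", "Tables"))
# https://mydatalake.dfs.core.windows.net/my-container/Tables
```

Note the `dfs` (Data Lake) endpoint rather than the `blob` endpoint; the former is the one the hierarchical-namespace connector uses.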
Once we have added the data source, it will be saved as a query; we can edit the query and expand the binary content to get the data.
We can then apply filters, add column names, and make other query modifications.
Use the Close & Apply option to apply your changes to the query.
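Conceptually, the query-editor steps above (name the columns, filter rows) are just data transformations. A Python analogue, using made-up sample rows and hypothetical column names, for readers who want to script the same shaping outside Power BI:

```python
import csv
import io

# Made-up sample rows, mimicking a headerless CSV pulled from the lake.
raw = "C0001,10,USD\nC0002,20,EUR\nC0003,30,USD\n"

# Name the columns (the exported CSVs arrive without headers), then filter,
# mirroring the kind of steps applied in the Power BI query editor.
columns = ["ACCOUNTNUM", "CUSTGROUP", "CURRENCY"]
rows = [dict(zip(columns, r)) for r in csv.reader(io.StringIO(raw))]
usd_rows = [r for r in rows if r["CURRENCY"] == "USD"]

print([r["ACCOUNTNUM"] for r in usd_rows])  # ['C0001', 'C0003']
```

In Power BI these same steps are recorded as query steps, so they re-run automatically whenever the data in the lake is refreshed.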