
Setting up Dynamics 365 Export to Data Lake


Setting up Export to Data Lake in Dynamics 365 F&SCM consists of four procedures, described in the sections below: configuring the Azure resources, installing the Export to Data Lake add-in, integrating with Synapse, and connecting Jet to the Serverless SQL pool.

The References section contains references to the Microsoft documentation on the subjects in this procedure.

Configure the Azure Resources

In this procedure, you will create an application, an ADLS Gen2 storage account, and a key vault, and configure their settings.

  1. Open Azure Active Directory.
  2. If a Service Principal for Microsoft Dynamics ERP Microservices (0cdb527f-a8d1-4bf8-9436-b352c68682b2) is not installed yet, create one using the Microsoft documentation.
  3. Set up an application:
    1. Create an application.
    2. Copy the Application ID and save it for later.
    3. Select API Permissions.
    4. Add an API permission for Azure Key Vault:
      • Type: Delegated
      • Permission: user_impersonation
    5. Select Certificates and secrets.
    6. Add a new client secret, and save it for later.
  4. Create an ADLS Gen2 storage account. Ensure that Enable hierarchical namespace is selected on the Advanced tab.
  5. Grant the Azure Active Directory Application IAM Permissions to the Storage Account that was just created:
    1. Select Access Control (IAM).
    2. Grant the application that was created in step 3 Storage Blob Data Contributor and Storage Blob Data Reader permissions.
  6. Set up a Key Vault:
    1. Create a Key Vault.
    2. Select Overview. 
    3. Find the Vault URI and save it for later.
    4. Create the following secrets:
      • app-id: the id of the application created in step 3.1.
      • app-secret: the client secret that was created for the app in step 3.6.
      • storage-account-name: the name of the storage account created in step 4.
  7. Add an access policy to the Key Vault that grants the Microsoft Dynamics ERP Microservices service principal at least the following secret management operations:
    • Get
    • List
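
The portal steps above can also be scripted with the Azure CLI. The sketch below is a minimal outline, not a definitive implementation: every resource name (rg-d365-lake, d365lakesa, d365-lake-kv, d365-export-app) is a placeholder assumption to be replaced with your own values, and the delegated Key Vault API permission from step 3.4 is still added in the portal. The service principal GUID is the one given in step 2.

```shell
# Assumed placeholder names -- replace with your own values.
RG=rg-d365-lake
LOC=eastus
SA=d365lakesa
KV=d365-lake-kv
# Microsoft Dynamics ERP Microservices service principal (step 2).
ERP_SP=0cdb527f-a8d1-4bf8-9436-b352c68682b2

# Step 2: create the service principal if it does not exist yet.
az ad sp create --id $ERP_SP

# Step 3: register the application and create a client secret.
APP_ID=$(az ad app create --display-name d365-export-app --query appId -o tsv)
APP_SECRET=$(az ad app credential reset --id $APP_ID --query password -o tsv)

# Step 4: ADLS Gen2 storage account (hierarchical namespace enabled).
az storage account create -n $SA -g $RG -l $LOC --kind StorageV2 --hns true
SA_ID=$(az storage account show -n $SA -g $RG --query id -o tsv)

# Step 5: grant the application IAM permissions on the storage account.
az role assignment create --assignee $APP_ID --role "Storage Blob Data Contributor" --scope $SA_ID
az role assignment create --assignee $APP_ID --role "Storage Blob Data Reader" --scope $SA_ID

# Step 6: key vault and the three secrets.
az keyvault create -n $KV -g $RG -l $LOC
az keyvault secret set --vault-name $KV -n app-id --value $APP_ID
az keyvault secret set --vault-name $KV -n app-secret --value "$APP_SECRET"
az keyvault secret set --vault-name $KV -n storage-account-name --value $SA

# Step 7: allow the ERP Microservices service principal to read the secrets.
az keyvault set-policy -n $KV --spn $ERP_SP --secret-permissions get list
```

Note that the final command assumes the vault uses the access-policy permission model; if your vault uses Azure RBAC instead, grant the service principal an equivalent secrets-read role.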

Install the Export to Data Lake add-in

In this procedure, you will install the Export to Data Lake add-in into the Microsoft Dynamics 365 environment, and configure it to export to the ADLS Gen2 storage account.

  1. Open the Microsoft Dynamics 365 environment:
    1. Log into Lifecycle Services.
    2. Select the Microsoft Dynamics 365 environment.
    3. Click Full Details in the right-hand pane.
  2. Set up the Power Platform integration, if not done already.
  3. Install the Export to Data Lake add-in:
    1. Click Install a new add-in.
    2. Select Export to Data Lake. The Setup add-in dialog box opens.
    3. Select the following configuration values for the add-in:
      1. Tenant ID: your Azure Active Directory tenant ID
      2. Key Vault DNS: The Vault URI of the Key Vault (see step 6 of the previous section)
      3. Storage account secret: your storage-account-name
      4. Application ID secret: your app-id
      5. Application Secret secret: your app-secret
    4. Select Enable near real-time data. This is a prerequisite for integrating with Synapse in the next section.
    5. Click Install. The add-in is installed with your selected options.
  4. Verify that the add-in was configured successfully:
    1. Log into the Microsoft Dynamics 365 environment. 
    2. Search for Configure data feeds.
    3. Open the Export to Data Lake page.
    4. Find and select a sample table (e.g. CustTable) and click Activate. Once the Status column displays Running, the storage account should contain a folder for the corresponding table.
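
As an alternative to browsing the storage account in the portal, the folder created for an activated table can be spot-checked with the Azure CLI. The storage account name below is an assumption; the container name matches the one used by the event-subscription filters later in this article.

```shell
# List the top-level paths in the container the add-in exports to.
# d365lakesa is an assumed storage account name -- substitute your own.
az storage fs file list \
    --account-name d365lakesa \
    --file-system dynamics365-financeandoperations \
    --auth-mode login \
    --query "[].name" -o tsv
```

A folder named after the activated table (e.g. CustTable) should appear in the output once the export status is Running.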

Integrate with Synapse

In this procedure, you will deploy an Azure Function App to sync the data from the data lake into Synapse.  

  1. Create a Synapse workspace if one does not already exist. 
  2. If the Synapse workspace already existed, explicitly assign the Synapse managed identity the Storage Blob Data Contributor role on the storage account.
  3. Set up the Azure Function App:
    1. Grant the Synapse workspace the Storage Blob Data Reader role on the storage account.
    2. Clone the Microsoft Dynamics 365 FastTrack Implementation Assets GitHub repository in Visual Studio.
    3. Open the CommonDataModel solution, located at Analytics\CDMUtilSolution\Microsoft.CommonDataModel.sln in the repository.
    4. Add a NuGet Package Source in Visual Studio:
      1. Go to Tools > NuGet Package Manager > Package Manager Settings.
      2. Select Package Sources.
      3. Add the following source:
    5. Install .NET Core 3.1 Runtime.
    6. Rebuild the solution and verify that no build errors were reported.
  4. Publish the Azure Function App:
    1. Right-click on Clients > CDMUtil_AzureFunctions and click Publish.
    2. Select Azure as the Target and Azure Function App (Windows) as the Specific Target.
    3. Select the appropriate subscription.
    4. Click the green + button at the top right of the Function Apps dialog box to create a new Azure Function App.
    5. Click Publish.
    6. Select the same region and Storage Account that were used in previous steps.
    7. Click Create.
    8. Click Finish.
  5. Configure the Azure Function App:
    1. Enable the system-assigned managed identity for the Function App:
      1. Select Identity.
      2. Set the status to On.
    2. Configure the Application Settings of the Function App:
      1. Select Configuration.
      2. Add the following configuration values:
        • TenantId: The Azure Active Directory Tenant ID
        • SQLEndpoint: The Synapse Serverless SQL Pool connection string
        • Schema: cdc
        • DDLType: SynapseView (can also be omitted, as this is the default)
        • ManifestURL: https://<container name><environment>
        • DefaultStringLength: 4000
    3. Grant the Function App IAM Permissions to the storage account:
      1. Select Access Control (IAM).
      2. Grant the Function App the following permissions:
        • Storage Blob Data Contributor
        • Storage Blob Data Reader
    4. Add a new Event Subscription in the Storage Account with the following details:
      • Name: CdmIsJsonCreated
      • Event Schema: Event Grid Schema
      • System Topic: <container name>-system-topic
      • Event Type: Blob Created
      • Endpoint Type: Azure Function
      • Endpoint: EventGrid_CDMToSynapseView
    5. Update the filters:
      • Subject begins with: /blobServices/default/containers/dynamics365-financeandoperations/blobs/
      • Subject ends with: .cdm.json
      • Advanced filters:
        • data.url does not end with .manifest.cdm.json
        • data.url does not contain /resolved/
        • data.url does not contain /Tables/
    6. Configure Application Insights for the Function App using the Microsoft documentation.
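
Steps 5.1 through 5.5 above can also be scripted with the Azure CLI. This is a sketch under assumptions: the resource names (rg-d365-lake, cdmutil-func, d365lakesa) are placeholders, the SQLEndpoint value comes from your own Synapse workspace, and ManifestURL is left out here and should be set as described in step 5.2.

```shell
RG=rg-d365-lake      # assumed resource group
FUNC=cdmutil-func    # assumed Function App name
SA=d365lakesa        # assumed storage account name

# Step 5.1: enable the system-assigned managed identity.
FUNC_MI=$(az functionapp identity assign -n $FUNC -g $RG --query principalId -o tsv)

# Step 5.2: application settings (SQLEndpoint is your own serverless endpoint).
az functionapp config appsettings set -n $FUNC -g $RG --settings \
    TenantId=$(az account show --query tenantId -o tsv) \
    SQLEndpoint="<your Synapse serverless SQL connection string>" \
    Schema=cdc \
    DDLType=SynapseView \
    DefaultStringLength=4000

# Step 5.3: IAM permissions on the storage account.
SA_ID=$(az storage account show -n $SA -g $RG --query id -o tsv)
az role assignment create --assignee $FUNC_MI --role "Storage Blob Data Contributor" --scope $SA_ID
az role assignment create --assignee $FUNC_MI --role "Storage Blob Data Reader" --scope $SA_ID

# Steps 5.4-5.5: Event Grid subscription on Blob Created with the filters above.
FUNC_ID=$(az functionapp show -n $FUNC -g $RG --query id -o tsv)
az eventgrid event-subscription create \
    --name CdmIsJsonCreated \
    --source-resource-id $SA_ID \
    --included-event-types Microsoft.Storage.BlobCreated \
    --endpoint-type azurefunction \
    --endpoint "$FUNC_ID/functions/EventGrid_CDMToSynapseView" \
    --subject-begins-with /blobServices/default/containers/dynamics365-financeandoperations/blobs/ \
    --subject-ends-with .cdm.json \
    --advanced-filter data.url StringNotEndsWith .manifest.cdm.json \
    --advanced-filter data.url StringNotContains /resolved/ /Tables/
```

Creating the subscription with --source-resource-id pointed at the storage account lets Azure create the system topic for you; if you manage the system topic explicitly, use the name given in step 5.4 instead.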

Use Serverless Pool as a Source to read data

In this procedure, you will connect Jet to the data source.

  1. Open the Jet D365 project.
  2. Connect to Serverless Pool to Read Data:
    1. Under Business units, open the data source for Serverless.
    2. Right-click on the data source and select Edit SQL Server data source. The Edit SQL Server Data Source dialog box opens.
    3. Enter the Server name, the authentication credentials for the Serverless pool, and select the appropriate Database name.
    4. Click Test Connection to verify that you can connect to the data source successfully.
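
Once the connection test succeeds, the views created by the Function App can also be spot-checked directly against the serverless pool. The workspace name, database, credentials, and table below are assumed example values; the cdc schema comes from the Function App settings in the previous section.

```shell
# Query a synced table through the Synapse serverless SQL endpoint.
# myworkspace, d365lake, sqladmin, and CustTable are assumed example values.
sqlcmd -S myworkspace-ondemand.sql.azuresynapse.net \
       -d d365lake -U sqladmin -P '<password>' \
       -Q "SELECT TOP 10 * FROM cdc.CustTable;"
```

If the query returns rows, Jet reports built on this data source will read the same views.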

