ShahrukhHB
Frequent Visitor

Setup Lakehouse with Business Central

Hi Fabric Community,

We recently moved to Business Central SaaS; earlier we were using on-prem. I am now planning to build a Lakehouse for Business Central.

I am currently using the Business Central web service (API) to get data into a dataflow, but when I set up incremental refresh in the dataflow, it won't allow Lakehouse as the destination; it only offers Warehouse or SQL, with no Lakehouse option.
What would be the best approach to load data into a Lakehouse, or is there a better option than a Lakehouse?
How can I implement incremental refresh with the APIs?


1 ACCEPTED SOLUTION
v-ssriganesh
Community Support

Hi @ShahrukhHB,
Thank you for reaching out to the Microsoft Fabric community forum, and thanks to @lbendlin for the excellent suggestion on shortcuts and mirroring.

Currently, Dataflows Gen2 in Fabric do not support Lakehouse as a destination when incremental refresh is enabled. The supported destinations for incremental refresh are Fabric Warehouse, Azure SQL Database, and Azure Synapse Analytics.

While Lakehouse is not a direct destination for incremental refresh, you can still use it in combination with incremental refresh by:

  • Staging the incrementally refreshed data in a supported destination.
  • Using a second query to reference the staged data and update the Lakehouse.

This method lets you benefit from incremental refresh to reduce the processing load on the source system. Note, however, that a full refresh is required when loading from the staged data into the Lakehouse.

For your reference: https://learn.microsoft.com/en-us/fabric/data-factory/dataflow-gen2-incremental-refresh#destination-...
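
To illustrate the second hop, here is a minimal notebook sketch (an alternative way to automate the copy, not the second-dataflow-query approach itself). The warehouse and table names are placeholders, and it assumes the Fabric Spark connector's synapsesql reader is available in your runtime:

```python
# Minimal sketch of the second hop: full load from the incrementally refreshed
# staging table in the Warehouse into a Lakehouse Delta table. All names are
# placeholders; `spark` is predefined in Fabric notebooks.
import com.microsoft.spark.fabric  # registers the synapsesql reader in Fabric runtimes

# Full read of the staging table that incremental refresh keeps up to date.
staged = spark.read.synapsesql("StagingWH.dbo.BC_SalesInvoices")

# Full overwrite of the Lakehouse table (matching the full-refresh note above).
staged.write.mode("overwrite").format("delta").saveAsTable("bc_salesinvoices")
```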

Alternatively, you can use Data Factory pipelines in Fabric to extract data from Business Central, store it in a staging Delta table in the Lakehouse, and apply the incremental logic manually.
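
To make that concrete, here is a rough sketch of the manual incremental step, for example in a notebook activity called from the pipeline. The salesInvoices entity, the placeholder IDs, and the token handling are illustrative assumptions; Business Central's standard v2.0 APIs expose a lastModifiedDateTime field you can filter on:

```python
# Rough sketch of "manually apply incremental logic" in a Fabric notebook.
# Adapt the entity, placeholder IDs, and auth to your tenant and tables.
import requests
from delta.tables import DeltaTable

BASE = ("https://api.businesscentral.dynamics.com/v2.0/"
        "<tenant-id>/<environment>/api/v2.0/companies(<company-id>)")
HEADERS = {"Authorization": "Bearer <token from your Entra ID app registration>"}

# 1. High-water mark from the existing Lakehouse table
#    (assumes an initial full load has already seeded it).
hwm = (spark.table("bc_salesinvoices")
            .agg({"lastModifiedDateTime": "max"})
            .collect()[0][0])

# 2. Pull only rows changed since the last run, following OData paging.
#    BC returns datetimes in UTC, hence the trailing "Z".
url = f"{BASE}/salesInvoices"
params = {"$filter": f"lastModifiedDateTime gt {hwm.isoformat()}Z"}
rows = []
while url:
    resp = requests.get(url, params=params, headers=HEADERS)
    resp.raise_for_status()
    body = resp.json()
    rows += body.get("value", [])
    url, params = body.get("@odata.nextLink"), None  # nextLink carries the filter

# 3. Upsert the changes into the Lakehouse Delta table, keyed on the BC record id.
if rows:
    updates = spark.createDataFrame(rows)
    (DeltaTable.forName(spark, "bc_salesinvoices").alias("t")
        .merge(updates.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())
```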

If this helps, please accept it as a solution and drop a "Kudos" so other members can find it more easily.
Thank you.


5 REPLIES
v-ssriganesh
Community Support

Hi @ShahrukhHB,

May I ask if you have resolved this issue? If so, please mark the answer as the solution; this will help other community members with similar problems find it faster.

Thank you.

Kieran
Advocate II

When I looked at this challenge myself recently, I discovered that Fabric Dataflows Gen2 do not yet support Business Central as a data source for incremental load. So I agree with v-ssriganesh that you should consider data pipelines for incremental load. However, connecting data pipelines to Business Central is technically more demanding to set up. If your data sets are relatively small and/or you can negotiate an overnight data refresh, then a daily full load might be sufficient.

olivs
Most Valuable Professional

Hi,

I have been wondering about the same thing, and I have done some research on this topic.

From my point of view, the "easiest" solution at the moment is to link your Business Central SaaS environment to Dataverse, and then connect to Dataverse from Microsoft Fabric.

If you would like more information about this:

Connect to Microsoft Dataverse - Business Central | Microsoft Learn

Link your Dataverse environment to Microsoft Fabric and unlock deep insights - Power Apps | Microsof...

I hope this helps, or at least gives you an alternative way to work with your Dynamics 365 Business Central data in Microsoft Fabric.


lbendlin
Super User

First investigate if shortcuts or mirroring are an option.

Referencing data to a Lakehouse using shortcuts - Microsoft Fabric | Microsoft Learn
