
Kuladeep
Frequent Visitor

How to create a dynamic Lakehouse connection in a Copy activity

Hello Fabric Community,

I am implementing a metadata-driven pipeline and would like to copy ZIP files from an on-premises folder to multiple lakehouses. I plan to execute this pipeline via REST using a Service Principal (SP).
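For context, the pipeline is triggered roughly as below. This is a minimal sketch using the Fabric on-demand job API with a client-credentials token; the endpoint shape and the parameter names passed in executionData are assumptions to be checked against your own setup:

```python
import json
import urllib.parse
import urllib.request

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def token_request(tenant_id: str, client_id: str, client_secret: str) -> urllib.request.Request:
    # Client-credentials flow against Entra ID for the Fabric API scope.
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": "https://api.fabric.microsoft.com/.default",
    }).encode()
    return urllib.request.Request(
        f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token",
        data=body, method="POST")

def run_request(workspace_id: str, pipeline_id: str, token: str,
                parameters: dict) -> urllib.request.Request:
    # On-demand run of a pipeline item via the Fabric job-instances API.
    url = (f"{FABRIC_API}/workspaces/{workspace_id}"
           f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")
    payload = json.dumps({"executionData": {"parameters": parameters}}).encode()
    return urllib.request.Request(
        url, data=payload, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

# Usage (submits the run; IDs are placeholders):
# req = run_request(ws_id, pipeline_id, access_token,
#                   {"WorkspaceId": target_ws, "LakehouseId": target_lh})
# urllib.request.urlopen(req)
```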

To achieve this, I created a gateway for the on-premises folder, granted the SP access to it, and assigned the SP a Contributor role on the target lakehouses. In the Copy activity, I configured the gateway as the source connection. For the destination, I attempted to dynamically connect to the target lakehouses, as shown below.

[Screenshots: Copy activity destination configuration]

Under Destination, I chose '(none)' --> supplied the lakehouse ID as dynamic content --> supplied the workspace ID as dynamic content --> set the folder path to 'temp' --> set the file name to '*'.
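For reference, that destination setup roughly corresponds to the following fragment of the pipeline's JSON definition. The property names (workspaceId, artifactId, rootFolder) and the parameter names here are assumptions based on exported Fabric pipeline JSON; verify them against the JSON view of your own pipeline:

```json
{
  "sink": {
    "type": "BinarySink",
    "storeSettings": { "type": "LakehouseWriteSettings" },
    "datasetSettings": {
      "type": "Binary",
      "typeProperties": {
        "location": { "type": "LakehouseLocation", "folderPath": "temp" }
      },
      "linkedService": {
        "name": "TargetLakehouse",
        "properties": {
          "type": "Lakehouse",
          "typeProperties": {
            "workspaceId": {
              "value": "@pipeline().parameters.WorkspaceId",
              "type": "Expression"
            },
            "artifactId": {
              "value": "@pipeline().parameters.LakehouseId",
              "type": "Expression"
            },
            "rootFolder": "Files"
          }
        }
      }
    }
  }
}
```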

 

However, this approach is not working, and I am encountering a “Forbidden” error. Could you please advise how to load data into multiple lakehouses using a single pipeline?
Error:

ErrorCode=LakehouseForbiddenError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Lakehouse failed for forbidden which may be caused by user account or service principal doesn't have enough permission to access Lakehouse. Workspace: '94da8e6f-2aff-4cc1-b335-56800fe44179'. Path: 'd3c181f6-cb5f-4003-94af-2a8b9be76b2c/Files/temp/2022 neu.zip'. ErrorCode: 'Forbidden'. Message: 'Forbidden'. TimeStamp: 'Wed, 18 Mar 2026 18:44:45 GMT'..,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Operation returned an invalid status code 'Forbidden',Source=,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'Forbidden',Source=Microsoft.DataTransfer.ClientLibrary,'

Any help would be greatly appreciated.

v-prasare
Community Support

Hi @Kuladeep,

We are following up once again regarding your query. Could you please confirm whether the issue has been resolved through the support ticket with Microsoft?
If it has, we kindly request that you share the resolution or key insights here to help others in the community. Should you need further assistance in the future, please reach out via the Microsoft Fabric Community Forum and create a new thread. We'll be happy to help.

Thank you for your understanding and participation.

 

v-prasare
Community Support

If this issue is blocking you in this scenario, I suggest you raise a support ticket so that the support team can assist you in addressing the issue you are facing. Please follow the link below on how to raise a support ticket:

How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn

v-prasare
Community Support

Hi @Kuladeep,

This behavior is expected with the current Microsoft Fabric security model. While the Copy activity supports using dynamic workspaceId and lakehouseId, permissions are enforced at runtime against the resolved target workspace. If the Service Principal doesn’t have Contributor access on the target workspace, the write operation fails with LakehouseForbiddenError (403).

Sharing a Lakehouse item is not sufficient for write operations—Lakehouse sharing only grants read access. Writing files via Copy activity requires workspace‑level permissions on each target Lakehouse’s workspace.
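Granting those workspace-level permissions can be scripted with the Fabric workspace role-assignment API. A minimal sketch, assuming the caller's token has permission to manage role assignments and that sp_object_id is the service principal's Entra object ID:

```python
import json
import urllib.request

def add_contributor_request(workspace_id: str, sp_object_id: str,
                            token: str) -> urllib.request.Request:
    # Fabric "Add Workspace Role Assignment" call: grants the service
    # principal Contributor on one workspace. Repeat per target workspace.
    payload = json.dumps({
        "principal": {"id": sp_object_id, "type": "ServicePrincipal"},
        "role": "Contributor",
    }).encode()
    return urllib.request.Request(
        f"https://api.fabric.microsoft.com/v1/workspaces/{workspace_id}/roleAssignments",
        data=payload, method="POST",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"})

# Usage (IDs are placeholders):
# for ws in target_workspace_ids:
#     urllib.request.urlopen(add_contributor_request(ws, sp_object_id, token))
```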

 

Possible workarounds:

  • Add the Service Principal as Contributor to every target workspace that the pipeline writes to.
  • Use a central landing Lakehouse (where the Service Principal has access) and distribute data downstream using Fabric pipelines or notebooks.
  • Create separate pipelines per workspace with fixed Lakehouse connections to avoid dynamic permission evaluation.

This is by design, not a bug. Broader dynamic cross-workspace ingestion without pre-granting permissions is not currently supported; the Service Principal must be granted access to each target workspace in advance.

 


Thanks,

Prashanth

Thanks for your reply.

The service principal (SP) already has Contributor access to the target workspace and lakehouse, as mentioned in the description. However, the process is still failing with a 403 error.

As a workaround, I have implemented a solution using a landing lakehouse and a connected notebook to move files to the target lakehouse.
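That landing-lakehouse pattern can address targets by GUID over OneLake ABFS paths, so one parameterized notebook fans out to any lakehouse. A sketch; the path-building helper runs anywhere, while the commented copy call assumes the notebookutils file-system utilities available inside a Fabric notebook, and the variable names are placeholders:

```python
ONELAKE = "onelake.dfs.fabric.microsoft.com"

def lakehouse_files_path(workspace_id: str, lakehouse_id: str,
                         sub_path: str = "") -> str:
    # ABFS path to a Lakehouse "Files" folder, addressed by GUIDs so the
    # same notebook can write to any target lakehouse via parameters.
    base = f"abfss://{workspace_id}@{ONELAKE}/{lakehouse_id}/Files"
    return f"{base}/{sub_path.strip('/')}" if sub_path else base

# Inside a Fabric notebook (notebookutils exists there, not locally):
# src = lakehouse_files_path(landing_ws_id, landing_lh_id, "temp")
# dst = lakehouse_files_path(target_ws_id, target_lh_id, "temp")
# notebookutils.fs.cp(src, dst, True)  # recursive copy between lakehouses
```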

Nevertheless, I would like to understand why I am unable to use a dynamic connection in the copy activity when triggering it via the SP. Why is it failing despite having all the required permissions?

