Hello Fabric Community,
I am implementing a metadata-driven pipeline and would like to copy ZIP files from an on-premises folder to multiple lakehouses. I plan to execute this pipeline via REST using a Service Principal (SP).
To achieve this, I created a gateway for the on-premises folder, granted the SP access to it, and assigned the SP a Contributor role on the target lakehouses. In the Copy activity, I configured the gateway as the source connection. For the destination, I attempted to dynamically connect to the target lakehouses, as shown below.
Under Destination, I chose '(none)', set the Lakehouse ID and Workspace ID as dynamic content, set the folder path to 'temp', and set the file name to '*'.
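For context, the pipeline is triggered on demand roughly like this. This is a hedged sketch: the endpoint and payload shape follow the Fabric REST API's on-demand item job call, but the parameter names (`targetWorkspaceId`, `targetLakehouseId`) are illustrative placeholders that would need to match the pipeline's own parameters.

```python
# Sketch: triggering the pipeline on demand as a service principal.
# Endpoint shape per the Fabric REST API ("run on demand item job");
# parameter names below are placeholders, not the real pipeline's.
import json

FABRIC_API = "https://api.fabric.microsoft.com/v1"

def job_instances_url(workspace_id: str, pipeline_id: str) -> str:
    """URL for POSTing an on-demand pipeline run."""
    return (f"{FABRIC_API}/workspaces/{workspace_id}"
            f"/items/{pipeline_id}/jobs/instances?jobType=Pipeline")

def run_payload(target_workspace_id: str, target_lakehouse_id: str) -> str:
    """Pipeline parameters consumed by the dynamic destination
    (names are illustrative -- match them to your pipeline)."""
    return json.dumps({
        "executionData": {
            "parameters": {
                "targetWorkspaceId": target_workspace_id,
                "targetLakehouseId": target_lakehouse_id,
            }
        }
    })

# POST job_instances_url(...) with an SP bearer token acquired for the
# scope https://api.fabric.microsoft.com/.default, body = run_payload(...).
```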
However, this approach is not working, and I am encountering a “Forbidden” error. Could you please advise how to load data into multiple lakehouses using a single pipeline?
error:
ErrorCode=LakehouseForbiddenError,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Lakehouse failed for forbidden which may be caused by user account or service principal doesn't have enough permission to access Lakehouse. Workspace: '94da8e6f-2aff-4cc1-b335-56800fe44179'. Path: 'd3c181f6-cb5f-4003-94af-2a8b9be76b2c/Files/temp/2022 neu.zip'. ErrorCode: 'Forbidden'. Message: 'Forbidden'. TimeStamp: 'Wed, 18 Mar 2026 18:44:45 GMT'..,Source=Microsoft.DataTransfer.ClientLibrary,''Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Operation returned an invalid status code 'Forbidden',Source=,''Type=Microsoft.Azure.Storage.Data.Models.ErrorSchemaException,Message=Operation returned an invalid status code 'Forbidden',Source=Microsoft.DataTransfer.ClientLibrary,'
Any help would be greatly appreciated.
Hi @Kuladeep,
We are following up once again regarding your query. Could you please confirm if the issue has been resolved through the support ticket with Microsoft?
If the issue has been resolved, we kindly request you to share the resolution or key insights here to help others in the community. Should you need further assistance in the future, we encourage you to reach out via the Microsoft Fabric Community Forum and create a new thread. We’ll be happy to help.
Thank you for your understanding and participation.
If this issue is blocking you in this scenario, I suggest raising a support ticket so that Microsoft support can assist you in addressing it. Please follow the link below on how to raise a support ticket:
How to create a Fabric and Power BI Support ticket - Power BI | Microsoft Learn
Hi @Kuladeep,
This behavior is expected with the current Microsoft Fabric security model. While the Copy activity supports using dynamic workspaceId and lakehouseId, permissions are enforced at runtime against the resolved target workspace. If the Service Principal doesn’t have Contributor access on the target workspace, the write operation fails with LakehouseForbiddenError (403).
Sharing a Lakehouse item is not sufficient for write operations—Lakehouse sharing only grants read access. Writing files via Copy activity requires workspace‑level permissions on each target Lakehouse’s workspace.
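If the SP is missing a role on any target workspace, it can be granted programmatically. This is a sketch only, assuming the Fabric workspace role-assignment REST endpoint; the IDs are placeholders, and note that the principal id must be the SP's Entra object id, not its application (client) id.

```python
# Sketch: granting the SP a workspace role on each target workspace,
# assuming the Fabric "add workspace role assignment" REST endpoint.
import json

def role_assignment_request(workspace_id: str, sp_object_id: str,
                            role: str = "Contributor") -> tuple:
    """Build (url, body) for POSTing a role assignment for a service
    principal. sp_object_id is the SP's Entra object id (placeholder)."""
    url = (f"https://api.fabric.microsoft.com/v1"
           f"/workspaces/{workspace_id}/roleAssignments")
    body = json.dumps({
        "principal": {"id": sp_object_id, "type": "ServicePrincipal"},
        "role": role,
    })
    return url, body

# POST this url with an SP/admin bearer token for each target workspace
# listed in the metadata table, before the pipeline run resolves it.
```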
As a workaround, you can stage the files in a single landing Lakehouse that the SP can already write to, then distribute them to the target Lakehouses with a notebook.

This is by design, not a bug. If broader dynamic cross‑workspace ingestion without pre‑granting permissions is required, consider submitting a feature request on Microsoft Fabric Ideas.
Thanks,
Prashanth
Thanks for your reply.
The service principal (SP) already has Contributor access to the target workspace and lakehouse, as mentioned in the description. However, the process is still failing with a 403 error.
As a workaround, I have implemented a solution using a landing lakehouse and a connected notebook to move files to the target lakehouse.
Nevertheless, I would like to understand why I am unable to use a dynamic connection in the copy activity when triggering it via the SP. Why is it failing despite having all the required permissions?
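The landing-Lakehouse workaround mentioned above can be sketched roughly as follows. This assumes the standard OneLake abfss path convention; `notebookutils.fs.cp` is only available inside a Fabric notebook, so it is shown here in comments, and all IDs are placeholders.

```python
# Sketch of the landing-Lakehouse workaround: a Fabric notebook copies
# staged files from the landing Lakehouse to each target Lakehouse over
# OneLake. Path format follows the OneLake abfss convention.

def onelake_path(workspace_id: str, lakehouse_id: str, subpath: str) -> str:
    """abfss URI for a path under a Lakehouse's Files area."""
    return (f"abfss://{workspace_id}@onelake.dfs.fabric.microsoft.com/"
            f"{lakehouse_id}/Files/{subpath}")

# Inside the notebook (IDs come from the metadata table):
#   src = onelake_path(landing_ws_id, landing_lh_id, "temp")
#   dst = onelake_path(target_ws_id, target_lh_id, "temp")
#   notebookutils.fs.cp(src, dst, True)   # recursive copy
```

The notebook runs under the executing identity's workspace permissions, which is why this path succeeds where the dynamic Copy activity destination does not.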