Hi
I want to stream data into a KQL database using a Streaming Spark Job. My initial plan was to retrieve a service principal secret from a key vault using:
notebookutils.credentials.getSecret()
but this doesn't appear to be supported. It doesn't look like workspace managed identities are supported for KQL authentication yet either.
Is there any other way to do this?
Hi @going_grey ,
Thanks for reaching out to Microsoft Fabric Community.
From your scenario, using a Service Principal for KQL authentication is valid. The limitation is in Spark execution, where streaming or scheduled jobs do not have an interactive user context and currently do not support Service Principal or workspace identity for secret retrieval or KQL authentication. As a result, even if the secret is stored correctly in Key Vault, it cannot be used in a fully supported way during Spark streaming execution.
As a workaround, you can authenticate to Key Vault programmatically using the Azure SDK (for example via azure-identity and Key Vault client libraries) and retrieve the secret using a Service Principal instead of relying on notebookutils.credentials.getSecret(). This works in Spark jobs, but you will still need to securely pass and manage the client secret.
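The SDK approach above can be sketched as follows. This is a minimal illustration, not Microsoft-documented Fabric code: the vault name, secret name, and service principal IDs are placeholders, and the SDK imports are deferred into the function so the URL helper works even without the Azure packages installed.

```python
# Sketch: retrieving a Key Vault secret via azure-identity and
# azure-keyvault-secrets instead of notebookutils.credentials.getSecret().
# All names below are placeholders, not values from this thread.


def build_vault_url(vault_name: str) -> str:
    """Build the public-cloud Key Vault endpoint for a vault name."""
    return f"https://{vault_name}.vault.azure.net"


def fetch_secret(vault_name: str, secret_name: str,
                 tenant_id: str, client_id: str, client_secret: str) -> str:
    """Authenticate as a service principal and read one secret.

    Note: the service principal's own client_secret must still be supplied
    securely to the Spark job, which is the residual limitation discussed
    later in this thread.
    """
    # Imported here so the pure helper above is usable without the SDK.
    from azure.identity import ClientSecretCredential
    from azure.keyvault.secrets import SecretClient

    credential = ClientSecretCredential(tenant_id, client_id, client_secret)
    client = SecretClient(vault_url=build_vault_url(vault_name),
                          credential=credential)
    return client.get_secret(secret_name).value
```

The service principal used here needs the Key Vault Secrets User role (or an equivalent access policy) on the vault.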
Reference: Access Azure Key Vault secrets from MS Fabric Notebooks
For production scenarios, you may want to consider handling authentication and ingestion outside Spark, such as using Fabric Data Pipeline or Eventstream.
For reference:
NotebookUtils (former MSSparkUtils) for Fabric - Microsoft Fabric | Microsoft Learn
Also, you can explore Fabric Connections in notebooks to define connections using Service Principal or workspace identity.
Hope this helps. Please reach out for further assistance.
Thank you.
@v-veshwara-msft wrote:As a workaround, you can authenticate to Key Vault programmatically using the Azure SDK (for example via azure-identity and Key Vault client libraries) and retrieve the secret using a Service Principal instead of relying on notebookutils.credentials.getSecret(). This works in Spark jobs, but you will still need to securely pass and manage the client secret.
If you use the Azure SDK with a service principal to retrieve "the secret you actually need", then you'd still need to provide a secret to establish the key vault connection (which also needs to be secured).
Hi @going_grey ,
Thanks for pointing that out, you’re correct.
Even with the SDK approach, a credential is still required to authenticate to Key Vault, so it does not remove the need to securely store and pass a secret.
Because of this, there does not appear to be a fully secure or secretless pattern available today for this scenario in Spark streaming with KQL; this is a current limitation rather than a configuration issue.
For production scenarios, approaches where authentication is handled outside Spark (for example using Fabric Data Pipeline or Eventstream) tend to be more reliable with the current capabilities.
Hope this helps. Please reach out for further assistance.
Thank you.
Hello @going_grey
You’re correct that notebookutils.credentials.getSecret(vault_url, secret_name) is the supported way to read a secret from Azure Key Vault in an interactive Fabric notebook. However, it’s important to separate Key Vault access from KQL authentication support.
When you call getSecret, the identity used depends on how the notebook is executed (interactively as the signed-in user, or as a scheduled/streaming job). In either case, that identity must have the Key Vault Secrets User role on the vault.
However, even if you successfully retrieve the service principal secret, this does not currently unblock streaming into a KQL database: Spark streaming jobs run without an interactive user context, and KQL does not yet accept Workspace Managed Identity or Fabric-native identities for Spark ingestion. As a result, using Key Vault to retrieve a service principal secret from a Spark streaming job does not provide a fully supported authentication path into KQL today. Until that support arrives, this remains a platform limitation rather than a Key Vault issue.
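For completeness, once a service principal secret is in hand, a Spark structured-streaming write to a KQL (Kusto) database with the open-source azure-kusto-spark connector is typically configured as below. This is a sketch under assumptions: the cluster, database, table, and identity values are placeholders, and the option names are from that connector, not Fabric-specific APIs.

```python
# Sketch: assembling the option map for the azure-kusto-spark sink.
# All values passed in are placeholders supplied by the caller.


def kusto_sink_options(cluster: str, database: str, table: str,
                       app_id: str, app_secret: str, tenant_id: str) -> dict:
    """Build the service-principal option map expected by the Kusto Spark sink."""
    return {
        "kustoCluster": cluster,
        "kustoDatabase": database,
        "kustoTable": table,
        "kustoAadAppId": app_id,
        "kustoAadAppSecret": app_secret,
        "kustoAadAuthorityID": tenant_id,
    }


# Usage inside a Spark streaming job (not executed here):
# query = (events_df.writeStream
#          .format("com.microsoft.kusto.spark.datasink.KustoSinkProvider")
#          .options(**kusto_sink_options("mycluster.region", "MyDb", "Events",
#                                        app_id, app_secret, tenant_id))
#          .option("checkpointLocation", "/tmp/checkpoints/kusto")
#          .start())
```

Note that kustoAadAppSecret is exactly the secret whose secure delivery this thread identifies as the unsolved part.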