Ira_27
Helper I

Architecture question and brainstorming

Hello Community,

 

I have a design question that I need help with. We have a lot of vendor data that comes in the form of CSV, XML, and XLSX files via SFTP. I already have ADF pipelines that copy these files into ADLS Gen2. When implementing in Fabric, I used blob storage events to trigger a notebook that loads these files into a Lakehouse. I want suggestions on whether it's a good idea to use Real-Time Intelligence to stream data from these files and stage it into a Lakehouse. Has anyone implemented this? If so, I would love to get guidance. I am fairly new to Eventstream and need more details on CU consumption, cost, etc. What are the disadvantages of using a streaming notebook?

1 ACCEPTED SOLUTION
v-veshwara-msft
Community Support

Hi @Ira_27 ,

Thanks for raising this in Microsoft Fabric Community.

Eventstream is primarily intended for real-time ingestion from sources like Event Hub, IoT Hub, or Kafka. In your case, since vendor data arrives as batch files via SFTP and is already being copied into ADLS, using Eventstream directly on those files is not the typical approach. This is because Eventstream operates on streaming records rather than directly reading and processing files from storage.

Microsoft Fabric Eventstreams Overview - Microsoft Fabric | Microsoft Learn

 

The blob-triggered notebook pattern you mentioned is a good fit for this scenario, as it processes files as they arrive and loads them into the Lakehouse efficiently. This aligns well with common batch or micro-batch ingestion patterns.
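A minimal sketch of that event-triggered batch pattern (all names and paths here are illustrative, not from the original post): a notebook activated by a blob-created event receives the file path and appends the contents to a staging Delta table.

```python
import os

def target_table_for(file_path: str) -> str:
    """Derive a hypothetical staging table name from the dropped file's name."""
    stem = os.path.splitext(os.path.basename(file_path))[0]
    return "staging_" + stem.lower().replace("-", "_")

def load_file(spark, file_path: str) -> None:
    """Load one newly arrived vendor file into a Lakehouse Delta table.

    Assumes a Fabric/Spark runtime. XML support requires the spark-xml
    package, and the rowTag value is vendor-specific.
    """
    ext = os.path.splitext(file_path)[1].lower()
    if ext == ".csv":
        df = spark.read.option("header", True).csv(file_path)
    elif ext == ".xml":
        df = spark.read.format("xml").option("rowTag", "record").load(file_path)
    elif ext == ".xlsx":
        # Spark has no native XLSX reader; read via pandas and convert.
        import pandas as pd
        df = spark.createDataFrame(pd.read_excel(file_path))
    else:
        raise ValueError(f"unsupported file type: {ext}")
    df.write.mode("append").format("delta").saveAsTable(target_table_for(file_path))
```

The notebook only consumes compute while a file is actually being processed, which is what keeps this pattern cheaper than a continuously running stream.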

 

If you want to explore Real-Time Intelligence, you would first need to convert those files into events, for example by pushing rows into Event Hub or another streaming source, and then let Eventstream process them. This adds additional complexity and overhead without a clear benefit for file-based ingestion.
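To make that extra step concrete, here is a sketch of turning CSV rows into individual event payloads. The conversion itself is stdlib-only; the actual send to Event Hub (shown in comments) would use the azure-eventhub SDK, and all names are placeholders.

```python
import csv
import io
import json

def rows_to_events(csv_text: str) -> list[str]:
    """Convert CSV rows to JSON event payloads, one event per row."""
    return [json.dumps(row) for row in csv.DictReader(io.StringIO(csv_text))]

# In a real pipeline each payload would be wrapped in EventData and sent
# with azure-eventhub's EventHubProducerClient, roughly:
#   producer = EventHubProducerClient.from_connection_string(conn_str, eventhub_name=hub)
#   with producer:
#       batch = producer.create_batch()
#       for payload in rows_to_events(text):
#           batch.add(EventData(payload))
#       producer.send_batch(batch)

sample = "vendor,amount\nAcme,120\nGlobex,75\n"
events = rows_to_events(sample)
```

Note that this replays the whole file as a stream of records, so the file-to-event conversion itself becomes one more component to build, monitor, and pay for.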

 

From a cost perspective, Eventstream typically runs continuously for streaming workloads and consumes capacity based on runtime and data throughput. Similarly, streaming notebooks remain active and consume compute even when data volume is low, which can make them more expensive and harder to manage compared to batch processing.
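For contrast, a streaming notebook would look roughly like the sketch below (assuming a Fabric Spark runtime; paths, schema, and table names are illustrative). The returned query keeps polling the landing folder, and therefore consuming compute, until it is explicitly stopped.

```python
def start_streaming_load(landing_path: str, table: str):
    """Start a Structured Streaming query that continuously loads CSVs.

    Illustrative only: the import is deferred because this is meant to run
    inside a Fabric notebook, where a Spark session already exists.
    """
    from pyspark.sql import SparkSession
    spark = SparkSession.builder.getOrCreate()
    stream = (spark.readStream
              .option("header", True)
              .schema("vendor STRING, amount DOUBLE")  # streaming reads need a fixed schema
              .csv(landing_path))
    # The query runs until .stop() is called: it holds the session open
    # 24/7 and polls for new files even when none arrive, which is where
    # the continuous CU consumption comes from.
    return (stream.writeStream
            .format("delta")
            .option("checkpointLocation", landing_path + "/_checkpoint")
            .toTable(table))
```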

Microsoft Fabric Eventstreams Capacity Consumption - Microsoft Fabric | Microsoft Learn

Understanding CU consumption of streaming systems in Microsoft Fabric

 

For vendor data that is inherently batch-oriented, an event-driven batch approach (like your current design) is usually simpler, more cost-effective, and easier to maintain. Eventstream is better suited for scenarios where data is naturally generated as real-time events and requires low-latency processing.

 

Hope this helps. Please reach out for further assistance.
Thank you.

View solution in original post

4 REPLIES
tayloramy
Super User

Hi @Ira_27

 

I would not recommend using RTI for anything that isn't real time. While you probably could use RTI for batch-based loads, it would be highly inefficient, as the RTI processes would be running 24/7 looking for data.

If you found this helpful, consider giving some Kudos.
If I answered your question or solved your problem, mark this post as the solution!

Proud to be a Super User!

Thank you for your response. I am aligned with your suggestion but still wanted to get confirmation from the community. Appreciate your help.


Appreciate your detailed response, and thank you for confirming my current architecture.
