Getting Data In

HTTP Event Collector vs. monitor directory

signumpl
Engager

I have a Splunk Universal Forwarder installed on a Raspberry Pi, along with a couple of apps whose logs I want to send to the forwarder. What is the best and most efficient way to do this? I was thinking of:

  1. HTTP Event Collector
  2. Monitoring the local directories where the apps store their logs in JSON format (large files)
  3. I cannot use TCP because there is no .NET Core library for this purpose

Also, the target Splunk instance the forwarder sends data to is often offline, so the forwarder needs to buffer a large amount of logs. That's why I thought monitoring files would be the best approach here, but I'm not sure.


nickhills
Ultra Champion

Option 2 is your best bet.

Inputs defined with a monitor stanza will simply pause if the indexer is offline and resume when it comes back.
That means you only need enough space on the Raspberry Pi to store the logs; you don't need to worry about buffers or persistent queues.
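For example, a monitor stanza in inputs.conf on the forwarder could look something like the sketch below. The path, index, and sourcetype are just placeholders for illustration; adjust them to wherever your apps actually write their JSON logs.

    # $SPLUNK_HOME/etc/system/local/inputs.conf (or in an app's local/ directory)
    # Watch all JSON log files written by the app; the forwarder tracks its
    # read position and resumes from there when the indexer is reachable again.
    [monitor:///home/pi/myapp/logs/*.json]
    disabled = false
    index = main
    sourcetype = _json

The forwarder remembers how far it has read in each file, so as long as the files remain on disk until the indexer is back online, nothing is lost.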

If my comment helps, please give it a thumbs up!

