Getting Data In

HTTP Event Collector vs. monitor directory

signumpl
Engager

I have the Splunk Universal Forwarder installed on a Raspberry Pi, along with a couple of apps from which I want to send logs to the forwarder. What is the best and most efficient way to do this? I was considering:

  1. HTTP Event Collector
  2. Monitor the local directories where the apps store their logs in JSON format (large files)
  3. I cannot use TCP because there is no .NET Core library for this purpose

Also, the target Splunk instance to which the forwarder sends data is often offline, so the forwarder needs to buffer a large amount of logs. That's why I thought monitoring files would be the best approach here, but I'm not sure.


nickhills
Ultra Champion

Option 2 is your best bet.

Inputs using a monitor stanza will simply pause if the indexer is offline and resume when it is back.
That means you only need enough space on the Raspberry Pi to store the logs; you don't need to worry about buffers or persistent queues.
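As a minimal sketch, the forwarder config could look something like this (the path, sourcetype name, and index below are placeholders for whatever your apps actually write, not values from your setup):

# inputs.conf on the Universal Forwarder
[monitor:///var/log/myapp/*.json]
sourcetype = myapp:json
index = main
disabled = 0

# props.conf on the forwarder, so the structured JSON is parsed there
[myapp:json]
INDEXED_EXTRACTIONS = json

The forwarder keeps track of how far it has read into each monitored file, so once the indexer is reachable again it simply carries on from where it stopped.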

If my comment helps, please give it a thumbs up!
