Getting Data In

HTTP Event Collector vs. monitoring a directory

signumpl
Engager

I have the Splunk Universal Forwarder installed on a Raspberry Pi, along with a couple of apps whose logs I want to send to the forwarder. What is the best and most efficient way to do this? I was thinking of:

  1. HTTP Event Collector
  2. Monitoring local directories where the apps store their logs in JSON format (large files)
  3. I cannot use TCP because there is no .NET Core library for this purpose

Also, the target Splunk instance to which the forwarder sends data is often offline, so the forwarder needs to buffer a large amount of logs. That's why I thought monitoring files would be the best approach here, but I'm not sure.


nickhills
Ultra Champion

Option 2 is your best bet.

Inputs using a monitor stanza will simply pause if the indexer is offline and resume when it is back online.
That means you only need enough space on the Raspberry Pi to store the logs; you don't need to worry about buffers or persistent queues.
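
For reference, a minimal monitor stanza in inputs.conf on the forwarder could look something like this sketch (the path, sourcetype and index below are placeholders I picked for the example, not values from your setup):

    # $SPLUNK_HOME/etc/system/local/inputs.conf on the Universal Forwarder
    # Path, sourcetype and index are example values - adjust to your environment
    [monitor:///var/log/myapp/*.json]
    sourcetype = _json
    index = main
    disabled = false

The forwarder keeps track of how far it has read in each file, so as long as the files are still on disk when the indexer comes back, nothing is lost while it is down.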

If my comment helps, please give it a thumbs up!
