
HTTP Event Collector vs. monitoring a directory

signumpl
Engager

I have the Splunk Universal Forwarder installed on a Raspberry Pi, along with a couple of apps whose logs I want to send to the forwarder. What is the best and most efficient way to do this? I was thinking of:

  1. HTTP Event Collector (a rough sketch of such a request appears below)
  2. Monitoring the local directories where the apps store their logs in JSON format (large files)
  3. I cannot use TCP, because there is no .NET Core library for this purpose

Also, the target Splunk instance to which the forwarder sends data is often offline, so the forwarder needs to buffer a large amount of logs. That's why I thought monitoring files would be the best approach here, but I'm not sure.
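
For context, option 1 boils down to an HTTP POST to a Splunk HTTP Event Collector endpoint (typically /services/collector/event on port 8088) with the token passed in an Authorization header. The sketch below uses Python only to keep it short; the same request can be made from .NET Core with HttpClient. The host, token, and payload are placeholders, not values from this thread.

    import json
    import ssl
    import urllib.request

    # Hypothetical values: replace with your own HEC endpoint and token.
    HEC_URL = "https://splunk.example.com:8088/services/collector/event"
    HEC_TOKEN = "00000000-0000-0000-0000-000000000000"

    # HEC expects a JSON envelope; the "event" field carries the app's own JSON payload.
    payload = {
        "event": {"app": "my-pi-app", "level": "INFO", "message": "hello from the pi"},
        "sourcetype": "_json",
    }

    request = urllib.request.Request(
        HEC_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Authorization": f"Splunk {HEC_TOKEN}", "Content-Type": "application/json"},
        method="POST",
    )

    # Test instances often run with self-signed certificates; verification is
    # disabled here only to keep the sketch self-contained.
    context = ssl.create_default_context()
    context.check_hostname = False
    context.verify_mode = ssl.CERT_NONE

    with urllib.request.urlopen(request, context=context) as response:
        print(response.status, response.read().decode("utf-8"))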

1 Solution

nickhills
Ultra Champion

Option 2 is your best bet.

Inputs defined with a monitor stanza simply pause if the indexer is offline and resume when it is back.
That means you only need enough space on the Raspberry Pi to store the logs; you don't need to worry about buffers or persistent queues.
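
As an illustration, a monitor stanza of that kind might look like this in inputs.conf on the forwarder; the path, index, and sourcetype are assumptions, not values from this thread:

    [monitor:///var/log/myapp/*.json]
    disabled = false
    index = main
    sourcetype = _json

The forwarder remembers how far it has read into each monitored file, so once the indexer is reachable again it resumes from where it left off, as long as the files are still on disk.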

If my comment helps, please give it a thumbs up!
