Getting Data In

HTTP Event Collector vs monitor directory

signumpl
Engager

I have the Splunk Universal Forwarder installed on a Raspberry Pi, along with a couple of apps whose logs I want to send to the forwarder. What is the best and most efficient way to do this? I was thinking of:

  1. HTTP Event Collector
  2. Monitor the local directories where the apps store their logs in JSON format (large files)
  3. I cannot use TCP because there is no .NET Core library for this purpose

Also, the target Splunk instance the forwarder sends data to is often offline, so the forwarder needs to buffer a large amount of logs. That's why I thought monitoring files would be the best approach here, but I'm not sure.


nickhills
Ultra Champion

Option 2 is your best bet.

Inputs defined with a monitor stanza will simply pause if the indexer is offline and resume when it is back.
That means you only need enough space on the Raspberry Pi to store the logs; you don't need to worry about buffers or persistent queues.
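For reference, a minimal monitor stanza in the forwarder's inputs.conf could look something like the sketch below; the path, index, and sourcetype are just placeholders for your environment:

    # Monitor the directory where the apps write their JSON logs (example path)
    [monitor:///var/log/myapp]
    disabled = false
    # Pretrained JSON sourcetype; swap in whatever index/sourcetype you actually use
    sourcetype = _json
    index = main

The forwarder keeps track of how far it has read into each file, so when the indexer comes back online it simply picks up where it left off.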

If my comment helps, please give it a thumbs up!

