Getting Data In

Monitor mode failed silently on a group of large files. When do you switch to batch mode and how does it differ?

thisissplunk
Builder

My small environment is one search head/indexer box, one indexer peer box, and one forwarder.

I placed about 30 GB worth of .gz logs (15 files total) into the monitor directory on the forwarder. Splunkd.log said it handled, read, and processed each log correctly. However, only about half of the files (sources) actually made their way to the indexers. Why is this? The workaround was to copy each file into the monitor directory manually, wait for it to finish processing, then copy the next one in.
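For reference, the monitor stanza on the forwarder looks roughly like this (path, index, and sourcetype below are placeholders, not the real values):

    [monitor:///opt/ingest/archive]
    # all 15 .gz files were dropped into this directory
    index = my_index
    sourcetype = archive_logs
    disabled = false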

Secondly, how does batch differ from monitor and would it solve this problem?

Here are some settings on the forwarder:

maxKBps = 0
max_mem_usage_mb = 200
parallelIngestionPipelines = 1
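These sit in the usual .conf files on the forwarder; a sketch of the relevant stanzas (stanza placement follows the limits.conf/server.conf spec files, the exact local layout is an assumption):

    # limits.conf
    [thruput]
    # 0 = no cap on forwarding throughput
    maxKBps = 0

    [default]
    max_mem_usage_mb = 200

    # server.conf
    [general]
    parallelIngestionPipelines = 1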
1 Solution

thisissplunk
Builder

The "failure" was due to not having the index set up on our second indexer during ingestion. We thought the forwarder would be smart enough not to send data to a peer if the index didn't exist on it.

Our fault.
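For anyone hitting the same thing, a minimal indexes.conf sketch for defining the missing index on the peer (the index name is a placeholder):

    [my_index]
    homePath   = $SPLUNK_DB/my_index/db
    coldPath   = $SPLUNK_DB/my_index/colddb
    thawedPath = $SPLUNK_DB/my_index/thaweddb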

adonio
Ultra Champion

hello there,

for the second question:
from inputs.conf.spec:

NOTE: Batch should only be used for large archives of historic data. If you
want to continuously monitor a directory or index small archives, use 'monitor'
(see above). 'batch' reads in the file and indexes it, and then deletes the
file on disk.

    [batch://<path>]
    * A one-time, destructive input of files in <path>.
    * For continuous, non-destructive inputs of files, use 'monitor' instead.
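a rough example of what that would look like (path and index are assumptions, not from your setup); note that move_policy = sinkhole is required and the files are deleted once indexed:

    [batch:///opt/ingest/archive]
    move_policy = sinkhole
    index = my_index
    disabled = false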

as for the first question,
did you get a message like "file is too large, waiting ..." on the forwarder?

hope it semi helps
