Monitoring Splunk

HeavyForwarder - Audit batch input of zip files

yko84109
Loves-to-Learn

We are monitoring a directory through a 'batch' input on a heavy forwarder (HF).
The directory contains hundreds of zip files, and each zip contains thousands of log files.
We have a requirement to record each log file separately after it has been forwarded.
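For context, a 'batch' input on the HF would look something like the sketch below; the monitored path, index, and sourcetype here are hypothetical placeholders, not values from our environment:

```
# inputs.conf on the heavy forwarder -- minimal sketch with placeholder values
# 'batch' with move_policy = sinkhole consumes and deletes files after indexing
[batch:///opt/data/zips]
move_policy = sinkhole
index = myindex
sourcetype = my_zipped_logs
```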

1. We tried to use splunkd.log on the HF.
The "ArchiveProcessor" component does not log each file within a zip.
The "Metrics" component reports statistics for only the top 10 series every 30 seconds, so it does not cover every file.
Can we change the 'Metrics' parameters to report the top X instead of the top 10?
Is there any other component we can use, even in DEBUG mode, to get that information?
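One setting that may be relevant (an assumption to verify against your Splunk version's limits.conf spec): the number of series Metrics reports per interval is controlled by maxseries in the [metrics] stanza of limits.conf, which defaults to 10. A sketch:

```
# limits.conf on the HF -- raise the number of series Metrics
# reports per 30-second interval in metrics.log (default: 10)
[metrics]
maxseries = 100
```

Note that even with a higher maxseries, metrics.log aggregates per series (e.g. per source), so it still may not emit one line per individual file inside a zip.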


2. We tried to approach this through the index itself on the indexer.
The log files' timestamps are not close to 'now' but are spread across many years, so searching for them through the index is slow and inefficient.
For example: (| tstats count where earliest=0 latest=now index=myindex by source)
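Since the bottleneck is the event-time spread, one possible workaround (a sketch, assuming the files were indexed recently) is to constrain the search by index time instead of event time, using the _index_earliest and _index_latest options that tstats supports:

```
| tstats count where index=myindex _index_earliest=-24h _index_latest=now by source
```

This scopes the scan to buckets written in the last day regardless of the events' own timestamps, which should avoid the all-time scan in the example above.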
