Monitoring Splunk

How can I improve indexing performance when Splunk is monitoring a huge number of files?

Takajian
Builder

My application generates around 100,000 files per day. I tested indexing them with file monitoring, but it took almost a week. Does anybody know the cause of this issue and a possible solution? I think performance would improve if the number of active log files were smaller, but I cannot change that in our environment.
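For reference, a setup like this is typically driven by a monitor stanza in inputs.conf on the forwarder; the path and sourcetype below are only hypothetical placeholders, not taken from the original post:

    # inputs.conf on the forwarder (hypothetical example)
    [monitor:///var/log/myapp]
    recursive = true
    sourcetype = myapp

With roughly 100,000 files per day, the tailing processor has to track state for every file matching such a stanza, which is typically where the bottleneck appears at very large file counts.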


sbrant_splunk
Splunk Employee

It is possible to run multiple instances of the Splunk forwarder on one host, which is particularly easy on *nix systems. With this number of files, you may want to investigate that approach.
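As a rough sketch of what that could look like on a *nix host (the paths, port, and directory split below are assumptions for illustration, not settings from this thread): a second Universal Forwarder lives in its own SPLUNK_HOME, gets a non-conflicting management port, and monitors a subset of the directories.

    # Hypothetical second forwarder instance in /opt/splunkforwarder2

    # /opt/splunkforwarder2/etc/system/local/web.conf
    [settings]
    mgmtHostPort = 127.0.0.1:8090    # avoid clashing with the first instance's default 8089

    # /opt/splunkforwarder2/etc/system/local/inputs.conf
    [monitor:///var/log/myapp/partition2]
    sourcetype = myapp

    # Start each instance from its own installation directory
    /opt/splunkforwarder/bin/splunk start
    /opt/splunkforwarder2/bin/splunk start

Each instance then tails only its share of the files, so the per-instance file count stays manageable.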


sbrant_splunk
Splunk Employee

There is no hard limit, and it will require some testing to find the breaking point. This question asks essentially the same thing:

http://splunk-base.splunk.com/answers/57806/is-there-a-limit-on-the-number-of-files-a-forwarder-can-...

How are the files being written now? Are they in separate directories? Is there a naming convention?
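The layout and naming convention matter because, if active files can be told apart from rotated or finished ones by name or age, whitelist/blacklist and ignoreOlderThan in the monitor stanza can keep the forwarder from tracking files it no longer needs to read. A hypothetical example (patterns and paths are illustrative only):

    # inputs.conf (hypothetical) - tail only active logs, skip rotated ones
    [monitor:///var/log/myapp]
    whitelist = \.log$
    blacklist = \.(gz|zip|bak)$
    ignoreOlderThan = 1d
    sourcetype = myapp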


macnica
Engager

Is there any sizing guideline for the number of files a forwarder can monitor? I would like to plan how many forwarders are required for my case.
