Getting Data In

Indexing all text log files in a directory

ankit
Explorer

I have a directory with about 750 log files. The files are all text files and the total size of this directory is 117 GB. I need to index the files once (not continuously). 

Would the best option to index all the files in the directory be to copy the files to my Splunk instance and then use the directory input option from Splunk Web, i.e. Settings > Data Inputs > Files & Directories?

Any other recommended options?


gcusello
SplunkTrust

Hi @ankit,

If you already have a forwarder on the system where the files are stored, you can read them in there; that way you take advantage of Splunk's compression between forwarder and indexer and let Splunk do the work for you.

As you can imagine, it will take a long time!

If you don't have a forwarder on that system, I suggest compressing your files, copying them to a folder on the indexer, then uncompressing and indexing them as you said.

The second way is probably quicker, but the first is easier.

Ciao.

Giuseppe
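For the forwarder route, a one-time directory input can be sketched in the forwarder's inputs.conf. This is a minimal sketch: the directory path, index name, and sourcetype below are placeholders, not values from this thread.

```ini
# $SPLUNK_HOME/etc/system/local/inputs.conf on the forwarder
# (path, index, and sourcetype are placeholders -- adjust for your environment)
[monitor:///var/log/myapp]
disabled = false
index = main
sourcetype = my_app_logs
# only pick up .log files in the directory
whitelist = \.log$
```

After editing, restart the forwarder (or reload the input) so it begins tailing the directory.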


ankit
Explorer

Thanks @gcusello @aasabatini. I will copy the compressed directory to the server, uncompress the files, and select the Directory input method from Splunk Web.


ankit
Explorer

I went ahead and tried a different approach. I had each of the 749 files compressed as .log.gz files beforehand, so I just copied those compressed files to a directory on my Splunk instance and added a directory input. Worked like a charm!
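For anyone repeating this, the pre-compression step is just a loop over the files; Splunk's file inputs decompress .gz files automatically when indexing. The source and staging paths below are placeholders, not the paths used in this thread.

```shell
#!/bin/sh
# Compress each .log file in SRC and stage the .gz copies in DEST,
# which is then added as a directory input in Splunk Web.
# SRC and DEST are placeholder paths -- adjust for your environment.
SRC=/data/raw_logs
DEST=/opt/staging/logs
mkdir -p "$DEST"
for f in "$SRC"/*.log; do
    # gzip -c writes to stdout so the original file is left untouched
    gzip -c "$f" > "$DEST/$(basename "$f").gz"
done
```

Compressing before the copy mainly saves transfer time; Splunk indexes the decompressed contents either way.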



aasabatini
Motivator

Hi @ankit 

You can collect all the logs from your directory.

Go to Settings > Data Inputs > Files & Directories and click the button shown below:

[screenshot: aasabatini_0-1620626236646.png]

Add your path in the form and select "Index Once".

Example path: /var/log

[screenshot: aasabatini_1-1620626588468.png]

Considerations

You can collect more than 100 GB only if your license allows it. Otherwise, if you violate the license several times, Splunk search will be blocked.
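The "Index Once" choice in the UI can also be expressed directly in inputs.conf as a batch input, which reads each file once and then deletes it (sinkhole is the only supported move_policy, so point it at copies of your data, never the originals). The path and index below are placeholders.

```ini
# $SPLUNK_HOME/etc/system/local/inputs.conf
# batch input: index each file once, then delete it -- use copies of the data
[batch:///opt/staging/logs]
disabled = false
move_policy = sinkhole
index = main
```

For a single file, the CLI alternative is `splunk add oneshot /path/to/file -index main`.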

“The answer is out there, Neo, and it’s looking for you, and it will find you if you want it to.”