Getting Data In

Indexing all text log files in a directory

ankit
Explorer

I have a directory with about 750 log files. The files are all text files and the total size of this directory is 117 GB. I need to index the files once (not continuously). 

Would the best option to index all the files in the directory be to copy the files to my Splunk instance and then use the directory input option from Splunk Web, i.e. Settings > Data Inputs > Files & Directories?

Any other recommended options?

1 Solution

gcusello
SplunkTrust
SplunkTrust

Hi @ankit,

if you already have a Forwarder on the system where the files are stored, you can ingest them there; that way you take advantage of Splunk's compression between forwarder and indexer and let Splunk do the work for you.
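A minimal sketch of the forwarder-side inputs.conf monitor stanza for this approach (the path, index, and sourcetype below are assumptions, not values from the thread):

```
# $SPLUNK_HOME/etc/system/local/inputs.conf on the Universal Forwarder
# Path, index, and sourcetype are examples; adjust to your environment.
[monitor:///data/logs]
index = main
sourcetype = app_logs
disabled = false
```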

As you can imagine, it will take a long time!

If you don't have a Forwarder on that system, I suggest compressing the files, copying them to a folder on the indexer, and then uncompressing and indexing them as you said.
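A sketch of the compress-and-copy step (the directory path and indexer hostname are examples, not from the thread):

```shell
# Compress every .log file under LOG_DIR in place; gzip replaces each
# file with a .log.gz archive. LOG_DIR is an example path.
LOG_DIR=${LOG_DIR:-/data/logs}
find "$LOG_DIR" -type f -name '*.log' -exec gzip {} +

# Then copy the archives to a staging folder on the indexer, e.g.:
# scp "$LOG_DIR"/*.log.gz splunk@indexer:/opt/splunk_staging/
```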

The second way is probably quicker, but the first is easier.

Ciao.

Giuseppe


ankit
Explorer

Thanks @gcusello @aasabatini. I will copy the compressed files to the server, uncompress them, and select the Directory input method in Splunk Web.


ankit
Explorer

I went ahead and tried a different approach. I had each of the 749 files compressed as .log.gz files beforehand, so I just copied the compressed files to a directory on my Splunk instance and added a directory input. Worked like a charm!


aasabatini
Motivator

Hi @ankit 

you can collect all the logs from your directory.

Go to

Settings > Data Inputs > Files & Directories

and create a new input.

(screenshot: creating a new Files & Directories input)

Add your path in the form and select "Index Once".

Example path: /var/log

(screenshot: entering the path and selecting Index Once)
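The "Index Once" choice above corresponds roughly to a one-shot batch input in inputs.conf; a sketch, using the /var/log path from the example and an assumed index name (note that a batch input deletes the files after indexing them):

```
# inputs.conf one-shot equivalent of "Index Once" in Splunk Web.
# move_policy = sinkhole tells Splunk to delete each file once indexed.
[batch:///var/log/*.log]
move_policy = sinkhole
index = main
```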

Considerations

You can index more than 100 GB only if you have an appropriately sized license.

If you violate the license limit several times, Splunk will block searching.

“The answer is out there, Neo, and it’s looking for you, and it will find you if you want it to.”