Getting Data In

Indexing all text log files in a directory

ankit
Explorer

I have a directory with about 750 log files. The files are all text files and the total size of this directory is 117 GB. I need to index the files once (not continuously). 

Would the best option for indexing all the files in the directory be to copy them to my Splunk instance and then use the directory input option in Splunk Web, i.e. Settings > Data Inputs > Files & Directories?

Any other recommended options?

1 Solution

gcusello
SplunkTrust

Hi @ankit,

If you already have a Forwarder on the system where the files are stored, you can ingest them there. That way you optimize the process by using Splunk's compression features and let Splunk do the work for you.

As you can imagine, it will take a long time!

If you don't have a Forwarder on that system, I suggest compressing your files, copying them to a folder on the indexer, then uncompressing and indexing them as you said.

The second way is probably quicker, but the first is easier.
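
For reference, a minimal sketch of what the forwarder-side monitor input could look like; the directory, index, and sourcetype below are placeholders, not values from this thread:

# inputs.conf on the Universal Forwarder (hypothetical paths and names)
[monitor:///data/app_logs]
index = main
sourcetype = app_logs
# only pick up files ending in .log
whitelist = \.log$
disabled = 0

After a forwarder restart, every matching file under the directory is read and forwarded (compressed) to the indexer.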

Ciao.

Giuseppe


ankit
Explorer

Thanks @gcusello @aasabatini. I will copy the compressed files to the server, uncompress them, and use the directory input method in Splunk Web.


ankit
Explorer

I went ahead and tried a different approach. I already had each of the 749 files compressed as .log.gz files, so I just copied those compressed files to a directory on my Splunk instance and added a directory input. Worked like a charm!
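
Worth noting for anyone repeating this: a monitor input reads single gzip archives directly and decompresses them before indexing, which is why the directory input over the .gz files just works. A minimal sketch of the equivalent inputs.conf stanza on the indexer (path, index, and sourcetype are assumptions):

[monitor:///opt/splunk_import/app_logs]
index = main
sourcetype = app_logs
# only pick up the compressed log files
whitelist = \.log\.gz$
disabled = 0

If you also want the files deleted once they are indexed, a [batch://...] stanza with move_policy = sinkhole does the same one-time load and then removes them.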



aasabatini
Motivator

Hi @ankit 

You can collect all the logs from your directory.

Go to:

Settings > Data Inputs > Files & Directories

and click the button to add a new input.

[screenshot: Files & Directories data inputs page]

Add your path in the form and select "Index Once".

Example path: /var/log

[screenshot: the data input form]
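
If you prefer the command line over Splunk Web, the "Index Once" flow corresponds roughly to the splunk add oneshot command, which indexes a single file once. A hedged sketch for a whole directory; the path, index, and sourcetype are placeholders:

# Run on the Splunk server; oneshot takes one file at a time,
# so loop over the directory (hypothetical path).
for f in /var/log/*.log; do
  "$SPLUNK_HOME/bin/splunk" add oneshot "$f" -index main -sourcetype app_logs
done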

Considerations

You can index more than 100 GB only if your license is large enough.

Otherwise, if you violate the license several times, Splunk will block searching.
