Getting Data In

How can I automate indexing of logs into Splunk Web?

sahoo0233
Path Finder

Hi everyone,

My everyday process is to upload logs to Splunk Web, generate a report, and analyse it.

First, logs are delivered by a third party to our internal Splunk server (a Unix box). From there I extract the required data and make duplicate logs.
Then I sign in to Splunk Web and use Add Data to upload all the duplicate logs (indexing).

All of this is a manual process. How can I automate it on the server (Unix box) so that the logs are indexed into Splunk automatically?


jackson1990
Path Finder

Hi @sahoo0233,
To automate indexing of logs, follow the steps below:
* Create a file named inputs.conf in $SPLUNK_HOME/etc/system/local/
* Add the following stanza to the file:

[monitor://path]
disabled=false
followTail=true
host={YOUR-HOST-NAME}
index={Index-Name}
sourcetype={SourceType-Name}

(Include the index line only if you have created an index for your requirement; otherwise skip it and the data will go to the default index.)

For example:

[monitor:///opt/IBM/WebSphere/Plugins/logs/server.log]
disabled=false
followTail=true
host=sample-pc
index=public60
sourcetype=serverLog

After making the above configuration, restart Splunk with the splunk restart command in $SPLUNK_HOME/bin/.
Hope this is what you are expecting. Let me know if you face any problems.
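
As a quick sanity check after the restart, a search like the one below should return the newly indexed events. It reuses the example index and sourcetype from the stanza above; substitute your own values.

```
index=public60 sourcetype=serverLog | head 10
```

If nothing comes back after a few minutes, check $SPLUNK_HOME/var/log/splunk/splunkd.log for errors about the monitored path.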


sahoo0233
Path Finder

Hi Jack, thanks for your solution. I will try it and let you know soon.

Is it necessary to add the index in the config?


jackson1990
Path Finder

If you have created a separate index, you can add that line; otherwise you can skip it.


sahoo0233
Path Finder

Hi jackson,

Thanks dude, this is working fine.

Lastly, I have one more doubt. Previously I uploaded logs manually, so I could find those log files in Splunk Web. Now, when we place a file in the specific directory, it gets indexed automatically. If I delete the files from that directory, will my indexed data be removed from Splunk, or will it not be affected?


jackson1990
Path Finder

Hi @sahoo0233,
Can you please accept the answer if the indexing through the inputs.conf file is working fine?


sahoo0233
Path Finder

Hi jackson, Thanks for the help. I have accepted the answer.

I have another query that has been pending for many days; could you please try to help:

http://answers.splunk.com/answers/241211/why-is-my-scheduled-alert-for-my-dashboard-failing.html#com...


laserval
Communicator

The data in Splunk will stay even if you remove the files that were indexed.

The only ways to remove data from Splunk are to pipe the results of a search to the delete command (as a user with the can_delete role), or to remove the index itself.
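
For example, a search like the following would mark matching events as deleted when run by a user with the can_delete role. The source path here is a placeholder, and the index and sourcetype are taken from the example stanza earlier in the thread; substitute your own values. Note that delete only hides events from searches; it does not reclaim disk space.

```
index=public60 sourcetype=serverLog source="/path/to/removed/file.log" | delete
```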


sahoo0233
Path Finder

OK, thanks.

Previously, when I indexed manually, I had some files listed under Manager -> Data inputs -> Files & directories -> my logfile.

So if I delete "my logfile" there, will that remove the indexed data? And where exactly can I see these "my logfile" entries on the Splunk Unix box?


jackson1990
Path Finder

Hi @sahoo0233,
Removing the log file from the system after indexing won't remove the indexed data.
As @laserval mentioned, you can remove the indexed data using the delete command.


richgalloway
SplunkTrust

Rather than duplicate the logs, have Splunk monitor the directory where the third party delivers them. It's easy to do using pretty much the same steps you take with the manual process - just select the option to have Splunk watch the directory and automatically index the files it finds.

---
If this reply helps you, Karma would be appreciated.

sahoo0233
Path Finder

Hi richgalloway,

Thanks for the fast reply. I'm new to Splunk Enterprise, so I do everything manually.
Could you please elaborate on what you mean by "monitor the directory where the third party delivers them"? How does Splunk monitor it, and where do I set that up?


richgalloway
SplunkTrust

Select Settings->Data Inputs and click the Add Data button. Then:
* Choose "From files and directories".
* Browse to the location of the log files you want indexed and click Continue.
* Preview the data, make changes as required, then click Continue.
* Select the "Continuously index data from a file or directory this Splunk instance can access" option.
* Make sure the "Full path to your data" is a directory rather than a file.
* Complete the rest of the form and click Save.


sahoo0233
Path Finder

Hey, this is what I do. I'm asking how I can automate the whole process.


richgalloway
SplunkTrust

This is how you automate the process. Once Splunk is set up to monitor a directory, it doesn't need to be done again. If you're having to repeat the process then you must be doing something different.


sahoo0233
Path Finder

OK, fine, I get you; this is the process I follow.

But what I need is a way to get everything automated.

When the third party delivers the logs to my internal Splunk server box, I'll set up a cron job for the extraction process and so on. Then, without all the manual steps like adding data, the logs should get indexed automatically. Is there anything you can suggest for this? Any ideas?


richgalloway
SplunkTrust

I think I haven't made myself clear. Once you set up Splunk to monitor the directory on your internal Splunk server box where the third party places the logs, you shouldn't need the cron job (unless the job does things Splunk cannot). You've essentially told Splunk 'keep an eye on this directory and index any file that is placed there'.

This can also be done by editing the inputs.conf file, but I don't recommend that for novices.


sahoo0233
Path Finder

Thanks for your continuous support.

OK, I did the same thing you described; in fact, I do it every day. Even when I check "Continuously index data from a file or directory this Splunk instance can access", I still need to upload manually every day. It doesn't pick up files automatically.

So is there a problem with my Splunk? Or is there something I can do with inputs.conf?


richgalloway
SplunkTrust

Something is wrong if you have to do this every day. Is the directory name changing each time? I would expect the third party is putting log files in the same place every time. If so, Splunk should be able to find that location without difficulty.

Assuming you run Splunk on a Linux server and put all log files in the same directory, you would add a stanza like this to your inputs.conf file.

[monitor:///opt/splunk/etc/apps/sample_app/logs]
index=sample
sourcetype=sendmail

You must always restart Splunk after editing a .conf file to make the changes take effect.


sahoo0233
Path Finder

The third party delivers logs to a directory (say "s_test"); I extract the required files and then move the original files to backup.
So I upload just the duplicate logs. As you said, this should happen automatically, but even when the third party delivers logs to s_test, it doesn't index them. I'll try adding it to inputs.conf.

So I just need one more bit of help.

For example, I have a log named "myfiltered_yyyy_mm_dd",

so in inputs.conf I'll add it the below way:

[monitor:///directory that needs to be monitored(location of myfiltered_yyyy_mm_dd)]
index= (what should I give here?)
sourcetype= (what should I give here?)


richgalloway
SplunkTrust

Ideally, your stanza name would be [monitor:///s_test], but you could also set it to the output directory of your cron job.

Set the sourcetype attribute to the same value you use when you work through Splunk Web manually (the "Set source type" dialog box). I prefer to use a test index until I've worked out all the kinks in my config, then I set it to something else later.
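
Putting that together, a sketch of the stanza might look like the one below. The index name, sourcetype, and the whitelist pattern for the dated filenames are assumptions; adjust them to your environment. (whitelist takes a regular expression that is matched against the full file path, so only the myfiltered_yyyy_mm_dd files are picked up.)

```
[monitor:///s_test]
whitelist = myfiltered_\d{4}_\d{2}_\d{2}
index = test
sourcetype = myfiltered
disabled = false
```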
