Monitoring Splunk

How to log new file every day

Glace
Explorer

Hi, I've got a task to do, but I'm a complete newbie in Splunk. So could you guys help me?

I have to send logs to Splunk which have names like "Crif.mc.Loader.log.2020-11-10", and every day a new file like this is created.

Inside the log it looks like this:
2020-11-10 23:55:51,428 INFO LoaderLogger - subj_lien
2020-11-10 23:55:51,428 INFO LoaderLogger - subj_lien_debt
2020-11-10 23:55:51,428 INFO LoaderLogger - subj_lien_deposit

So how can I easily send all these logs into Splunk every day? Could you please explain it in "Splunk for dummies" style? 😄

rnowitzki
Builder

Hi @Glace ,


Some things need to be considered:
- What kind of OS is running on the host where the logs are located?
- Is data already being sent from that device (or devices) to Splunk?
- Which version of Splunk are you running (version number)? Cloud or on-prem?
- Are Splunk and the source VM running in the same network?

But basically, the most common way is to use a Universal Forwarder and monitor the folder where these log files are located.

The time/date should be recognized by Splunk without any further configuration.
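Automatic timestamp recognition normally works fine for a format like "2020-11-10 23:55:51,428". If it ever doesn't, you can pin the format down explicitly with a props.conf stanza on the indexer. This is just a sketch; the sourcetype name "crif:loader" is a placeholder you would replace with whatever sourcetype you assign:

# props.conf -- explicit timestamp parsing (sourcetype name is hypothetical)
[crif:loader]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S,%3N
MAX_TIMESTAMP_LOOKAHEAD = 25

TIME_FORMAT uses strptime-style codes; %3N matches the three-digit milliseconds after the comma.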

Install this:
https://www.splunk.com/en_us/download/universal-forwarder.html

Configure as described here:
https://docs.splunk.com/Documentation/Splunk/8.1.0/Data/Monitorfilesanddirectorieswithinputs.conf


BR
Ralph

 

--
Karma and/or Solution tagging appreciated.

Glace
Explorer

The station where the logs are located runs on Windows 10.
A forwarder is already there, but I don't know how to configure it for this specific input.
The main Splunk server also runs on Windows 10.
We have our own server here, so the Splunk instance we are running is on-prem, in the same network as the forwarder client.


rnowitzki
Builder

Alright, the quick-and-dirty way is to add the following stanza to the file

$SPLUNK_HOME/etc/system/local/inputs.conf

and restart the forwarder

[monitor://C:\path\to\your\logfile\]
disabled = 0
index = <indexname>
sourcetype = <sourcetype>
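Since the filename changes every day (Crif.mc.Loader.log.2020-11-10, and so on), you can monitor the directory and use a whitelist so the input only picks up those daily Loader files. A sketch, with the path and the index/sourcetype still placeholders:

# inputs.conf -- monitor the directory, match only the daily Loader logs
[monitor://C:\path\to\your\logfile]
disabled = 0
index = <indexname>
sourcetype = <sourcetype>
whitelist = Crif\.mc\.Loader\.log\.\d{4}-\d{2}-\d{2}$
# optional: skip files older than 30 days on the first scan
ignoreOlderThan = 30d

The whitelist is a regex matched against the full file path, so the date pattern keeps other files in the same folder out of the input.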

 
Note:

  • The index needs to exist in Splunk, and it should reflect the data it contains. Maybe there is already an index that fits the data; if not, you would have to create one (another topic). You could check the stanzas that are already in inputs.conf, or run a search like index=* | stats count by index, sourcetype (not in verbose mode, and only over a timeframe of a few hours) to get a feeling for how the data is set up in your environment.
  • The sourcetype is your choice, but again it should relate to the data.
    • Example: when adding network devices, you could call the index "dell" and the sourcetype "dell:switches". Not sure what kind of logs you are ingesting.
  • Are you the admin of the Splunk environment? I would suggest at least taking the Fundamentals I & II courses.
  • If you are not the admin, ask them which index and sourcetype you should choose. They will probably also want to create an app for the input instead of adding the stanza to the "main" inputs.conf.
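Once the stanza is in place and the forwarder restarted, a quick sanity check from the Splunk search bar confirms events are arriving (substitute the index and sourcetype you chose):

index=<indexname> sourcetype=<sourcetype> earliest=-15m
| head 10

If nothing comes back, check the forwarder's splunkd.log for errors on the monitor input before digging further.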

 

Hope this helps.
BR
Ralph

--
Karma and/or Solution tagging appreciated.