
Email Notification with Error Logfiles

Explorer

Hey,
I have installed Splunk and I hope I can accomplish these tasks with it. (I have never used Splunk before.)

I have some logfiles on a machine in the network. I want to analyze these logfiles with Splunk on another machine, and if there are any error messages, it should send me an email to inform me.

I have installed the Splunk forwarder on the machine with the logfiles. Now, what about the email service? Is it possible?

And if yes, how?

Note: I am a beginner with Splunk.

Thanks 🙂

1 Solution

Builder

You can use scripted inputs to do all kinds of unusual things to get data into Splunk, but the better choice is to use a universal forwarder to forward your data. Alternatively, you can transfer the logs to the local Splunk instance, or put them on a network share via Samba/NFS, etc.
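As a rough sketch of the forwarder route (the paths, sourcetype, hostname, and port below are placeholders, not from this thread): on the machine with the logfiles, the universal forwarder needs an inputs.conf stanza pointing at the log location and an outputs.conf stanza pointing at the indexing Splunk instance.

# inputs.conf on the forwarder -- monitor the directory holding the logs
[monitor:///var/log/myapp]
sourcetype = myapp_log
index = main

# outputs.conf on the forwarder -- where to send the events
[tcpout]
defaultGroup = primary_indexers

[tcpout:primary_indexers]
server = splunk-indexer.example.com:9997

The indexer side must also be configured to receive forwarded data (Settings > Forwarding and receiving; port 9997 is the conventional default).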

So in order to analyze the logs, you first need to index them into Splunk; the next step is to use a saved search and an alert with an email notification. An alert is triggered when its criteria are met.
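Before email alerts can fire, Splunk also needs to know which mail server to use. You can set this in the UI (Settings > Server settings > Email settings) or in alert_actions.conf; the hostname and sender address below are assumptions for illustration:

# alert_actions.conf -- global email settings (hostname/sender are placeholders)
[email]
mailserver = smtp.example.com:25
from = splunk@example.com
use_tls = 0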

You can configure savedsearches.conf directly, or just use the management interface.
An example of editing the savedsearches.conf file to send an email notification:

[Database Response Time Average]
action.email = 1
action.email.format = csv
action.email.sendresults = 1
action.email.subject = Splunk DB response time on online is very high > 200 sec.: $name$
action.email.to = royimad@royimad.net
action.script = 1
action.script.filename = actions.sh
alert.digest_mode = True
alert.expires = 12h
alert.severity = 5
alert.suppress = 1
alert.suppress.fields = average
alert.suppress.period = 1d
auto_summarize.dispatch.earliest_time = -1d@h
counttype = number of events
cron_schedule = */55 * * * *
enableSched = 1
quantity = 0
relation = greater than
search = host="online.wavemark.net" (LOGTYPE="DB") | stats avg(LOGDURATION) AS average | where average > 2000
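Adapted to the original question (alerting on error messages rather than response times), a minimal savedsearches.conf stanza might look like the following. The index, sourcetype, search terms, and email address are placeholders you would replace with your own:

[Errors in forwarded logs]
search = index=main sourcetype=myapp_log ("ERROR" OR "Exception")
cron_schedule = */15 * * * *
enableSched = 1
counttype = number of events
relation = greater than
quantity = 0
action.email = 1
action.email.to = you@example.com
action.email.subject = Splunk alert: errors found in application logs
action.email.sendresults = 1
alert.suppress = 1
alert.suppress.period = 1h

This runs every 15 minutes and emails you whenever the search returns more than zero events, with a one-hour suppression window so a burst of errors does not flood your inbox.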



Builder

Sure, please email me at: royimad@gmail.com


Explorer

Thanks for the answer! I'm having some problems with the forwarder... Can you give me your email, or email me? Mine is: yannik.heinz@itac.de
