Splunk Search

How to generate timed dummy events and compare them with real events

egonstep
Path Finder

Hello all, how do I create timed dummy events (without using `| lookup` with an external file) to compare with the real generated events and show the result as a chart?

These logs are generated every 3 hours.

_raw event example:

2017-09-04 02:07:00,630 LOG - Code for SERVICE is :1

2017-09-04 05:10:08,450 LOG - Code for SERVICE is :0

2017-09-04 11:05:44,230 LOG - Code for SERVICE is :0

And sometimes the event is not created; as the example shows, the event for 08 AM didn't occur, but I need to map it as well.

Current search:

base search
| rex  "extracted event_time from _raw"
| eval Status = case(like(_raw, "%SERVICE is :0%"), "Success", like(_raw, "%SERVICE is :%"), "Failed")
| eval _time=strftime(strptime(event_time, "%Y-%m-%d %H:%M:%S"), "%Y/%m/%d %H:%M")
| chart count(Status) over _time by Status

Desired Result:

_time             Success  Failed  No Event
2017/09/04 02:07     0       1        0
2017/09/04 05:10     0       1        0
2017/09/04 08:00     0       0        1
2017/09/04 11:05     0       1        0

I did try `| timechart`, but that method doesn't show the exact event time.

Thanks!


egonstep
Path Finder

Hello All,

So I wrote a search that returns the desired result.

base search
| rex "retrieve message from _raw "
| eval Time=strftime(_time,"%Y/%m/%d %H:%M:%S") 
| convert timeformat="%Y/%m/%d - %H" ctime(_time) as defaultDate
| table defaultDate Time message
| append
    [| gentimes start=08/01/17:14:00:00 increment=3h
     | convert timeformat="%Y/%m/%d - %H" ctime(starttime) as defaultDate
     | eval message="No Event"
     | table defaultDate message
     | where [search "base search"
              | tail 1
              | convert timeformat="%Y/%m/%d" ctime(_time) AS c_time
              | fields c_time
              | rename c_time as query] <= defaultDate]
| dedup defaultDate
| eval Time=if(isnull(Time), 'defaultDate', Time)
| eval Status = if(like(message, "%SERVICE is :0%"), "Success", if(like(message,"%SERVICE is :%"), "Failed", "No Event"))
| fields - defaultDate
| table Time Status
| chart count(Status) over Time by Status

The subsearch removes the fake events, keeping only the rows where "earliest time from the real _raw events <= defaultDate".
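For anyone puzzled by that comparison: in SPL, a subsearch whose only field is named `query` substitutes its raw result value into the outer search. So after the inner subsearch runs, the intended filter expands to roughly a literal comparison like this (the date shown is only an illustrative value):

```
| where "2017/09/04" <= defaultDate
```

Because both sides are strings in `%Y/%m/%d...` format, the lexicographic comparison lines up with chronological order.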

Feel free to improve the query.

Thanks.



egonstep
Path Finder

Thanks, yeah, I did try the `| makecontinuous` command, but it doesn't show the exact event time in the chart.
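For context, `makecontinuous` fills gaps in a numeric field at a fixed span, so the filled rows land on rounded bucket boundaries rather than on the original event times. A minimal sketch of that approach (with `base search` standing in for the real search, and assuming `_time` is still a numeric epoch value):

```
base search
| bin _time span=3h
| stats count by _time
| makecontinuous _time span=3h
| fillnull value=0 count
```

This is why the gap rows show up as 08:00-style bucket times instead of exact timestamps like 02:07.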


somesoni2
Revered Legend

Try this

base search
 | rex  "extracted event_time from _raw"
 | eval Status = case(like(_raw, "%SERVICE is :0%"), "Success", like(_raw, "%SERVICE is :%"), "Failed")
 | eval _time=strftime(strptime(event_time, "%Y-%m-%d %H:%M:%S"), "%Y/%m/%d %H:%M")
 | timechart count(Status) by Status | addtotals
 | eval "No Event"=if(Total>0, 0, 1) | fields - Total

egonstep
Path Finder

Thanks for your response. I did try your code, but `| timechart` doesn't pick up the event_time date; it returns counts of 0 for everything.
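A likely cause (my assumption, not confirmed in the thread): the `eval _time=strftime(strptime(...), ...)` step replaces `_time` with a display string, so `timechart` has no numeric timestamp left to bucket on. Keeping `_time` as an epoch value by using `strptime` alone might behave differently:

```
base search
| rex "extracted event_time from _raw"
| eval Status = case(like(_raw, "%SERVICE is :0%"), "Success", like(_raw, "%SERVICE is :%"), "Failed")
| eval _time = strptime(event_time, "%Y-%m-%d %H:%M:%S")
| timechart span=3h count(Status) by Status
| addtotals
| eval "No Event" = if(Total > 0, 0, 1)
| fields - Total
```

Even then, `timechart` will still snap rows to 3-hour bucket boundaries rather than the exact event times, which matches the limitation described above.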
