Splunk Search

How to artificially insert event based on an existing event value?


I have a log file that admins write to when they start or stop their server maintenance.
This is then used to silence email alerts so admins do not get email alerts while they are doing server maintenance.
When an admin starts server maintenance, they write "start of maintenance..." into a specific log file (the source).
When the admin stops server maintenance, they write "end of maintenance..." to that same file.

However, since the email alert silencing resets itself after a period (4 hours) from when Splunk reads the "start of maintenance..." entry, some admins will "forget" to write the "end of maintenance..." entry to this file.

I need each "start of maintenance..." entry to have a corresponding "end of maintenance..." entry.
If I only have a "start of maintenance...", then I must use SPL to insert an event that has "end of maintenance..." and whose _time (or another time-related field) is the _time of the "start of maintenance..." plus 4 hours.
So for example, if the "start of maintenance..." _time is 2022/08/05 16:00:00, then I must create an event with a _time (or a time field) of 2022/08/05 20:00:00.
If there is a corresponding "end of maintenance..." within 4 hours of a "start of maintenance...", then I should do nothing.

My ultimate goal is to create a dashboard with results filtered by the "start of maintenance..." _time and the "end of maintenance..." _time, but to do this I first have to make sure I have both "start of maintenance..." and "end of maintenance..." _time values.



<your search> [<your maintenance log search>
  ``` Use mvrange to duplicate start events ```
  | eval range=if(match(event,"start"),mvrange(0,2),null())
  | mvexpand range
  ``` Change duplicated event to 4 hours later ```
  | eval _time=if(range=1,_time+(60*60*4),_time)
  ``` Make duplicated event an end of maintenance event ```
  | eval event=if(range=1,"end of maintenance",event)
  ``` Sort in descending time order (latest first) ```
  | sort 0 -_time
  ``` Set latest to the time of end events ```
  | eval latest=if(match(event,"end"),_time,null())
  ``` Copy latest time to next event ```
  | filldown latest
  ``` Just keep start events (now with time of next end event) ```
  | where match(event,"start")
  ``` Assuming you want just the latest maintenance period ```
  | head 1
  | rename _time as earliest
  ``` Use earliest and latest to filter your main search ```
  | fields earliest latest]
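
To connect this to the outer search: when a subsearch returns only the fields earliest and latest, Splunk expands them into time-range terms for the main search. As a rough sketch of how that expansion behaves, if the maintenance window starts at 2022/08/05 16:00:00 (epoch values here assume UTC, and the index name is a placeholder, not something from your environment), the outer search effectively becomes:

index=server_logs earliest=1659715200 latest=1659729600

so only events inside the maintenance window are returned by the main search.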


Similar to the idea presented in your other thread, if you can sacrifice performance, you can use transaction.  For the 4-hour limit, it has a maxspan option.  Note that keepevicted=true is needed so that transactions without an end event are kept in the output and flagged with closed_txn=0, e.g.,

| transaction startswith="start of maintenance" endswith="end of maintenance" maxspan=4h keepevicted=true
| search closed_txn=0
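
Building on that transaction output, here is a sketch of how you could synthesize the missing end time, since _time of a transaction is the time of its earliest (start) event; the field names synthetic_end and end_time_str are my own, not anything standard:

| transaction startswith="start of maintenance" endswith="end of maintenance" maxspan=4h keepevicted=true
| search closed_txn=0
| eval synthetic_end=_time+(60*60*4)
| eval end_time_str=strftime(synthetic_end, "%Y/%m/%d %H:%M:%S")

You can then use _time and synthetic_end as the start/end pair that your dashboard filters on.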