
See the results of Splunk alerts in a search.

Aresndiz
Explorer

I'm trying to optimize my alerts because I'm having issues. Where I work, resolving the underlying problem is somewhat slow (1 to 3 days) once an alert is triggered, which causes the alert to fire repeatedly during that time. I can't use Throttle, since my alerts do not depend on a single host or event. For example:

index=os_pci_windowsatom host IN (HostP1 HostP2 HostP3 HostP4) source=cnt_mx_pci_sql_*_status_db
| dedup 1 host state_desc
| streamstats values(state_desc) as State by host
| eval Estado=case(
    State!="ONLINE", "Critico",
    State="ONLINE", "Safe")
| table Estado host State _time
| where Estado="Critico"

When the status of a host changes to critical, it triggers the alert. For this reason I cannot use Throttle: during the time span in which the alert is silenced, another host may go critical and that trigger would be missed entirely.

My idea is to create logic based on the results of the last triggered alert and compare them with the current alert: if the host and status are the same, nothing changes, but if the host and status are different from the previous trigger, the alert should fire. I thought about using the data where the triggered alerts are stored, but I don't know how to search for that information. Does anyone have an idea?
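Roughly what I have in mind, as a sketch only (last_alert_results.csv is a hypothetical lookup that would somehow have to hold the previous run's results):

index=os_pci_windowsatom host IN (HostP1 HostP2 HostP3 HostP4) source=cnt_mx_pci_sql_*_status_db
| dedup 1 host state_desc
| streamstats values(state_desc) as State by host
| eval Estado=case(State!="ONLINE", "Critico", State="ONLINE", "Safe")
| where Estado="Critico"
``` compare against the previous run stored in the hypothetical lookup ```
| join type=left host [
    | inputlookup last_alert_results.csv
    | rename Estado as Estado_prev
    | fields host Estado_prev ]
``` keep only hosts that were not already critical last time ```
| where isnull(Estado_prev) OR Estado_prev!="Critico"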

Any comment is greatly appreciated.

1 Solution

Aresndiz
Explorer

My solution was to configure a second alert that writes the status results of the first alert to a lookup. I then created a logic rule: if the first alert returns a result that differs from what the second alert stored, it is triggered.

| eval Estado=case(
    State="Offline", "Critico",
    State="EnSplunk", "Safe")
``` pull in the previous run's results from the lookup maintained by the second alert ```
| join type=left host [
    | inputlookup lkp_mx_mr_pci_diponibles_results.csv
    | eval host1=host
    | eval Estado1=Estado
    | table host host1 Estado1 Servicio ]
| eval Estado2=Estado
| eval host2=host
``` true when the host was already in this state in the previous run ```
| eval case=if(host1=host2 AND Estado1=Estado2, "true", "false")
| table Estado host SO Servicio Fecha host1 host2 Estado1 Estado2 case
| sort Estado
``` keep only critical hosts whose state differs from the stored result ```
| where Estado="Critico" AND case="false"
| fields - host1 host2 case Estado1 Estado2
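
For reference, the second alert that keeps the lookup up to date can be a scheduled search that ends in outputlookup, roughly like this (a sketch only; the base search and the Servicio field are assumed from the rest of this thread):

index=os_pci_windowsatom host IN (HostP1 HostP2 HostP3 HostP4) source=cnt_mx_pci_sql_*_status_db
| dedup 1 host state_desc
| streamstats values(state_desc) as State by host
| eval Estado=case(State="Offline", "Critico", State="EnSplunk", "Safe")
``` keep only the fields the first alert joins on ```
| table host Estado Servicio
``` overwrite the lookup with the latest status per host ```
| outputlookup lkp_mx_mr_pci_diponibles_results.csv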

yuanliu
SplunkTrust

Let me first try to understand the problem: you want to find servers whose latest state is offline but whose immediately previous reported state was not offline, i.e., servers that have newly become offline. Is this correct? In other words, given these mock events

_time               host    state_desc
2024-12-20 18:00    host1   not online
2024-12-20 16:00    host2   not online
2024-12-20 14:00    host3   ONLINE
2024-12-20 12:00    host4   not online
2024-12-20 10:00    host0   not online
2024-12-20 08:00    host1   ONLINE
2024-12-20 06:00    host2   not online
2024-12-20 04:00    host3   not online
2024-12-20 02:00    host4   ONLINE
2024-12-20 00:00    host0   not online
2024-12-19 22:00    host1   not online
2024-12-19 20:00    host2   ONLINE
2024-12-19 18:00    host3   not online
2024-12-19 16:00    host4   not online
2024-12-19 14:00    host0   ONLINE
2024-12-19 12:00    host1   not online
2024-12-19 10:00    host2   not online
2024-12-19 08:00    host3   ONLINE
2024-12-19 06:00    host4   not online
2024-12-19 04:00    host0   not online
2024-12-19 02:00    host1   ONLINE
2024-12-19 00:00    host2   not online
2024-12-18 22:00    host3   not online
2024-12-18 20:00    host4   ONLINE
2024-12-18 18:00    host0   not online
2024-12-18 16:00    host1   not online
2024-12-18 14:00    host2   ONLINE
2024-12-18 12:00    host3   not online
2024-12-18 10:00    host4   not online
2024-12-18 08:00    host0   ONLINE
2024-12-18 06:00    host1   not online
2024-12-18 04:00    host2   not online
2024-12-18 02:00    host3   ONLINE
2024-12-18 00:00    host4   not online
2024-12-17 22:00    host0   not online

You want to alert on host1 and host4 only.

To do this with streamstats, you would need to sort events this way and that, which I usually consider a cost. (And I am quite fuzzy on streamstats. :-)  So, I consider this one of the few good uses of transaction. Something like

 

index=os_pci_windowsatom host IN (HostP1 HostP2 HostP3 HostP4) source=cnt_mx_pci_sql_*_status_db
``` group each host's events into transactions that end with an ONLINE event; keepevicted=true also keeps transactions that never close ```
| transaction host endswith=state_desc=ONLINE keepevicted=true
``` a still-open transaction with a single non-ONLINE event means the host has newly gone offline ```
| search eventcount = 1 state_desc != ONLINE

 

Here is an emulation of the mock data for you to play with and compare with real data.

 

| makeresults count=35
| streamstats count as state_desc
| eval _time = relative_time(_time - state_desc * 7200, "-0h@h")
| eval host = "host" . state_desc % 5, state_desc = if(state_desc % 3 > 0, "not online", "ONLINE")
``` the above emulates
index=os_pci_windowsatom host IN (HostP1 HostP2 HostP3 HostP4) source=cnt_mx_pci_sql_*_status_db 
```
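
To try it end to end, the emulation can be piped straight into the transaction commands above; on this mock data it should return only host1 and host4:

| makeresults count=35
| streamstats count as state_desc
| eval _time = relative_time(_time - state_desc * 7200, "-0h@h")
| eval host = "host" . state_desc % 5, state_desc = if(state_desc % 3 > 0, "not online", "ONLINE")
| transaction host endswith=state_desc=ONLINE keepevicted=true
| search eventcount = 1 state_desc != ONLINE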

 

Output from the search is

_time               closed_txn   duration   eventcount   field_match_sum   host    linecount   state_desc
2024-12-20 18:00    0            0          1            1                 host1   1           not online
2024-12-20 12:00    0            0          1            1                 host4   1           not online

The rest of your search is simply display-string manipulation.
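
For example, something along these lines, reusing the Estado convention from your original search (a sketch):

index=os_pci_windowsatom host IN (HostP1 HostP2 HostP3 HostP4) source=cnt_mx_pci_sql_*_status_db
| transaction host endswith=state_desc=ONLINE keepevicted=true
| search eventcount = 1 state_desc != ONLINE
``` every row that survives the filter is a newly offline host ```
| eval Estado="Critico"
| table Estado host state_desc _time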
