Splunk Search

How to use eval to find the use case?

AL3Z
Builder

Hi,
Could anyone write a query for this use case: a user triggers both alerts (alert_name="*pdm*" AND alert_name="*encrypted*") within 2 hours?

Another use case is alert_name!="*pdm*" - the user triggers an alert other than a pdm alert within 2 hours.

I need a query for each of the above use cases.
Thanks...


ITWhisperer
SplunkTrust

You could start by counting the alerts by user over a two-hour period:

| stats count by user alert_name

Then classify the alerts:

| eval alert_type=case(like(alert_name,"%pdm%"), "pdm", like(alert_name,"%encrypted%"), "encrypted", 1==1, "notpdm")
| chart count by user alert_type
| where (pdm > 0 AND encrypted > 0) OR notpdm > 1
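
As a minimal end-to-end sketch (assuming the index=es sourcetype=alert base search that appears later in this thread - adjust to your data), with the two-hour requirement handled by searching over the last two hours:

index=es sourcetype=alert earliest=-2h
| eval alert_type=case(like(alert_name,"%pdm%"), "pdm", like(alert_name,"%encrypted%"), "encrypted", 1==1, "notpdm")
| chart count by user alert_type
| where (pdm > 0 AND encrypted > 0) OR notpdm > 1

To evaluate rolling two-hour windows over a longer search period, you could instead add | bin _time span=2h and switch from chart to stats so _time can be an extra split field.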

AL3Z
Builder

@ITWhisperer 

Hi, could you please make a separate search for each use case instead of all in one search?

1. Use case: alert_name="*pdm*" AND alert_name="*encrypted*" - both alerts within 2 hours.

2. Use case: alert_name!="*pdm*" - within a 2-hour period.

Thanks


ITWhisperer
SplunkTrust
For the first use case (both pdm and encrypted alerts):

| eval alert_type=case(like(alert_name,"%pdm%"), "pdm", like(alert_name,"%encrypted%"), "encrypted", 1==1, "notpdm")
| chart count by user alert_type
| where pdm > 0 AND encrypted > 0

For the second use case (non-pdm alerts):

| eval alert_type=case(like(alert_name,"%pdm%"), "pdm", like(alert_name,"%encrypted%"), "encrypted", 1==1, "notpdm")
| chart count by user alert_type
| where notpdm > 1
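
If the first use case means the two alerts must occur within two hours of each other (not just within the search window), one possible sketch - assuming the index=es sourcetype=alert base search used later in this thread - compares the earliest time of each alert type per user:

index=es sourcetype=alert (alert_name="*pdm*" OR alert_name="*encrypted*")
| eval alert_type=if(like(alert_name,"%pdm%"), "pdm", "encrypted")
| stats earliest(_time) as first_time by user alert_type
| stats range(first_time) as gap count by user
| where count==2 AND gap<=7200

Here count==2 means both alert types were seen for the user, and gap<=7200 means their first occurrences are no more than two hours (7200 seconds) apart.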

AL3Z
Builder

@ITWhisperer 

The second use case is for alert_name values other than pdm. Don't include encrypted - we are not interested in the encrypted alert_name in the second search.

alert_name!="*pdm*"

Here we are using *pdm* to match the word pdm anywhere in alert_name.

Please remove encrypted from the query - it is a different alert type. We only need alert_name!="*pdm*".

Thanks


ITWhisperer
SplunkTrust

It is a simple change - you could have done it yourself.

| eval alert_type=case(like(alert_name,"%pdm%"), "pdm", 1==1, "notpdm")
| chart count by user alert_type
| where notpdm > 1

AL3Z
Builder

@ITWhisperer ,

Hi,

In this search you didn't mention any time interval, like the two-hour period for when the user triggers the non-pdm alerts within 2 hours.


AL3Z
Builder

@ITWhisperer 
Could you please check why this query is not listing all the stats fields in the output?
index=es sourcetype=alert (alert_name!="*PDM*")
| stats earliest(_time) as incident_time,
values(severity) as severity,
values(action) as action,
values(file_type) as file_type,
values(exposure) as exposure,
values(url) as url,
values(device) as device
by user,alert_name
| eval alert_type=case(like(alert_name,"%pdm%"), "pdm", 1==1, "notpdm")
| chart count by user alert_type
| where notpdm > 1


ITWhisperer
SplunkTrust

Try it this way - NOT (alert_name="*PDM*") also includes events that don't have an alert_name field at all, whereas alert_name!="*PDM*" only matches events where the field exists:

index=es sourcetype=alert NOT (alert_name="*PDM*")

AL3Z
Builder

@ITWhisperer 

Why is it still not listing the stats fields mentioned? What could be the reason?


ITWhisperer
SplunkTrust

The chart command has two dimensions - in your case, these are user and alert_type, against which there is a count.

If you want more fields, don't use chart.

What are you attempting to show?


AL3Z
Builder

@ITWhisperer ,

Other fields as well, like dlprule, file_type, incident_time.


ITWhisperer
SplunkTrust

Don't use chart!

Try with eventstats

| eventstats count by user alert_type
| where alert_type=="notpdm" AND count > 1
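
As a sketch of how that slots into the search you posted above (same field names; adjust as needed), eventstats adds the count as an extra column instead of collapsing the other fields:

index=es sourcetype=alert NOT (alert_name="*PDM*")
| stats earliest(_time) as incident_time, values(severity) as severity, values(action) as action, values(file_type) as file_type, values(exposure) as exposure, values(url) as url, values(device) as device by user, alert_name
| eval alert_type=case(like(alert_name,"%pdm%"), "pdm", 1==1, "notpdm")
| eventstats count by user alert_type
| where alert_type=="notpdm" AND count > 1

Here count is the number of distinct non-pdm alert_name values per user, because the preceding stats has already grouped events by user and alert_name.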

AL3Z
Builder

@ITWhisperer ,

Please could you make the search for when a user triggers alerts other than pdm more than 3 times within 2 hours?

Thanks 


ITWhisperer
SplunkTrust

Change the timeframe of the search to be the time period you want.
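
For example, a minimal sketch with the two-hour window set in the search itself (rather than the time range picker) and the threshold raised to more than three non-pdm alerts per user - assuming the same index and sourcetype as above:

index=es sourcetype=alert earliest=-2h NOT (alert_name="*PDM*")
| eventstats count by user
| where count > 3

Setting the time range picker to "Last 2 hours" instead of using earliest=-2h would have the same effect.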
