Alerting

Compare two Splunk alert searches with the same time span

lamnguyentt1
Explorer

Dear Professor,

I have two alert searches like this:

1. Search 1:

index="abc" sourcetype="abc" service.name=financing request.method="POST" request.uri="*/applications" response.status="200"
|timechart span=2m count as application_today
|eval mytime=strftime(_time,"%Y-%m-%dT%H:%M")
|eval yesterday_time=strftime(_time,"%H:%M")
|fields _time,yesterday_time,application_today

And here is the output:

(screenshot: 1.png)

2. Search 2:

index="xyz" sourcetype="xyz" "Application * sent to xyz success"
|timechart span=2m count as omni_today
|eval mytime=strftime(_time,"%Y-%m-%dT%H:%M")
|eval yesterday_time=strftime(_time,"%H:%M")
|fields _time,yesterday_time,omni_today

And here is the output:

(screenshot: 2.png)

3. I tried combining the two searches like this and then calculating the spike:

index="abc" sourcetype="abc" service.name=financing request.method="POST" request.uri="*/applications" response.status="200"
|timechart span=2m count as app_today
|eval mytime=strftime(_time,"%Y-%m-%dT%H:%M")
|eval yesterday_time=strftime(_time,"%H:%M")
| append [search index="xyz" sourcetype="xyz" "Application * sent to xyz"
| timechart span=2m count as omni_today]
|fields _time,yesterday_time,app_today,omni_today
|eval spike=if(omni_today < app_today AND _time <= now() - 3*60 AND _time >= relative_time(now(),"@d") + 7.5*3600, 1, 0)

Here is the output:

(screenshot: 3.png)

But it shows two time spans (as in the image).

How can I combine the two searches so there is only one time span, like this?

(screenshot: 4.PNG)

 

Thank you for your help.

 

 

1 Solution

PickleRick
SplunkTrust

Well, you simply asked Splunk to append the results of one search to the results of another search. So Splunk did exactly that: it took the result rows of one search and "glued" them onto the end of the other.
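
To see that stacking behaviour in isolation, here is a minimal, self-contained sketch (the field names are made up purely for the illustration):

| makeresults count=3
| streamstats count as n
| eval which="outer"
| append [| makeresults count=3 | streamstats count as n | eval which="subsearch"]

This returns six rows: the three from the outer search followed by the three from the subsearch. Applied to your two timecharts, every 2-minute bucket shows up in two separate rows (one carrying app_today, one carrying omni_today), which is the duplicated time span in your screenshot.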

You could transform the combined results further, but it's better to avoid the append in the first place (subsearches have their own limits and can trigger some tricky behaviour).

Since you're doing mostly the same thing with both sets of data, you can simply retrieve all the events in one search and then calculate separate counts for each kind of event:

(index="abc" sourcetype="abc" service.name=financing request.method="POST" request.uri="*/applications" response.status="200") OR ( index="xyz" sourcetype="xyz" "Application * sent to xyz")
|timechart span=2m count(eval(sourcetype="abc")) as app_today count(eval(sourcetype="xyz")) as omni_today
|eval mytime=strftime(_time,"%Y-%m-%dT%H:%M")
|eval yesterday_time=strftime(_time,"%H:%M")
|fields _time,yesterday_time,app_today,omni_today
|eval spike=if(omni_today < app_today AND _time <= now() - 3*60 AND _time >= relative_time(now(),"@d") + 7.5*3600, 1, 0)
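
If this runs as a scheduled alert, one option (a sketch, not part of the original answer) is to keep only the spiking buckets and let the alert trigger on a "Number of Results is greater than 0" condition, by appending one more line after the eval above:

| where spike=1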

 


lamnguyentt1
Explorer

Thank you for your help.

In my case, it was easy to fix by changing "append" to "appendcols".
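
For reference, that appendcols variant would look roughly like this (a sketch based on the searches above; appendcols attaches the subsearch's columns to the main results row by row, so with the same time range and the same 2-minute span both counts end up on the same rows):

index="abc" sourcetype="abc" service.name=financing request.method="POST" request.uri="*/applications" response.status="200"
| timechart span=2m count as app_today
| eval yesterday_time=strftime(_time,"%H:%M")
| appendcols [search index="xyz" sourcetype="xyz" "Application * sent to xyz"
    | timechart span=2m count as omni_today]
| fields _time,yesterday_time,app_today,omni_today
| eval spike=if(omni_today < app_today AND _time <= now() - 3*60 AND _time >= relative_time(now(),"@d") + 7.5*3600, 1, 0)

Keep in mind that appendcols matches rows by position rather than by _time, and the subsearch is still subject to subsearch limits, so the single-search approach in the accepted answer is generally the safer choice.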
