Splunk Search

How to edit and optimize my search to calculate the average and format top ten results?

atanasmitev
Path Finder

I have a working search that calculates total hits, avg(per_hour), avg(per_minute), and the top 10 IPs with their counts. Now I have a couple of problems that need optimizing or fixing:

  source="mysource" "POST url" evttype=INFO | bucket _time span=24h | stats dc(clientIP) as Unique_IP_Hits, count as Total_Hits by _time  |  eval avg_perhour=(Total_Hits/24) | eval avg_perminute=(avg_perhour/60) | table _time, Total_Hits, Unique_IP_Hits, avg_perhour, avg_perminute| appendcols [searchsource="mysource" "POST url" evttype=INFO | bucket _time span=24h | top clientIP limit=5 by _time| sort -count| eval pair=clientIP." (".count.")" | stats values(pair) as TOP10_IP]

This returns :

Time     Total_Hits         Unique_IP_Hits      perhour       perminute   Top10

Now here are my concerns where I need help:
1) The per-hour/per-minute averages are calculated statically, i.e. (total/24), so if the span is less than 24 hours it won't work. There is a | timechart span=Xh per_hour(total) command that I can't get to work with this search (see the sketch after this list).
2) The TOP10 value only fills the upper-right cell, so I need to either merge all of the rightmost cells or find a way to get a top 10 per day.
3) Optional: if 1) and 2) can be made to work, is there a shorter/nicer way of rewriting the search?
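For reference, a rough sketch of the direction I was trying with timechart (untested; the hit=1 helper field is just an assumption to give per_hour()/per_minute() something to rate over each bucket):

 source="mysource" "POST url" evttype=INFO 
 | eval hit=1 
 | timechart span=24h sum(hit) as Total_Hits, dc(clientIP) as Unique_IP_Hits, per_hour(hit) as avg_perhour, per_minute(hit) as avg_perminute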

I know it's a lot of work and I don't expect a prompt response, but I will be glad if you could help.

Thanks in advance,

1 Solution

somesoni2
Revered Legend

Give this a try

 source="mysource" "POST url" evttype=INFO 
 | eventstats max(_time) as max min(_time) as min by _time | eval durationSec=max-min
 | bucket _time span=1d | stats count , first(durationSec) as durationSec by _time, clientIP 
 | eventstats sum(count) as Total_Hits count as Unique_IP_Hits by _time
 | eval pair=clientIP." (".count.")"
 | sort 0 _time -count | streamstats count as sno by _time | where sno<6
 | stats first(Total_Hits) as Total_Hits, first(Unique_IP_Hits) as Unique_IP_Hits,
   values(pair) as TOP10_IP, first(durationSec) as durationSec by _time
 | eval avg_perhour=(Total_Hits/(floor(durationSec/3600))) 
 | eval avg_perminute=(Total_Hits/(floor(durationSec/3600*60))) 
 | table _time, Total_Hits, Unique_IP_Hits, avg_perhour, avg_perminute,TOP10_IP

atanasmitev
Path Finder

It would work with some fiddling: max and min come out very close, so max-min=0. However, it gives me leads on where to check.
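One possible tweak to that part (just a sketch, I haven't tested it): keep the raw timestamp before bucketing and let eventstats group by the day bucket, so durationSec becomes the spread of events within each day, with the rest of the pipeline unchanged:

 source="mysource" "POST url" evttype=INFO 
 | eval raw_time=_time 
 | bucket _time span=1d 
 | eventstats max(raw_time) as maxT min(raw_time) as minT by _time 
 | eval durationSec=maxT-minT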

I have already re-done the search using the old ideas, but I will use the version above as know-how for future cases:

source="mysource" "url" evttype=INFO | bucket _time span=24h | stats dc(clientip) as uniq, count as total by _time | timechart span=24h per_day(total) as TotalHits, per_day(uniq) as UniqueHits, per_hour(total) as PerHour per_minute(total) as PerMinute | appendcols [search source="mysource" "url" evvtype=INFO | bucket _time span=24h | top clientip limit="N" by _time| sort -count| eval pair=clientip." (".count.")" | stats values(pair) as TOP"N"_IP by _time]

It still needs some small fixes, but it looks a bit neater than the previous one, and the per_hour/per_minute values are calculated automatically.
The same trouble with span<24h remains, but I will check that 🙂
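One idea I may try for the span<24h case (a rough sketch, not tested on this data): after bucketing, divide by the hours actually elapsed in each bucket instead of a fixed 24, something like:

 source="mysource" "url" evttype=INFO 
 | bucket _time span=24h 
 | stats count as Total_Hits by _time 
 | eval elapsed_sec=if(now() < _time+86400, now()-_time, 86400) 
 | eval avg_perhour=round(Total_Hits/(elapsed_sec/3600),2) 
 | eval avg_perminute=round(Total_Hits/(elapsed_sec/60),2)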
