Splunk Search

How to get stats from one log and display src,dest from another log where uniqueID is common

Explorer

I have a set of logs...
log1 is the task start-time log, containing taskbegin, uniqueID, src, dest; log2 is the task end-time log, containing only taskEnd, uniqueID, totalExecTime.
Now I need accurate avg and sum stats by src,dest (one start log and one end log per hour, so hardly 23/24 logs in a day).

Set1: having src=abc, dest=xyz
Set1:log1: [com.pqr.MyOwnClass] | taskbegin uniqueID=34567, src=abc dest=xyz
Set1:log2: [com.pqr.MyOwnClass] |taskEnd uniqueID=34567, totalExecTime=18765
Set1:log3: [com.pqr.MyOwnClass] | taskbegin uniqueID=41523, src=abc dest=xyz
Set1:log4: [com.pqr.MyOwnClass] |taskEnd uniqueID=41523, totalExecTime=22453
Set2: having src=mnb, dest=lop
set2:log5: [com.pqr.MyOwnClass] | taskbegin uniqueID=64563, src=mnb dest=lop
set2:log6: [com.pqr.MyOwnClass] |taskEnd uniqueID=64563, totalExecTime=54869
set2:log7: [com.pqr.MyOwnClass] | taskbegin uniqueID=79561, src=mnb dest=lop
set2:log8: [com.pqr.MyOwnClass] |taskEnd uniqueID=79561, totalExecTime=34211

Now I need accurate avg and sum stats for totalExecTime (one start log and one end log per hour, so hardly 23/24 logs in a day).

I have tried the search below, but I'm not sure it is giving accurate results:

source=xxxxxxx "com.pqr.MyOwnClass" (taskbegin OR taskEnd) 
| eval totalExecMin=round(totalExecTime/60000,1) 
| selfjoin uniqueID 
| search runTaskEnd 
| stats count avg(totalExecMin) sum(totalExecMin) by src dest

Please help me with this.


Esteemed Legend

You need to create a sessionID and use it like this:

| makeresults 
| eval raw="[com.pqr.MyOwnClass] | taskbegin uniqueID=34567, src=abc dest=xyz
[com.pqr.MyOwnClass] |taskEnd uniqueID=34567, totalExecTime=18765
[com.pqr.MyOwnClass] | taskbegin uniqueID=41523, src=abc dest=xyz
[com.pqr.MyOwnClass] |taskEnd uniqueID=41523, totalExecTime=22453
[com.pqr.MyOwnClass] | taskbegin uniqueID=64563, src=mnb dest=lop
[com.pqr.MyOwnClass] |taskEnd uniqueID=64563, totalExecTime=54869
[com.pqr.MyOwnClass] | taskbegin uniqueID=79561, src=mnb dest=lop
[com.pqr.MyOwnClass] |taskEnd uniqueID=79561, totalExecTime=34211" 
| makemv delim="
" raw 
| mvexpand raw 
| rename raw AS _raw 
| kv 
| reverse 

| rename COMMENT AS "Everything above generates sample events; everything below is your solution"

| eval totalExecMin=round(totalExecTime/60000,1) 
| streamstats count(eval(match(_raw, "taskEnd"))) AS sessionID BY uniqueID 
| stats values(*) AS * BY uniqueID sessionID 
| stats count avg(totalExecMin) sum(totalExecMin) by src dest
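Outside of Splunk, the sessionID trick above can be sketched in Python (not SPL; field names and sample values are taken from this thread). Splunk returns events newest-first, and `streamstats count(eval(match(_raw, "taskEnd"))) ... BY uniqueID` is a running count, per uniqueID, of taskEnd events seen so far, including the current one. Walking newest-first, each taskEnd bumps the counter and its matching taskbegin inherits the same value, so every begin/end pair shares one (uniqueID, sessionID) key:

```python
# Python sketch of the streamstats sessionID pairing (illustration, not Splunk).
from collections import defaultdict

# Events newest-first, as Splunk returns them; sample values from the thread.
events = [
    {"uniqueID": "34567", "type": "taskEnd",   "totalExecTime": 22453},
    {"uniqueID": "34567", "type": "taskbegin", "src": "abc", "dest": "xyz"},
    {"uniqueID": "34567", "type": "taskEnd",   "totalExecTime": 18765},
    {"uniqueID": "34567", "type": "taskbegin", "src": "abc", "dest": "xyz"},
]

end_count = defaultdict(int)   # running taskEnd count per uniqueID
sessions = defaultdict(dict)   # (uniqueID, sessionID) -> merged fields

for ev in events:
    if ev["type"] == "taskEnd":
        end_count[ev["uniqueID"]] += 1          # count includes current event
    session_id = end_count[ev["uniqueID"]]
    # merge fields, like `stats values(*) AS * BY uniqueID sessionID`
    sessions[(ev["uniqueID"], session_id)].update(ev)

for key, merged in sorted(sessions.items()):
    print(key, merged.get("src"), merged.get("dest"), merged.get("totalExecTime"))
```

Each session row now carries both the src/dest from taskbegin and the totalExecTime from taskEnd, ready for the final `stats ... by src dest`.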

Builder
source=xxxxxxx "com.pqr.MyOwnClass" (taskbegin OR taskEnd) 
| eval totalExecMin=round(totalExecTime/60000,1) 
| stats sum(totalExecMin) as totalExecMin, values(src) as src, values(dest) as dest, min(_time) as start, max(_time) as end by uniqueID
| stats count avg(totalExecMin) sum(totalExecMin) by src dest

The first stats joins the two events by uniqueID; the second runs stats across the joined events.
Alternatively, instead of the second stats line you could chart or timechart based on the start or end times.
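The two-stage stats above can be checked in Python (not SPL; the totalExecTime values are the sample data from this thread). The first pass has already collapsed each begin/end pair into one row per uniqueID; the second pass computes count, avg, and sum by (src, dest):

```python
# Python sketch of the second-stage aggregation (illustration, not Splunk).
from collections import defaultdict

# One merged row per uniqueID, as the first `stats ... by uniqueID` would produce.
rows = [
    {"uniqueID": "34567", "src": "abc", "dest": "xyz", "totalExecMin": round(18765 / 60000, 1)},
    {"uniqueID": "41523", "src": "abc", "dest": "xyz", "totalExecMin": round(22453 / 60000, 1)},
    {"uniqueID": "64563", "src": "mnb", "dest": "lop", "totalExecMin": round(54869 / 60000, 1)},
    {"uniqueID": "79561", "src": "mnb", "dest": "lop", "totalExecMin": round(34211 / 60000, 1)},
]

# Second stats: count, avg(totalExecMin), sum(totalExecMin) by src, dest.
groups = defaultdict(list)
for r in rows:
    groups[(r["src"], r["dest"])].append(r["totalExecMin"])

for (src, dest), mins in sorted(groups.items()):
    print(src, dest, len(mins), sum(mins) / len(mins), sum(mins))
```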


Explorer

The above query aggregates start time and end time across all events per uniqueID, but I don't need aggregation that collapses a whole uniqueID. The same uniqueID will have several startTime (taskbegin) and subsequent endTime (taskEnd) logs;
one uniqueID may have several sets (one per hour), like below.

I need the sum and avg of all sets (sum and avg of set1, sum and avg of set2, and so on) for every uniqueID.

Set1: having src=abc, dest=xyz
Set1:log1: [com.pqr.MyOwnClass] | taskbegin uniqueID=34567, src=abc dest=xyz
Set1:log2: [com.pqr.MyOwnClass] |taskEnd uniqueID=34567, totalExecTime=18765

Set2:log3: [com.pqr.MyOwnClass] | taskbegin uniqueID=34567, src=abc dest=xyz
Set2:log4: [com.pqr.MyOwnClass] |taskEnd uniqueID=34567, totalExecTime=22453

Set7:log13: [com.pqr.MyOwnClass] | taskbegin uniqueID=34567, src=abc dest=xyz
Set7:log14: [com.pqr.MyOwnClass] |taskEnd uniqueID=34567, totalExecTime=23133

Set3/4: having src=mnb, dest=lop
set3:log5: [com.pqr.MyOwnClass] | taskbegin uniqueID=64563, src=mnb dest=lop
set3:log6: [com.pqr.MyOwnClass] |taskEnd uniqueID=64563, totalExecTime=54869

set4:log7: [com.pqr.MyOwnClass] | taskbegin uniqueID=64563, src=mnb dest=lop
set4:log8: [com.pqr.MyOwnClass] |taskEnd uniqueID=64563, totalExecTime=34211

Set5/6: having src=wdf, dest=jcd
set5:log9: [com.pqr.MyOwnClass] | taskbegin uniqueID=76687, src=wdf dest=jcd
set5:log10: [com.pqr.MyOwnClass] |taskEnd uniqueID=76687, totalExecTime=56567

set6:log11: [com.pqr.MyOwnClass] | taskbegin uniqueID=76687, src=wdf dest=jcd
set6:log12: [com.pqr.MyOwnClass] |taskEnd uniqueID=76687, totalExecTime=23212


Builder

If the uniqueIDs are only unique per src/dest pair, and since you are only calculating your stats on totalExecTime, which only occurs when a session ends, I recommend:

source=xxxxxxx "com.pqr.MyOwnClass" (taskbegin OR taskEnd)
| stats sum(totalExecTime) as sumTotalExecTime, avg(totalExecTime) as avgTotalExecTime, values(src) as src, values(dest) as dest by uniqueID
| stats count, avg(avgTotalExecTime) as avgTotalExecTime, sum(sumTotalExecTime) as sumTotalExecTime by src dest
| eval sumTotalExecMin=round(sumTotalExecTime/60000,1), avgTotalExecMin=round(avgTotalExecTime/60000,1)

This calculates the sum and avg of all sessions tied to each uniqueID,
then aggregates those results by src and dest.
