Splunk Search

calculate percentage of multiple columns and then create a chart

mmdacutanan
Explorer

I have a query (pasted below) that counts occurrences of different strings within the same field, called Variable10. Each occurrence count is then given its own field name:

index=uc sourcetype=uc_tcd DigitsDialed=12345 Code!=000000 | where not like(Variable7,"%PRE%") | dedup Variable2 | timechart span=15m
count(eval(like(Variable10,"%|T%"))) as T_Count,
count(eval(like(Variable10,"%|M%"))) as M_Count,
count(eval(like(Variable10,"%|E%"))) as E_Count,
count(eval(like(Variable10,"%|G%"))) as G_Count,
count(eval(like(Variable10,"%|P%"))) as P_Count,
count(eval(like(Variable10,"%|L%"))) as L_Count,
count(eval(like(Variable10,"%346%") OR like(Variable10,"%347%") )) as U_Count,
count(eval(not like(Variable10,"%346%") OR not like(Variable10,"%347%") OR not like(Variable10,"%|T%") OR not like(Variable10,"%|M%") OR not like(Variable10,"%|E%") OR not like(Variable10,"%|G%") OR
not like(Variable10,"%|P%") OR not like(Variable10,"%|L%"))) as X_Count
| addtotals col=f

The output looks like this:

_time T_Count M_Count E_Count G_Count P_Count L_Count U_Count X_Count Total
2018-05-30T00:00:00.000-0700 0 0 13 19 16 27 0 287 362
2018-05-30T00:15:00.000-0700 0 0 8 9 9 3 0 228 257
2018-05-30T00:30:00.000-0700 0 0 6 4 17 1 0 217 245
2018-05-30T00:45:00.000-0700 0 0 8 7 28 0 0 186 229
2018-05-30T01:00:00.000-0700 0 0 3 6 21 0 0 171 201

What I need now is to be able to calculate the percentage of each *_Count column so that it looks something like this:

_time T_Count T% M_Count M% E_Count E% G_Count G% P_Count P% L_Count L% U_Count U% X_Count X% Total
2018-05-30T00:00:00.000-0700 0 0 0 0 13 3.59 19 5.25 16 4.42 27 7.46 0 0 287 79.28 362
2018-05-30T00:15:00.000-0700 0 0 0 0 8 3.11 9 3.5 9 3.5 3 1.17 0 0 228 88.72 257
2018-05-30T00:30:00.000-0700 0 0 0 0 6 2.45 4 1.63 17 6.94 1 0.41 0 0 217 88.57 245
2018-05-30T00:45:00.000-0700 0 0 0 0 8 3.49 7 3.06 28 12.23 0 0 0 0 186 81.22 229
2018-05-30T01:00:00.000-0700 0 0 0 0 3 1.49 6 2.99 21 10.45 0 0 0 0 171 85.07 201

And then, finally, I'd like to be able to do a line chart of just the percentages, not the counts, at a 15-minute interval.

Any feedback/suggestion will help.

Thanks in advance!

1 Solution

niketn
Legend

@mmdacutanan, please pipe the following commands to your current search to get the required output. The foreach command applies the templated eval function to each of the matching fields:

<YourCurrentSearchForCountAndTotal>
|  foreach "*_Count"
    [eval "<<MATCHSTR>>_Perc%"=round(('<<FIELD>>'/Total)*100,1)]
|  table _time "*_Perc%" Total
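
To make the template concrete: for T_Count the wildcard * matches T, so <<MATCHSTR>> expands to T and <<FIELD>> expands to T_Count, and the eval that foreach runs for that field is effectively:

| eval "T_Perc%"=round(('T_Count'/Total)*100,1)

If you want two decimal places as in your example output, change the second argument of round() from 1 to 2.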

Following is a run-anywhere search based on the sample data provided. The commands from | makeresults through | table _time * Total generate dummy data matching the first table in the question:

| makeresults
| eval data="2018-05-30T00:00:00.000-0700 0 0 13 19 16 27 0 287 362;2018-05-30T00:15:00.000-0700 0 0 8 9 9 3 0 228 257;2018-05-30T00:30:00.000-0700 0 0 6 4 17 1 0 217 245;2018-05-30T00:45:00.000-0700 0 0 8 7 28 0 0 186 229;2018-05-30T01:00:00.000-0700 0 0 3 6 21 0 0 171 201"
|  makemv data delim=";"
|  mvexpand data
|  makemv data delim=" "
|  eval _time=strptime(mvindex(data,0),"%Y-%m-%dT%H:%M:%S.%3N%z"), T_Count=mvindex(data,1), M_Count=mvindex(data,2), E_Count=mvindex(data,3), G_Count=mvindex(data,4), P_Count=mvindex(data,5), L_Count=mvindex(data,6), U_Count=mvindex(data,7), X_Count=mvindex(data,8), Total=mvindex(data,9)
|  fields - data
|  table _time * Total
|  foreach "*_Count"
    [eval "<<MATCHSTR>>_Perc%"=round(('<<FIELD>>'/Total)*100,1)]
|  table _time "*_Perc%" Total
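
If you only want the percentage lines on the chart, one option (a sketch reusing the same naming) is to leave Total out of the final table and then switch the visualization to Line Chart; the 15-minute buckets already come from span=15m in your timechart:

<YourCurrentSearchForCountAndTotal>
|  foreach "*_Count"
    [eval "<<MATCHSTR>>_Perc%"=round(('<<FIELD>>'/Total)*100,1)]
|  table _time "*_Perc%"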
____________________________________________
| makeresults | eval message= "Happy Splunking!!!"

mmdacutanan
Explorer

Thank you so much niketnilay! This is giving me what I need. I was playing around with the foreach command before, but I was not doing it right. Now I know 😃 Really appreciate your help!

niketn
Legend

I am glad you got it to work! foreach takes some time to get used to, especially if you are used to writing foreach loops differently in your day-to-day code 😉

____________________________________________
| makeresults | eval message= "Happy Splunking!!!"