Splunk Search

Making a table with Eval and Top results

GabrielSantos
Engager

Hello!

I'm trying to separate the latency results with eval by dividing them into 3 categories and then showing the percentage of each using the top command.

This was working at the beginning of the project, but now I need to separate the results by hour instead of by the whole day, and including the table command with the fields from eval is not working.

Here's my search

| eval tempo= case(
'netPerf.netOriginLatency'<2000, "Under 2s",
'netPerf.netOriginLatency'>=2000 AND 'netPerf.netOriginLatency'<3000, "Between 2s and 3s",
'netPerf.netOriginLatency'>=3000, "Above 3s"
)
| top 0 tempo

Latency             count   percent
Under 2s            74209   86.5 %
Between 2s and 3s   10736   12.5 %
Above 3s              803    0.9 %

The ideal scenario would be something like this:

_time             Under 2s   Between 2s and 3s   Above 3s
06/07/2024 00:00  97.3 %     2.3 %               2.3 %
06/07/2024 01:00  96.3 %     2.7 %               1.0 %

Appreciate the time and help!


PickleRick
SplunkTrust

Instead of your top command, do

| timechart span=1h count by tempo

to get counts for each tempo.

Now the thing is to calculate the percentage. In order to do so, you need to find how many events you have in total for each hour:

| eventstats sum(count) as total by _time

(remember that _time is binned to the full hour due to timechart).

So now you can just calculate your percentage:

| eval percentage=count/total

GabrielSantos
Engager

Thanks for the reply!

I tried using the second eval to automatically calculate the percentage, but it doesn't seem to be working; it only shows the count for each category from the first eval command.

| eval tempo= case(
'netPerf.netOriginLatency'<2000, "Under 2s",
'netPerf.netOriginLatency'>=2000 AND 'netPerf.netOriginLatency'<3000, "Between 2s and 3s",
'netPerf.netOriginLatency'>=3000, "Above 3s")
| timechart span=1h count by tempo usenull=false
| eventstats sum(count) as total by _time
| eval percentage=count/total

_time             Above 3s   Between 2s and 3s   Under 2s
08/07/2024 00:00       109                 588      19307
08/07/2024 01:00       113                 530      14900
08/07/2024 02:00         6                 128       5450
08/07/2024 03:00        22                 122       2847

But this already helps; I can export the results to CSV and calculate the percentage in Excel.

Appreciate the quick reply!


yuanliu
SplunkTrust

eventstats is not the answer; addtotals is.

| eval tempo= case(
'netPerf.netOriginLatency'<2000, "Under 2s",
'netPerf.netOriginLatency'>=2000 AND 'netPerf.netOriginLatency'<3000, "Between 2s and 3s",
'netPerf.netOriginLatency'>=3000, "Above 3s")
| timechart span=1h count by tempo usenull=false
| addtotals fieldname=_total
| foreach *
    [eval "<<FIELD>>" = if(isnull(_total), null(), round('<<FIELD>>'/_total * 100, 1) . " %")]
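
Note that naming the total field with a leading underscore (_total) is deliberate: the foreach * wildcard does not match fields that begin with an underscore, so the total itself is not converted to a percentage and stays hidden in the final table. The double quotes around <<FIELD>> are needed because the category names contain spaces.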

 


PickleRick
SplunkTrust

Argh, right. I wrote this offline and forgot that timechart will create the separate data points in the same result row. Yes, in this case it's addtotals (or untable and then eventstats).

Just for reference, the untable/eventstats alternative would look somewhat like this:

| timechart span=1h count by tempo usenull=false
| untable _time tempo count
| eventstats sum(count) as total by _time
| eval percentage=count/total

But this will give you a separate data point for every time/tempo pair, so you'd need to xyseries that back into a table layout, as sketched below.
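
For reference, that final pivot could look something like this (a sketch, formatting the fraction as a percent string first):

| eval percentage=round(percentage*100, 1) . " %"
| xyseries _time tempo percentage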

So @yuanliu's solution might indeed be more convenient here.


burwell
SplunkTrust

Sounds like you might want to use two bin commands.

First bin by time:

| bin _time span=1h

 

Then bin netPerf.netOriginLatency into 5 (?) bins, e.g.

| bin netPerf.netOriginLatency bins=5



See the bin command documentation: https://docs.splunk.com/Documentation/Splunk/9.2.2/SearchReference/Bin

Finally, you could do a timechart (or stats) with your bins; you will still have to do the percentage calculation yourself. A sketch follows below.
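
A minimal sketch of that approach, using stats instead of timechart so the hourly totals can be computed with eventstats (and assuming the base search populates netPerf.netOriginLatency):

| bin _time span=1h
| bin netPerf.netOriginLatency bins=5
| stats count by _time netPerf.netOriginLatency
| eventstats sum(count) as total by _time
| eval percentage=round(count/total*100, 1) . " %"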
