
timechart sum

sphiwee
Contributor

 

index="acoe_np_spa_metrics"
| search Project="*" AND Volume="*" 
| timechart span=1mon count(eval(D_Status="F")) as success_count
  count(eval(D_Status="S")) as failure_count count as Total
| eval STP=(success_count/Total)*100 
| fields - Total

 

 

Good day. I have the above SPL query; it gives me the count of "F"s and "S"s, but I need the sum of Volume where D_Status = "F" and the sum of Volume where D_Status = "S".

1 Solution

kamlesh_vaghela
SplunkTrust

@sphiwee 

Can you please try this?

index="acoe_np_spa_metrics" 
| search Project="*" AND Volume="*" 
| timechart span=1mon sum(eval(if(D_Status="F",Volume,0))) as success_count
    sum(eval(if(D_Status="S",Volume,0))) as failure_count count as Total 
| eval STP=(success_count/Total)*100 
| fields - Total
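
A note on the pattern: sum(eval(if(D_Status="F",Volume,0))) adds the event's Volume when the status matches and 0 otherwise, which is what turns the earlier count into a volume sum. The query above keeps count as Total, so STP still divides a volume sum by an event count. If you would rather express STP as a share of total volume, a sketch of that variant (assuming Volume is numeric on every event) would be:

index="acoe_np_spa_metrics" 
| search Project="*" AND Volume="*" 
| timechart span=1mon sum(eval(if(D_Status="F",Volume,0))) as success_count
    sum(eval(if(D_Status="S",Volume,0))) as failure_count sum(Volume) as Total 
| eval STP=(success_count/Total)*100 
| fields - Total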

 

Thanks
KV
▄︻̷̿┻̿═━一

If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.

