Splunk Search

## How to calculate an average for data that changes over a time interval?

Communicator

I have the below kind of data.

| App Name | Status |
|----------|--------|
| App1     | 0      |
| App2     | 0      |
| App3     | 0      |
| App4     | 0      |
| App5     | 0      |
| App6     | 1      |
| App7     | 0      |
| App8     | 0      |
| App9     | 0      |
| App10    | 0      |

0 - Success
1 - Failure

Assign:
0 as 100%
1 as 0%

Here the status value gets updated every 5 minutes. So my requirement is to calculate the average from the start of the day until the present time, by app. In this table, the success percentage would be 90% (9 of 10 apps report 0). Similarly, the query should work if I want to calculate the average after the n'th update.

Next, I want to keep the current status in one variable, so my final output should contain two values: the current status and the average.
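For live data, one hedged sketch of this requirement (the index, sourcetype, and the field names `App`/`Status` are assumptions, not taken from an actual deployment) would bound the search to the current day and compute both values per app:

``````index=your_index sourcetype=your_sourcetype earliest=@d latest=now
``` earliest=@d restarts the window at 12:00 AM each day ```
| stats latest(Status) as Current avg(Status) as FailRate by App
| eval "Current Status"=100-(Current*100), "Average Status"=100-(FailRate*100)
| table App, "Current Status", "Average Status"``````

Since Status is 0 for success and 1 for failure, `avg(Status)` is the fraction of failed updates, so `100-(FailRate*100)` is the success percentage.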

Labels (1)
• stats

1 Solution
SplunkTrust

Try this

``````| makeresults | eval _raw="App Name Status
App1                0
App2               0
App3               0
App4               0
App5               0
App6               1
App7               0
App8               0
App9               0
App10            0" | multikv forceheader=1
```Above is just for test data```
| stats sum(Status) as status,count, latest(Status) as Current by App
| eval Average= 100-((status*100)/count), Current=100-(Current*100)
| table App, Current, Average``````
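As a quick check of the `eval` arithmetic above, with hypothetical counts (say 2 failures out of 12 five-minute updates), `100-((2*100)/12)` comes out to roughly 83.3:

``````| makeresults
| eval status=2, count=12 ``` hypothetical: 2 failures in 12 updates ```
| eval Average=round(100-((status*100)/count),1)
| table status, count, Average``````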
---
If this reply helps you, Karma would be appreciated.
SplunkTrust

Here's a run-anywhere search for the first set of requirements.

``````| makeresults | eval _raw="App Name Status
App1                0
App2               0
App3               0
App4               0
App5               0
App6               1
App7               0
App8               0
App9               0
App10            0" | multikv forceheader=1
``` Above just defines test data```
| stats sum(Status) as status,count
| eval SuccessRate= 100-((status*100)/count)
| table SuccessRate``````

I don't understand the remaining requirements.  What is the average of?  Over what time?

---
If this reply helps you, Karma would be appreciated.
Communicator

@richgalloway Thanks for your reply.

As I mentioned, the status of each app gets updated every 5 minutes; it's live data.

For example, take App1, starting from the beginning of the day, i.e. from 12:00 AM.

At 12:00 AM the first status is recorded for all the apps, then at 00:05, 00:10, 00:15, and so on.

Finally, my requirement is to calculate the average and the current status of each app separately, by assigning 100 to success (i.e. 0) and 0 to failure (i.e. 1). I need this day-wise, so the next day it should start from 12 AM again.

If I run my query at any time in the day, it should give me the current status and the average status for each app, like below.

The final output table should be,

| App  | Current Status | Average Status |
|------|----------------|----------------|
| App1 | 100            | 90%            |
| App2 | 100            | 100%           |
| App3 | 0              | 80%            |
| App4 | 100            | 100%           |
| App5 | 100            | 100%           |

Hope this clarifies things.
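For the "after the n'th update" part of the question, one possible sketch (the threshold `5` and `your_index` are placeholders, not from the thread) numbers each app's updates with `streamstats` and filters before averaging:

``````index=your_index earliest=@d latest=now
| sort 0 _time
| streamstats count as update_no by App
| where update_no > 5 ``` keep only updates after the 5th of the day ```
| stats latest(Status) as Current avg(Status) as FailRate by App
| eval "Current Status"=100-(Current*100), "Average Status"=100-(FailRate*100)
| table App, "Current Status", "Average Status"``````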
