Splunk Search

Can you help me format a table that would generate the highest CPU users per hour over a day?

mikclrk
Explorer

G'Day

I've got some data I'm pulling out of some events with a search:

HOUR - Two digit hour of the day
PROCESS - Name of a running process
CPU_USAGE - The CPU time the process used during the hour

What I want is a table with the hour in the first column, then the 10 processes with the highest CPU usage within that hour. Not the most frequent process (which is what top seems to give me), but the ones with the highest CPU usage. So the finished table is 240 rows: 10 per hour.

I can get the top 10 in the first hour. I can get the 10 highest users, but I can't seem to get the highest 10 users within each hour.

Something like:

00 ProcessA 75%
00 ProcessB 60%
...
00 ProcessG 10%

01 ProcessC 90%
01 ProcessA 45%
01 ProcessG 40%
...
01 ProcessF 3%

02 ProcessB 80%
...

Any hints would be appreciated.

The second part is creating a chart to show the same...

1 Solution

sideview
SplunkTrust

There are a couple of ways.

One is to leverage the chart command's limit argument. It will only let a given value of process into the split-by for a given hour if that process has one of the highest CPU times.

Then we use untable to turn the chart-style result set back into a stats-style set, and a simple stats rolls up the 10 values.

<your search terms>
| chart sum(cputime) over hour by process limit=10 useother=f
| untable hour process cputime
| stats values(process) by hour
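Mapped onto the field names from the question (HOUR, PROCESS, CPU_USAGE — a sketch, so adjust if your actual field names differ), that same pipeline would be:

```
<your search terms>
| chart sum(CPU_USAGE) over HOUR by PROCESS limit=10 useother=f
| untable HOUR PROCESS CPU_USAGE
| stats values(PROCESS) by HOUR
```

The useother=f suppresses the catch-all OTHER column, so only real process names survive the untable.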

Another, arguably more complex, way is to use streamstats. Here we use stats to get the total CPU usage for every hour + process combination, then sort the rows by hour and, within each hour, by cputime descending.
Then we use streamstats to "paint" an integer onto each row. The top ten processes within each hour end up painted with the integers 1-10, so we can easily filter out the rest. Then one more stats rolls it up.

<your search terms>
| stats sum(cputime) as cputime by hour process
| sort 0 hour - cputime
| streamstats count by hour 
| where count<=10
| stats values(process) by hour
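Since the goal in the question is a 240-row table that keeps the usage figure next to each process, a variant of the streamstats search can end in table instead of stats. A sketch, assuming the question's field names (HOUR, PROCESS, CPU_USAGE):

```
<your search terms>
| stats sum(CPU_USAGE) as CPU_USAGE by HOUR PROCESS
| sort 0 HOUR - CPU_USAGE
| streamstats count as rank by HOUR
| where rank<=10
| table HOUR PROCESS CPU_USAGE
```

Each row is counted within its hour, so the where clause trims to the top 10 per hour while the sort keeps the hours in order and the usage figures descending within each.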

I hope that helps.


mikclrk
Explorer

The chart option kept coming back with no data found, but the streamstats approach worked fine.

Any thoughts on how to get trellis to draw me a bar chart (showing process and cputime) for each hour?


sideview
SplunkTrust

I did see a typo in my syntax in the one that didn't work - I had "valuies()" instead of "values()".

Hehe. I figured your actual desired end goal wasn't that values() output anyway. OK, I'll try to circle back when I get a chance.
