I am trying to compare avg_rt for uWSGI workers for the last 15 mins and the last 7 days and then get a percentage out of it. If the difference is more than 50% then I want to trigger an alert.
Here is my search
host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-7d latest=now
| stats avg(avg_rt) AS seven_days
| append [ search host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-15m latest=now | stats avg(avg_rt) AS fifteen_mins ]
| eval Result = (( fifteen_mins / seven_days ) * 100 )
| where Result > 50
I am unable to get a Result no matter what threshold I choose. It fails to evaluate this part:
| eval Result = (( fifteen_mins / seven_days ) * 100 ) | where Result > 50
I am getting values for fifteen_mins and seven_days
seven_days          fifteen_mins
320588.43640873017
                    360114.4
This works for me:
host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-7d latest=now
| stats avg(avg_rt) AS seven_days
| appendcols [ search host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-15m latest=now | stats avg(avg_rt) AS fifteen_mins ]
| eval result = round((fifteen_mins/seven_days)*100, 2)
| where result < 50 OR result > 150
Well, this is understandable. You're using append, so Splunk adds one set of results (in this case a single row) to another set (again, a single row). Since each row has a differently named column, you get your "chessboard" 😉
You can't eval across two different rows (unless you do something with your results first, of course).
There are several ways to tackle this. One is using plain stats with conditional evaluation
host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-7d latest=now
| stats avg(eval(if(now()-_time<900,avg_rt,null()))) as fifteen_minutes avg(avg_rt) as seven_days
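Continuing that search to reproduce the original alert logic (a sketch, reusing the field names from the stats above):

host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-7d latest=now
| stats avg(eval(if(now()-_time<900,avg_rt,null()))) as fifteen_minutes avg(avg_rt) as seven_days
| eval Result = (fifteen_minutes/seven_days)*100
| where Result > 50

Because both averages come out of a single stats call, they land in one row and the eval works directly.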
Another way would be to do something similar to your idea (although I don't like using separate searches and appending them - waste of a good search ;))
Just do it like you did before, but name both results the same. Add a label field to each of them, and use transpose to switch from two rows to two fields in one row.
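A sketch of that append-and-transpose variant (the label values and field names here are illustrative, not tested against your data):

host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-7d latest=now
| stats avg(avg_rt) as avg
| eval label="seven_days"
| append [ search host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-15m latest=now | stats avg(avg_rt) as avg | eval label="fifteen_mins" ]
| fields label avg
| transpose 0 header_field=label
| eval Result = (fifteen_mins/seven_days)*100
| where Result > 50

The transpose with header_field=label turns the two labelled rows into two columns named after the label values, so the final eval sees both averages in one row.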
Another - a bit more eccentric - idea could be to first bin your data into 15-minute-long buckets - you're lucky that 7 days divides evenly into 15-minute periods. Do a separate sum and count for each bucket, count the buckets, sum up all bucket counts and sums into two new fields, and calculate your averages from just the last bucket. Hey, I quite like that idea 🙂
host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-7d latest=now
| bin _time span=15m
| stats sum(avg_rt) as fifteensum count(avg_rt) as fifteencount by _time
| sort - _time
| streamstats count as bucketcount
| eventstats sum(fifteensum) as weeksum sum(fifteencount) as weekcount
| where bucketcount=1
| eval fifteenavg=fifteensum/fifteencount
| eval weekavg=weeksum/weekcount
There are often many different approaches you can take with Splunk 🙂
Hi @sohaib112,
try something like this:
host="prod-web-02" source="/var/log/uwsgi/app/uwsgi-metrics.log" earliest=-7d latest=now
| eval kind=if(_time>now()-900,"fifteen_mins","seven_days")
| stats
avg(eval(if(kind="fifteen_mins",avg_rt,null()))) AS fifteen_mins
avg(eval(if(kind="seven_days",avg_rt,null()))) AS seven_days
| eval Result = ((fifteen_mins/seven_days)*100)
| where Result>50
Ciao.
Giuseppe
@gcusello It is not able to calculate the value for fifteen minutes. Hence the result value is empty.