I have 10 servers for my X applications. Sometimes 1 or 2 servers will start taking only about 10% (or less than 25%) of the traffic while the other 8 servers are taking normal traffic. How can I set up an alert for such a scenario?
application=X host=websphere* source=X_performance.log "New Session logged in" | timechart span=1m count by host
With this I can identify the 1 or 2 servers that take low traffic, but I'm not too sure how to set this up as an alert.
Thanks,
Try writing something like this and setting an alert on it?
<your base search> | stats max(traffic_field) as max_traffic min(traffic_field) as min_traffic | eval alert_me=(max_traffic - min_traffic) | fields alert_me
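If you go this route, one rough sketch (untested, and the 50 is just a placeholder threshold to tune for your traffic volumes) is to fold the threshold into the search itself, so the alert can simply trigger on "Number of Results is greater than 0":

<your base search> | stats max(traffic_field) as max_traffic min(traffic_field) as min_traffic | eval alert_me=(max_traffic - min_traffic) | where alert_me > 50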
Hope your issue is resolved. Kindly accept and/or upvote the answer that helped you and close this request.
If you already have a search that returns the servers with low traffic, just save it as an alert and set it to trigger an alert action (notification, email, script, etc.) if the search returns any results. If no server meets the low-traffic condition, the search should return no results and thus no alert is triggered.
More info from the docs regarding trigger conditions for alerts:
https://docs.splunk.com/Documentation/Splunk/7.3.1/Alert/AlertTriggerConditions
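For example, building on the timechart search from the question, something like this (just a rough sketch; the 15-minute window and the 25% cutoff are assumptions you would tune) returns a row only for hosts that are well below the busiest host, so a trigger condition of "Number of Results is greater than 0" would fire only when a server is lagging:

application=X host=websphere* source=X_performance.log "New Session logged in" earliest=-15m | stats count by host | eventstats max(count) as max_count | where count < max_count * 0.25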
My search query is only good if I am inspecting the graph; I don't think it's good enough for an alert. My thought process is: I want to compare the server that takes the most traffic against the server(s) that take less traffic, and then alert on the basis of that. For example:
s1 = 100
s2 = 109
s3 = 100
s4 = 110
s5 = 23
In this case I want an alert, because s5 is taking much less traffic compared to s4 (the max traffic).
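One way to express exactly that comparison in the search (a sketch only; the pct_of_max field name and the 25 cutoff are just illustrative) is to compute each host's traffic as a percentage of the busiest host and keep only the laggards:

application=X host=websphere* source=X_performance.log "New Session logged in" | stats count as traffic by host | eventstats max(traffic) as max_traffic | eval pct_of_max=round(traffic/max_traffic*100,1) | where pct_of_max < 25

With the numbers above, s4 is the max at 110 and s5 is at 23, so pct_of_max for s5 would be about 20.9 and it would be the only host returned.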