Splunk Search

"stats count by" generating inconsistent results

noman377
Explorer

I have a very simple search:

index=logs_glbl sourcetype=kube:container:app-name namespace=prod status=500 | stats count

Result: 1

The result comes from sample logs like the one below:

::ffff:10.244.3.38 - - [06/Aug/2020:20:14:03 +0000] "GET /api/v1/workspace/getEngagement2?id=123 HTTP/1.1" 500 39 "https://atlas.intenal.noman.com" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.105 Safari/537.36"

I have defined an inline field extraction for the status field using this regex:  ^[^"\n]*"(?P<method>\w+)[^"\n]*"\s+(?P<status>\d+)
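To sanity-check the regex itself, you can replay the sample event with makeresults and rex (a quick sketch: the user-agent string is shortened here, and the double quotes must be escaped inside the SPL strings):

| makeresults
| eval _raw="::ffff:10.244.3.38 - - [06/Aug/2020:20:14:03 +0000] \"GET /api/v1/workspace/getEngagement2?id=123 HTTP/1.1\" 500 39 \"https://atlas.intenal.noman.com\" \"Mozilla/5.0\""
| rex field=_raw "^[^\"\n]*\"(?P<method>\w+)[^\"\n]*\"\s+(?P<status>\d+)"
| table method status

If the regex is working, this should return method=GET and status=500 for the sample event.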

Now when I perform a new search

index=logs_glbl sourcetype=kube:container:app-name namespace=prod | stats count by status

the results exclude status 500, and they are probably missing other HTTP statuses too:

status   count
200      515
302      152
304      8
401      71
409      7


oscar84x
Contributor

A very simple suggestion, but are you certain that the events within the time window you're searching contain status=500?

Also, out of curiosity, I'm not familiar with what the "P" in your named capture (?P<status>\d+) represents.


noman377
Explorer

@oscar84x :: Yes. Within the same time frame (e.g., Last 24 hours, Last 7 days), I'm seeing inconsistent search results. However, the alerts I receive based on "status" are accurate. To extract the HTTP status (200, 500, etc.), I used the regular expression above to create the "status" field extraction.
Since "| stats count by status" does not return the 500 statuses, my dashboard is of little use.
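One way to narrow this down is to apply the same regex inline with rex, under a different (hypothetical) field name, and compare the counts with the saved extraction:

index=logs_glbl sourcetype=kube:container:app-name namespace=prod
| rex "^[^\"\n]*\"(?P<method_inline>\w+)[^\"\n]*\"\s+(?P<status_inline>\d+)"
| stats count by status_inline

If status_inline does show the 500s while status does not, the regex itself is fine, and the problem is more likely in how the inline extraction is configured (e.g., its sourcetype binding or its permissions/sharing scope) than in the data.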
