Splunk Search

Aggregated HTTP status codes per URL per time bin/bucket

brokenboard525
Engager

Hi,

I have the following fields in the logs on my proxy for backend services:

_time -> timestamp
status_code -> http status code
backend_service_url -> app it is proxying

What I want to do is aggregate status codes per URL in one-minute bins, with one column per status code.
So sample output would look like:

time   backend-service   status code 200   status code 201   status code 202
10:00  app1.com          10                                  2
10:01  app1.com                            10
10:01  app2.com          10


Columns would be dynamic based on the available status codes in the timeframe I am searching.

I found a lot of questions about aggregating all 200s into 2xx, or about total counts by URL, but not this. I'd appreciate any suggestions on how to do it.

Thanks!


ITWhisperer
SplunkTrust
| bin _time span=1m
| stats count by _time backend_service_url status_code
| eval {status_code}=count
| fields - status_code count
| stats values(*) as * by _time backend_service_url
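
The key step is eval {status_code}=count: curly braces in eval make the field name dynamic, so a row with status_code=200 and count=10 produces a field literally named "200" with value 10. The final stats values(*) then collapses the per-status rows into one row per _time/URL pair, which gives the dynamic columns you asked for. For comparison, here is a sketch of an alternative that combines the two grouping fields into one series (the field name "key" is illustrative; limit=0 lifts timechart's default cap on the number of series). It produces one column per URL/status pair rather than one row per URL:

| eval key=backend_service_url . ":" . status_code
| timechart span=1m limit=0 count by key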




brokenboard525
Engager

Right on, that works!

What is the best visualization for plotting data like this from multiple sources? It should show how the response codes from each back-end service change over time.
