Splunk Search

Avoiding renaming multiple columns when using time-based streamstats

hollybross1219
Path Finder

Hi there,

I'm trying to create time series data using the streamstats function. I've got it figured out, but is there any way to avoid the rename command and have the new columns produced by streamstats replace the old values?

I read the streamstats documentation, but I didn't see any option that applies to my use case (or I may not have understood it).

splunk_server=indexer* index=wsi sourcetype=fdpwsiperf partnerId!=*test* partnerId=* error_msg_service=* tax_year=2019 ofx_appid=* capability=*
| eval error_msg_service = case(error_msg_service="OK", "Success",
    match(error_msg_service, "Provider/Host Request Error"), "HTTP Request Error",
    match(error_msg_service, "Provider/Host Response Error"), "HTTP Response Error",
    match(error_msg_service, "Provider/Host Not Available"), "Server Exception",
    1==1, "Import Failure")
| timechart span=10m dc(intuit_tid) by error_msg_service
| streamstats sum
| rename "sum(Success)" as "Success", "sum(HTTP Request Error)" as "HTTP Request Error", "sum(HTTP Response Error)" as "HTTP Response Error", "sum(Import Failure)" as "Import Failure", "sum(Server Exception)" as "Server Exception"

Thanks!

1 Solution

richgalloway
SplunkTrust

You can avoid the separate rename command by using the as option in streamstats.

| streamstats sum(*) as *
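
For example, applied to the search in your question, the last two commands collapse into one (a sketch of the same pipeline, not run against your data):

| timechart span=10m dc(intuit_tid) by error_msg_service
| streamstats sum(*) as *

Here sum(*) computes a running sum over every column that timechart produced, and "as *" writes each result back under the original column name, so no separate rename is needed.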
---
If this reply helps you, Karma would be appreciated.

hollybross1219
Path Finder

Also, just to note: I followed an example in the documentation that applies streamstats first and then organizes the results as a table with the _time field. However, it doesn't look like streamstats can accommodate nested stats functions (I'm applying a sum to the distinct count of a field).

splunk_server=indexer* index=wsi sourcetype=fdpwsiperf partnerId!=*test* partnerId=* error_msg_service=* tax_year=2019 ofx_appid=* capability=*
| eval error_msg_service = case(error_msg_service="OK", "Success",
    match(error_msg_service, "Provider/Host Request Error"), "HTTP Request Error",
    match(error_msg_service, "Provider/Host Response Error"), "HTTP Response Error",
    match(error_msg_service, "Provider/Host Not Available"), "Server Exception",
    1==1, "Import Failure")
| streamstats sum(dc(intuit_tid)) as import_requests by error_msg_service
| table _time, import_requests, error_msg_service
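
A possible workaround, sketched under the assumption that the 10-minute buckets from the earlier timechart are still what you want: aggregate the distinct count per bucket first, then let streamstats sum the already-aggregated field.

| bin _time span=10m
| stats dc(intuit_tid) as import_requests by _time, error_msg_service
| streamstats sum(import_requests) as import_requests by error_msg_service
| table _time, import_requests, error_msg_service

This keeps streamstats working on a plain field (import_requests) rather than a nested stats function.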