How to Split Group by Values?

user33
Path Finder

Hello, I am very new to Splunk. I am wondering how to split these two values into separate rows. The "API_Name" values are grouped but I need them separated by date. Any assistance is appreciated!

SPL:
index=...
| fields source, timestamp, a_timestamp, transaction_id, a_session_id, a_api_name, api_name, API_ID
| convert timeformat="%Y-%m-%d" ctime(_time) AS date
| eval sessionID=coalesce(a_session_id, transaction_id)
| stats values(date) as date dc(source) as cnt values(timestamp) as start_time values(a_timestamp) as end_time values(api_name) as API_Name by sessionID
| where cnt>1
| eval start=strptime(start_time, "%F %T.%Q")
| eval end=strptime(end_time, "%FT%T.%Q")
| eval duration(ms)=abs((end-start)*1000)
| stats count, perc95(duration(ms)) as 95thPercentileRespTime(ms) values(API_Name) as API_Name by date

 

 

[Screenshot: Splunk1.PNG]


yuanliu
SplunkTrust

Your screenshot shows two values under "date", meaning that your last stats command is already separating results by date. Can you explain why this is not meeting your requirement? In other words, what exact output are you expecting?
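For illustration only (x is a placeholder; the API names are taken from the expected output given below): with values(API_Name) ... by date as the final stats, each date is one row and every API name seen on that date is collapsed into a single multivalue cell, roughly:

date        count  95thPercentileRespTime(ms)  API_Name
2022-11-05  x      x                           accountstatements-v1
2022-11-06  x      x                           accountstatements-v1
                                               Realtime_Image_Access_Service_V2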


user33
Path Finder

Hello, I am looking to separate by date and API_Name, something like the table below. I need a count and 95thPercentileRespTime(ms) for accountstatements-v1, and a count and 95thPercentileRespTime(ms) for Realtime_Image_Access_Service_V2, organized by date.

date        count  95thPercentileRespTime(ms)  API_Name
2022-11-05  x      x                           accountstatements-v1
2022-11-06  x      x                           Realtime_Image_Access_Service_V2
2022-11-06  x      x                           accountstatements-v1

yuanliu
SplunkTrust
(Accepted solution)

To group by date and api_name, you put both in the groupby clause, right?

index=...
| fields source, timestamp, a_timestamp, transaction_id, a_session_id, a_api_name, api_name, API_ID
| convert timeformat="%Y-%m-%d" ctime(_time) AS date
| eval sessionID=coalesce(a_session_id, transaction_id)
| stats values(date) as date dc(source) as cnt values(timestamp) as start_time values(a_timestamp) as end_time values(api_name) as API_Name by sessionID
| where cnt>1
| eval start=strptime(start_time, "%F %T.%Q")
| eval end=strptime(end_time, "%FT%T.%Q")
| eval duration(ms)=abs((end-start)*1000)
| stats count, perc95(duration(ms)) as 95thPercentileRespTime(ms) by date API_Name

 Does this give you what you expect?

user33
Path Finder

I get an error even if I use a comma between the fields: date, API_Name.

0 Karma

user33
Path Finder

It does not, unfortunately. I get errors:

[Screenshot: user33_0-1667769954783.png]

 


yuanliu
SplunkTrust

You didn't try the command in my post; mine doesn't contain values(API_Name). (Whether you use a comma or not is immaterial.)
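As a minimal side-by-side (both stats lines are copied from the queries above), the only difference is the final stats command:

| stats count, perc95(duration(ms)) as 95thPercentileRespTime(ms) values(API_Name) as API_Name by date

keeps one row per date and collapses all API names for that date into a multivalue cell, while

| stats count, perc95(duration(ms)) as 95thPercentileRespTime(ms) by date API_Name
| stats count, perc95(duration(ms)) as 95thPercentileRespTime(ms) by date, API_Name

(equivalent with or without the comma) produce one row per date and API_Name combination.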


user33
Path Finder

Apologies! You are correct. I had a typo. This works perfectly! Thank you very much yuanliu!!
