Splunk Search

need to add the respective time for the maximum response time

Devi13
Path Finder

index=abc host IN ()
| stats max(response_time) as "Maximum Response Time" by URL
| sort - "Maximum Response Time"

I need to add the respective time for the maximum response time along with the stats.
Could you please help?


gcusello
SplunkTrust

Hi @Devi13,

What do you mean by "respective time"?

If you mean time information about the search (e.g. min_time, max_time, search execution time, etc.), you could add "| addinfo" at the end of your search and choose the info you want.

For more info, see https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Addinfo
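
For example, a minimal sketch of that idea applied to your search (the fields listed in the table command are just an illustration of what addinfo adds):

index=abc host IN ()
| stats max(response_time) as "Maximum Response Time" by URL
| addinfo
| table URL "Maximum Response Time" info_min_time info_max_time info_search_time
| sort - "Maximum Response Time"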

Ciao.

Giuseppe


Devi13
Path Finder

Hello @gcusello ,

Thank you for your response.

I am looking for something like the example below:

URL       Maximum Response Time   Time at which the maximum response got hit
abc.com   22.346                  2024-04-24 00:00:25

So at 2024-04-24 00:00:25, the URL abc.com hit its maximum response time.

I need to append the time that corresponds to the maximum response time of each URL.


bowesmana
SplunkTrust

Building on @gcusello's approach, it can be done more efficiently by not using mvexpand and instead just filtering out the values that do not match the max.

index=abc host IN ()
| eval col=_time."|".response_time
| stats 
    max(response_time) AS max_response_time
    values(col) AS col
    BY URL
| eval times=mvmap(col, if(match(col, "\|".max_response_time."$"), mvindex(split(col, "|"), 0), null()))
| fields URL max_response_time  times
| eval times=strftime(times, "%F %T.%Q")
| rename max_response_time  as "Maximum Response Time"
| sort - "Maximum Response Time"

Note that if you have LOTS of values and lots of URLs you may get a spike in memory usage retaining all the values.

Note this also handles the situation where the maximum response time occurs at more than one time.

You can also do this with eventstats:

index=abc host IN ()
| fields _time response_time URL
| eventstats max(response_time) AS max_response_time by URL
| where response_time=max_response_time
| stats values(max_response_time) AS "Maximum Response Time" 
        values(_time) as times
    BY URL
| eval times=strftime(times, "%F %T.%Q")
| sort - "Maximum Response Time"

Check which will perform better with your data; eventstats can be slow if it is crunching lots of data.

You can see an example of how this works, using either of the techniques above, by replacing index=abc... with this, which will give you some simulated data:

| makeresults count=1000
| streamstats c
| eval _time=now() - c
| eval response_time=random() % 1000
| eval URL=mvindex(split("URL1,URL2,URL3,URL4",","), random() % 4)
| fields - c
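
For example, plugging that simulated data into the eventstats version above gives a self-contained search you can paste straight into the search bar:

| makeresults count=1000
| streamstats c
| eval _time=now() - c
| eval response_time=random() % 1000
| eval URL=mvindex(split("URL1,URL2,URL3,URL4",","), random() % 4)
| fields - c
| eventstats max(response_time) AS max_response_time by URL
| where response_time=max_response_time
| stats values(max_response_time) AS "Maximum Response Time"
        values(_time) as times
    BY URL
| eval times=strftime(times, "%F %T.%Q")
| sort - "Maximum Response Time"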

gcusello
SplunkTrust

Hi @Devi13,

please try this:

index=abc host IN ()
| eval col=_time."|".response_time
| stats 
    max(response_time) AS "Maximum Response Time" 
    values(col) AS col
    BY URL
| mvexpand col
| rex field=col "^(?<_time>[^\|]+)\|(?<response_time>.+)$"
| where 'Maximum Response Time'=tonumber(response_time)
| table URL "Maximum Response Time" _time
| sort - "Maximum Response Time"

It may also be possible to do this using an eval expression inside the stats command.
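
For example, a rough sketch of that idea (an assumption, not tested against real data): zero-pad response_time with printf() so that the lexicographic max() of an eval-built string lines up with the numeric maximum, then split the winning value back apart:

index=abc host IN ()
| stats max(eval(printf("%015.3f", response_time)."|"._time)) AS combo BY URL
| eval "Maximum Response Time"=tonumber(mvindex(split(combo, "|"), 0))
| eval _time=tonumber(mvindex(split(combo, "|"), 1))
| table URL "Maximum Response Time" _time
| sort - "Maximum Response Time"

Note that, unlike the searches above, this keeps only one timestamp if the maximum is hit more than once, and it assumes response_time always fits within the padded width.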

Ciao.

Giuseppe
