Splunk Search

How can I repeat a search that generates daily stats across multiple days?

Sivakesava574
Explorer

Hi, 

Using my logs, I am generating some stats to track the performance of my app on a daily basis, using the query below.

search ...
| rex "elapsedTime=(?<ElapsedTime>.*?),\s*MLTime"
| rex "X\-ml\-timestamp\: (?<TimeStamp>.*?)\s*\n*X-ml-maxrows"
| rex "X\-ml\-size\: (?<size>.*?)\s*\n*X-ml-page"
| rex "X\-ml\-page\: (?<page>.*?)\s*\n*X-ml-count"
| rex "X\-ml\-elapsed\-time\: (?<MLelapsed>.*?)\s*\n*X-ml-timestamp"
| stats max(size) AS Page_Size max(_time) AS End_Time min(_time) AS Start_Time max(page) as Pages count(page) AS Total_Pages max(ElapsedTime) AS Max_ElapsedTime min(ElapsedTime) AS Min_ElapsedTime avg(ElapsedTime) AS Avg_ElapsedTime max(MLelapsed) AS Max_MLElapsedTime min(MLelapsed) AS Min_MLElapsedTime avg(MLelapsed) AS Avg_MLElapsedTime
| eval CASS_Date=strftime(Start_Time, "%Y-%m-%d")
| eval CASS_Duration=(End_Time-Start_Time)/60
| eval End_Time=strftime(End_Time, "%Y/%m/%d %T.%3Q")
| eval Start_Time=strftime(Start_Time, "%Y/%m/%d %T.%3Q")
| table CASS_Date Start_Time End_Time CASS_Duration Page_Size Pages Total_Pages Max_ElapsedTime Min_ElapsedTime Avg_ElapsedTime Max_MLElapsedTime Min_MLElapsedTime Avg_MLElapsedTime

[Screenshot of the query output: Sivakesava574_0-1644305647379.png]

Can someone please help me produce the same stats for multiple days in a single query, instead of collecting them manually each day?


johnhuang
Motivator

This could work:

search ... earliest=-7d@d
| bucket _time span=1d
| rex "elapsedTime=(?<ElapsedTime>.*?),\s*MLTime"
| rex "X\-ml\-timestamp\: (?<TimeStamp>.*?)\s*\n*X-ml-maxrows"
| rex "X\-ml\-size\: (?<size>.*?)\s*\n*X-ml-page"
| rex "X\-ml\-page\: (?<page>.*?)\s*\n*X-ml-count"
| rex "X\-ml\-elapsed\-time\: (?<MLelapsed>.*?)\s*\n*X-ml-timestamp"
| stats max(size) AS Page_Size max(_time) AS End_Time min(_time) AS Start_Time max(page) as Pages count(page) AS Total_Pages max(ElapsedTime) AS Max_ElapsedTime min(ElapsedTime) AS Min_ElapsedTime avg(ElapsedTime) AS Avg_ElapsedTime max(MLelapsed) AS Max_MLElapsedTime min(MLelapsed) AS Min_MLElapsedTime avg(MLelapsed) AS Avg_MLElapsedTime BY _time
| eval CASS_Date=strftime(Start_Time, "%Y-%m-%d")
| eval CASS_Duration= (End_Time-Start_Time)/60
| eval End_Time=strftime(End_Time, "%Y/%m/%d %T.%3Q")
| eval Start_Time=strftime(Start_Time, "%Y/%m/%d %T.%3Q")
| table _time CASS_Date Start_Time End_Time CASS_Duration Page_Size Pages Total_Pages Max_ElapsedTime Min_ElapsedTime Avg_ElapsedTime Max_MLElapsedTime Min_MLElapsedTime Avg_MLElapsedTime
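Boiled down, the only structural changes from your original search are the daily bucketing and the BY clause on stats; the rex extractions and evals stay the same. A minimal sketch of just that skeleton (the earliest=-7d@d window and the stats functions shown are only examples, so adjust them to your own fields):

search ... earliest=-7d@d
| bucket _time span=1d
| stats count(page) AS Total_Pages avg(ElapsedTime) AS Avg_ElapsedTime BY _time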


Sivakesava574
Explorer

It worked, but I see that some of the fields are not returning the expected values:

Start_Time,End_Time,CASS_Duration

2022/02/02 00:00:00.000, 2022/02/02 00:00:00.000, 0

These values populate correctly when I run the query for a single day.


ITWhisperer
SplunkTrust

This is because _time itself is being bucketed into days, so min(_time) and max(_time) both return the start of each day. Bin the time into a separate field instead and leave _time intact - try it this way

search ...
| bin _time as time span=1d
| rex "elapsedTime=(?<ElapsedTime>.*?),\s*MLTime"
| rex "X\-ml\-timestamp\: (?<TimeStamp>.*?)\s*\n*X-ml-maxrows"
| rex "X\-ml\-size\: (?<size>.*?)\s*\n*X-ml-page"
| rex "X\-ml\-page\: (?<page>.*?)\s*\n*X-ml-count"
| rex "X\-ml\-elapsed\-time\: (?<MLelapsed>.*?)\s*\n*X-ml-timestamp"
| stats max(size) AS Page_Size max(_time) AS End_Time min(_time) AS Start_Time max(page) as Pages count(page) AS Total_Pages max(ElapsedTime) AS Max_ElapsedTime min(ElapsedTime) AS Min_ElapsedTime avg(ElapsedTime) AS Avg_ElapsedTime max(MLelapsed) AS Max_MLElapsedTime min(MLelapsed) AS Min_MLElapsedTime avg(MLelapsed) AS Avg_MLElapsedTime by time
| eval CASS_Date=strftime(Start_Time, "%Y-%m-%d")
| eval CASS_Duration= (End_Time-Start_Time)/60
| eval End_Time=strftime(End_Time, "%Y/%m/%d %T.%3Q")
| eval Start_Time=strftime(Start_Time, "%Y/%m/%d %T.%3Q")
| table CASS_Date Start_Time End_Time CASS_Duration Page_Size Pages Total_Pages Max_ElapsedTime Min_ElapsedTime Avg_ElapsedTime Max_MLElapsedTime Min_MLElapsedTime Avg_MLElapsedTime
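Side by side, the difference is only where the bucketed value lands (a minimal sketch keeping just the time-related fields):

| bucket _time span=1d
| stats min(_time) AS Start_Time max(_time) AS End_Time BY _time

versus

| bin _time as time span=1d
| stats min(_time) AS Start_Time max(_time) AS End_Time BY time

In the first form, _time is overwritten with the start of each day, so Start_Time and End_Time both come back as midnight and CASS_Duration is 0. In the second form, the bucketed value goes into the separate field time, so min(_time) and max(_time) still reflect the real first and last events within each day.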