<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>topic Re: How to group the data by api name, date and various response times? in Splunk Search</title>
    <link>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750312#M242285</link>
    <description>&lt;P&gt;I know you accepted an answer in&amp;nbsp;&lt;A href="https://community.splunk.com/t5/Splunk-Search/Is-there-a-way-to-group-the-data-by-time-and-another-field/m-p/750111#M242257" target="_blank" rel="noopener"&gt;Is there a way to group the data by time and another field?&lt;/A&gt;&amp;nbsp;Still, there is a simpler, perhaps more intuitive approach.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=my_app sourcetype=my_logs:hec (source=my_Logger) msgsource="*" msgtype="*MyClient*" host=*
    [ inputlookup My_Application_Mapping.csv  | search Client="SomeBank" | table appl ]
| eval total_milliseconds = 1000 * (strptime("00:" . elapsed, "%T.%N") - relative_time(now(), "-0d@d"))
| eval timebucket = case(total_milliseconds &amp;lt;= 1000, "TXN_1000", total_milliseconds &amp;lt;= 2000, "1sec-2sec",
  total_milliseconds &amp;lt;= 5000, "2sec-5sec", true(), "5sec+")
| rename msgsource as API
| bucket _time span=1d
| eventstats avg(total_milliseconds) as AvgDur by _time API
| stats count by AvgDur _time API timebucket
| tojson output_field=api_time _time API AvgDur
| chart values(count) over api_time by timebucket
| addtotals
| spath input=api_time
| rename time as _time
| fields - api_time&lt;/LI-CODE&gt;&lt;P&gt;Main ideas:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Organizing time buckets with the case function is easier to maintain.&lt;/LI&gt;&lt;LI&gt;Use the chart command to perform a transpose over one composite field (api_time).&lt;/LI&gt;&lt;LI&gt;Use tojson to pack all the information needed after the transpose.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;In addition, using strptime to calculate total_milliseconds is more maintainable. (It would be even simpler if Splunk didn't have a bug near epoch zero.)&lt;/P&gt;</description>
    <pubDate>Wed, 23 Jul 2025 07:13:11 GMT</pubDate>
    <dc:creator>yuanliu</dc:creator>
    <dc:date>2025-07-23T07:13:11Z</dc:date>
    <item>
      <title>How to group the data by api name, date and various response times?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750033#M242249</link>
      <description>&lt;P&gt;I have tried to write a query that outputs the transaction counts and response times, but I am not sure how to group them by API and date. Here is what I have written so far:&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=my_app sourcetype=my_logs:hec (source=my_Logger) msgsource="*" msgtype="*MyClient*" host=*  [ |inputlookup My_Application_Mapping.csv  | search Client="SomeBank" | table appl ]
| rex field=elapsed "^(?&amp;lt;minutes&amp;gt;\\d+):(?&amp;lt;seconds&amp;gt;\\d+)\\.(?&amp;lt;milliseconds&amp;gt;\\d+)"
| eval total_seconds = (tonumber(seconds) * 1000)
| eval total_milliseconds = (tonumber(minutes) * 60 * 1000) + (tonumber(seconds) * 1000) + (tonumber(milliseconds))
| timechart span=1m cont=f usenull=f useother=f   count(total_milliseconds) as AllTransactions,  
avg(total_milliseconds) as AvgDuration
count(eval(total_milliseconds&amp;lt;=1000)) as "TXN_1000",
count(eval(total_milliseconds&amp;gt;1000 AND total_milliseconds&amp;lt;=2000)) as "1sec-2sec"
count(eval(total_milliseconds&amp;gt;2000 AND total_milliseconds&amp;lt;=5000)) as "2sec-5sec"
count(eval(total_milliseconds&amp;gt;5000 )) as "5sec+",
| timechart span=1d sum(AllTransactions) as "Total" avg(AvgDuration) as AvgDur
sum(TXN_1000)
sum(1sec-2sec)
sum(2sec-5sec) 
sum(5sec+)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;`msgsource` has my API name. The output of the above query is:&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;_time | Total | AvgDur | sum(TXN_1000) | sum(1sec-2sec) | sum(2sec-5sec) | sum(5sec+)&lt;/STRONG&gt;&lt;BR /&gt;2025-07-10 | 10000 | 162.12312322 | 1000 | 122 | 1&lt;BR /&gt;&lt;BR /&gt;I want the final output to be:&lt;BR /&gt;&lt;STRONG&gt;_time | API | Total | AvgDur | sum(TXN_1000) | sum(1sec-2sec) | sum(2sec-5sec) | sum(5sec+)&lt;/STRONG&gt;&lt;BR /&gt;2025-07-10 | RetrievePay2 | 10000 | 162.12312322 | 1000 | 122 | 1&lt;BR /&gt;2025-07-10 | RetrievePay5 | 2000 | 62.12131244 | 333 | 56 | 2&lt;/P&gt;&lt;P&gt;2025-07-09 | RetrievePay2 | 10000 | 162.12312322 | 1000 | 122 | 1&lt;BR /&gt;2025-07-09 | RetrievePay5 | 2000 | 62.12131244 | 333 | 56 | 2&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any help is appreciated. Thanks!&lt;/P&gt;</description>
      <pubDate>Thu, 17 Jul 2025 21:05:06 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750033#M242249</guid>
      <dc:creator>kuul13</dc:creator>
      <dc:date>2025-07-17T21:05:06Z</dc:date>
    </item>
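The grouping the question asks for, one row per day and API with counts per latency band, can be sketched outside SPL. A minimal Python illustration, where the sample events and the helper names `timebucket` and `table` are made up and the thresholds are taken from the posted query:

```python
from collections import Counter, defaultdict

def timebucket(ms):
    # Latency bands assumed from the thresholds in the posted query;
    # the first matching branch wins, as in SPL's case()
    if ms <= 1000:
        return "TXN_1000"
    if ms <= 2000:
        return "1sec-2sec"
    if ms <= 5000:
        return "2sec-5sec"
    return "5sec+"

# (date, api, total_milliseconds) triples standing in for parsed events
events = [
    ("2025-07-10", "RetrievePay2", 800),
    ("2025-07-10", "RetrievePay2", 1500),
    ("2025-07-10", "RetrievePay5", 300),
    ("2025-07-10", "RetrievePay5", 6200),
]

# Group by (day, API), then count per latency band -- one output row per key
table = defaultdict(Counter)
for day, api, ms in events:
    table[(day, api)][timebucket(ms)] += 1
    table[(day, api)]["Total"] += 1
```

The key point is that day and API together form the group-by key, which is exactly what a single `timechart` cannot express alongside multiple aggregations.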
    <item>
      <title>Re: How to group the data by api name, date and various response times?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750312#M242285</link>
      <description>&lt;P&gt;I know you accepted an answer in&amp;nbsp;&lt;A href="https://community.splunk.com/t5/Splunk-Search/Is-there-a-way-to-group-the-data-by-time-and-another-field/m-p/750111#M242257" target="_blank" rel="noopener"&gt;Is there a way to group the data by time and another field?&lt;/A&gt;&amp;nbsp;Still, there is a simpler, perhaps more intuitive approach.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;index=my_app sourcetype=my_logs:hec (source=my_Logger) msgsource="*" msgtype="*MyClient*" host=*
    [ inputlookup My_Application_Mapping.csv  | search Client="SomeBank" | table appl ]
| eval total_milliseconds = 1000 * (strptime("00:" . elapsed, "%T.%N") - relative_time(now(), "-0d@d"))
| eval timebucket = case(total_milliseconds &amp;lt;= 1000, "TXN_1000", total_milliseconds &amp;lt;= 2000, "1sec-2sec",
  total_milliseconds &amp;lt;= 5000, "2sec-5sec", true(), "5sec+")
| rename msgsource as API
| bucket _time span=1d
| eventstats avg(total_milliseconds) as AvgDur by _time API
| stats count by AvgDur _time API timebucket
| tojson output_field=api_time _time API AvgDur
| chart values(count) over api_time by timebucket
| addtotals
| spath input=api_time
| rename time as _time
| fields - api_time&lt;/LI-CODE&gt;&lt;P&gt;Main ideas:&lt;/P&gt;&lt;UL&gt;&lt;LI&gt;Organizing time buckets with the case function is easier to maintain.&lt;/LI&gt;&lt;LI&gt;Use the chart command to perform a transpose over one composite field (api_time).&lt;/LI&gt;&lt;LI&gt;Use tojson to pack all the information needed after the transpose.&lt;/LI&gt;&lt;/UL&gt;&lt;P&gt;In addition, using strptime to calculate total_milliseconds is more maintainable. (It would be even simpler if Splunk didn't have a bug near epoch zero.)&lt;/P&gt;</description>
      <pubDate>Wed, 23 Jul 2025 07:13:11 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750312#M242285</guid>
      <dc:creator>yuanliu</dc:creator>
      <dc:date>2025-07-23T07:13:11Z</dc:date>
    </item>
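The tojson → chart → spath sequence in this answer amounts to packing the row-identifying fields into one composite key, pivoting on that key, and unpacking it afterwards. A rough Python analogue of the idea, with invented sample rows:

```python
import json
from collections import defaultdict

# Per-(day, API, bucket) counts, like the output of the stats step (values made up)
rows = [
    ("2025-07-31", "createAPI", 743.0, "TXN_1000", 2),
    ("2025-07-31", "createAPI", 743.0, "1sec-2sec", 1),
    ("2025-07-31", "retrieveAPI", 381.0, "TXN_1000", 4),
]

# Pack _time, API, and AvgDur into one composite JSON key (the tojson step),
# then pivot the buckets into columns over that single key (the chart step)
pivot = defaultdict(dict)
for day, api, avg, bucket, count in rows:
    key = json.dumps({"_time": day, "API": api, "AvgDur": avg})
    pivot[key][bucket] = count

# Unpack the composite key back into separate fields (the spath step)
result = [{**json.loads(key), **buckets} for key, buckets in pivot.items()]
```

Packing the extra fields into JSON keeps them intact through the pivot, which only allows one "over" field; that is the whole point of the tojson detour.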
    <item>
      <title>Re: How to group the data by api name, date and various response times?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750853#M242428</link>
      <description>&lt;P&gt;Thanks for the response.&lt;/P&gt;&lt;P&gt;I was trying to check the accuracy of both queries, the one you provided and the one from the accepted answer, as I see a difference in the counts. I edited the queries to use some mock data. I am not able to get the mock data working with the query from the accepted answer; I would appreciate it if you could help me fix that so that I can compare the results.&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults format=csv data="interactionid,_time,elapsed,msgsource
1,2025-07-31,00:00.756,retrieveAPI
2,2025-07-31,00:00.556,createAPI
3,2025-07-31,00:00.156,createAPI
4,2025-07-31,00:00.256,updateAPI
5,2025-07-31,00:00.356,retrieveAPI
6,2025-07-31,00:00.156,retrieveAPI
7,2025-07-31,00:01.056,createAPI
8,2025-07-31,00:00.256,retrieveAPI
9,2025-07-31,00:06.256,updateAPI
10,2025-07-31,00:10.256,createAPI"
| rex field=elapsed "^(?&amp;lt;minutes&amp;gt;\\d+):(?&amp;lt;seconds&amp;gt;\\d+)\\.(?&amp;lt;milliseconds&amp;gt;\\d+)"
| eval TimeMilliseconds = (tonumber(minutes) * 60 * 1000) + (tonumber(seconds) * 1000) + (tonumber(milliseconds))
| timechart span=1d count as AllTransactions,  
avg(TimeMilliseconds) as AvgDuration
count(eval(TimeMilliseconds&amp;lt;=1000)) as "TXN_1000",
count(eval(TimeMilliseconds&amp;gt;1000 AND TimeMilliseconds&amp;lt;=2000)) as "1sec-2sec"
count(eval(TimeMilliseconds&amp;gt;2000 AND TimeMilliseconds&amp;lt;=5000)) as "2sec-5sec", by msgsource
| untable _time msgsource count
| eval group=mvindex(split(msgsource,": "),0)
| eval msgsource=mvindex(split(msgsource,": "),1)
| eval _time=_time.":".msgsource
| xyseries _time group count
| eval msgsource=mvindex(split(_time,":"),1)
| eval _time=mvindex(split(_time,":"),0)
| table _time msgsource AllTransactions AvgDuration TXN_1000 "1sec-2sec" "2sec-5sec"&lt;/LI-CODE&gt;&lt;P&gt;This query creates the table, but the counts are all 0s.&lt;/P&gt;&lt;P&gt;And here is the edited query that you shared, which does show results:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults format=csv data="interactionid,_time,elapsed,msgsource
1,2025-07-31,00:00.756,retrieveAPI
2,2025-07-31,00:00.556,createAPI
3,2025-07-31,00:00.156,createAPI
4,2025-07-31,00:00.256,updateAPI
5,2025-07-31,00:00.356,retrieveAPI
6,2025-07-31,00:00.156,retrieveAPI
7,2025-07-31,00:01.056,createAPI
8,2025-07-31,00:00.256,retrieveAPI
9,2025-07-31,00:06.256,updateAPI
10,2025-07-31,00:10.256,createAPI"
| eval total_milliseconds = 1000 * (strptime("00:" . elapsed, "%T.%N") - relative_time(now(), "-0d@d"))
| eval timebucket = case(total_milliseconds &amp;lt;= 1000, "TXN_1000", total_milliseconds &amp;lt;= 2000, "1sec-2sec",
  total_milliseconds &amp;lt;= 5000, "2sec-5sec", true(), "5sec+")
| rename msgsource as API
| bucket _time span=1d
| eventstats avg(total_milliseconds) as AvgDur by _time API
| stats count by AvgDur _time API timebucket
| tojson output_field=api_time _time API AvgDur
| chart values(count) over api_time by timebucket
| addtotals
| spath input=api_time
| rename time as _time
| fields - api_time&lt;/LI-CODE&gt;&lt;P&gt;Your query shows the correct result, but the fields are not in the order I want them displayed. Any help fixing both queries would be appreciated.&lt;/P&gt;</description>
      <pubDate>Thu, 31 Jul 2025 21:22:53 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750853#M242428</guid>
      <dc:creator>kuul13</dc:creator>
      <dc:date>2025-07-31T21:22:53Z</dc:date>
    </item>
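The strptime trick in the compared query converts an MM:SS.mmm elapsed string to milliseconds by parsing it as a time of day and measuring from midnight. The same arithmetic can be checked in Python (the helper name is mine):

```python
from datetime import datetime

def elapsed_ms(elapsed):
    # Prepend "00:" so "MM:SS.mmm" parses as "HH:MM:SS.mmm", mirroring
    # strptime("00:" . elapsed, "%T.%N") in the SPL
    t = datetime.strptime("00:" + elapsed, "%H:%M:%S.%f")
    # Milliseconds since midnight equal the elapsed duration
    return (t.minute * 60 + t.second) * 1000 + t.microsecond // 1000
```

For example, the mock value `00:10.256` comes out as 10256 ms, landing in the "5sec+" bucket.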
    <item>
      <title>Re: How to group the data by api name, date and various response times?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750854#M242429</link>
      <description>&lt;P&gt;The first search does not have a valid value for _time; you need to parse the time value from the dummy data:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| makeresults format=csv data="interactionid,_time,elapsed,msgsource
1,2025-07-31,00:00.756,retrieveAPI
2,2025-07-31,00:00.556,createAPI
3,2025-07-31,00:00.156,createAPI
4,2025-07-31,00:00.256,updateAPI
5,2025-07-31,00:00.356,retrieveAPI
6,2025-07-31,00:00.156,retrieveAPI
7,2025-07-31,00:01.056,createAPI
8,2025-07-31,00:00.256,retrieveAPI
9,2025-07-31,00:06.256,updateAPI
10,2025-07-31,00:10.256,createAPI"
| eval _time=strptime(_time,"%F")
| rex field=elapsed "^(?&amp;lt;minutes&amp;gt;\d+):(?&amp;lt;seconds&amp;gt;\d+)\.(?&amp;lt;milliseconds&amp;gt;\d+)"
| eval TimeMilliseconds = (tonumber(minutes) * 60 * 1000) + (tonumber(seconds) * 1000) + (tonumber(milliseconds))
| timechart span=1d count as AllTransactions,  
avg(TimeMilliseconds) as AvgDuration
count(eval(TimeMilliseconds&amp;lt;=1000)) as "TXN_1000",
count(eval(TimeMilliseconds&amp;gt;1000 AND TimeMilliseconds&amp;lt;=2000)) as "1sec-2sec"
count(eval(TimeMilliseconds&amp;gt;2000 AND TimeMilliseconds&amp;lt;=5000)) as "2sec-5sec", by msgsource
| untable _time msgsource count
| eval group=mvindex(split(msgsource,": "),0)
| eval msgsource=mvindex(split(msgsource,": "),1)
| eval _time=_time.":".msgsource
| xyseries _time group count
| eval msgsource=mvindex(split(_time,":"),1)
| eval _time=mvindex(split(_time,":"),0)
| table _time msgsource AllTransactions AvgDuration TXN_1000 "1sec-2sec" "2sec-5sec"&lt;/LI-CODE&gt;</description>
      <pubDate>Thu, 31 Jul 2025 22:00:22 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750854#M242429</guid>
      <dc:creator>ITWhisperer</dc:creator>
      <dc:date>2025-07-31T22:00:22Z</dc:date>
    </item>
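The fix above, parsing the mock date string into an epoch before timechart, can be sanity-checked in Python; UTC is assumed here, whereas Splunk would apply the search-time zone:

```python
from datetime import datetime, timezone

# Parse the mock ISO date into an epoch, as the added
# | eval _time=strptime(_time,"%F") does ("%F" is "%Y-%m-%d")
epoch = (
    datetime.strptime("2025-07-31", "%Y-%m-%d")
    .replace(tzinfo=timezone.utc)
    .timestamp()
)
```

Without this step, makeresults leaves _time as an unparsed string, so timechart has no valid timestamp to bucket and every count collapses to 0.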
    <item>
      <title>Re: How to group the data by api name, date and various response times?</title>
      <link>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750934#M242448</link>
      <description>&lt;BLOCKQUOTE&gt;&lt;HR /&gt;Your query shows the correct result, but the fields are not in the order I want them displayed.&lt;HR /&gt;&lt;/BLOCKQUOTE&gt;&lt;P&gt;The easiest way to get the display order you want is to use the table command, like&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;| table _time API Total AvgDur TXN_1000 1sec-2sec 2sec-5sec 5sec+&lt;/LI-CODE&gt;&lt;P&gt;&lt;a href="https://community.splunk.com/t5/user/viewprofilepage/user-id/225168"&gt;@ITWhisperer&lt;/a&gt;&amp;nbsp;already pointed out the mistake in your data emulation, which is fixed by strptime. The two methods give exactly the same result; the only difference is in the field names, which can easily be coordinated.&lt;/P&gt;</description>
      <pubDate>Fri, 01 Aug 2025 17:59:07 GMT</pubDate>
      <guid>https://community.splunk.com/t5/Splunk-Search/How-to-group-the-data-by-api-name-date-and-various-response/m-p/750934#M242448</guid>
      <dc:creator>yuanliu</dc:creator>
      <dc:date>2025-08-01T17:59:07Z</dc:date>
    </item>
  </channel>
</rss>

