Thanks for the response. I was trying to check the accuracy of both queries, the one you provided and the one from the accepted answer, because I see a difference in the counts. I edited both queries to use some mock data. I cannot get the mock data to work with the query from the accepted answer, and I would appreciate your help fixing that so I can compare the results.

| makeresults format=csv data="interactionid,_time,elapsed,msgsource
1,2025-07-31,00:00.756,retrieveAPI
2,2025-07-31,00:00.556,createAPI
3,2025-07-31,00:00.156,createAPI
4,2025-07-31,00:00.256,updateAPI
5,2025-07-31,00:00.356,retrieveAPI
6,2025-07-31,00:00.156,retrieveAPI
7,2025-07-31,00:01.056,createAPI
8,2025-07-31,00:00.256,retrieveAPI
9,2025-07-31,00:06.256,updateAPI
10,2025-07-31,00:10.256,createAPI"
| rex field=elapsed "^(?<minutes>\\d+):(?<seconds>\\d+)\\.(?<milliseconds>\\d+)"
| eval TimeMilliseconds = (tonumber(minutes) * 60 * 1000) + (tonumber(seconds) * 1000) + (tonumber(milliseconds))
| timechart span=1d count as AllTransactions,
avg(TimeMilliseconds) as AvgDuration,
count(eval(TimeMilliseconds<=1000)) as "TXN_1000",
count(eval(TimeMilliseconds>1000 AND TimeMilliseconds<=2000)) as "1sec-2sec",
count(eval(TimeMilliseconds>2000 AND TimeMilliseconds<=5000)) as "2sec-5sec" by msgsource
| untable _time msgsource count
| eval group=mvindex(split(msgsource,": "),0)
| eval msgsource=mvindex(split(msgsource,": "),1)
| eval _time=_time.":".msgsource
| xyseries _time group count
| eval msgsource=mvindex(split(_time,":"),1)
| eval _time=mvindex(split(_time,":"),0)
| table _time msgsource AllTransactions AvgDuration TXN_1000 "1sec-2sec" "2sec-5sec"

This query creates the table, but the counts are all 0s. And here is the edited version of the query you shared, which does show results:

| makeresults format=csv data="interactionid,_time,elapsed,msgsource
1,2025-07-31,00:00.756,retrieveAPI
2,2025-07-31,00:00.556,createAPI
3,2025-07-31,00:00.156,createAPI
4,2025-07-31,00:00.256,updateAPI
5,2025-07-31,00:00.356,retrieveAPI
6,2025-07-31,00:00.156,retrieveAPI
7,2025-07-31,00:01.056,createAPI
8,2025-07-31,00:00.256,retrieveAPI
9,2025-07-31,00:06.256,updateAPI
10,2025-07-31,00:10.256,createAPI"
| eval total_milliseconds = 1000 * (strptime("00:" . elapsed, "%T.%N") - relative_time(now(), "-0d@d"))
| eval timebucket = case(total_milliseconds <= 1000, "TXN_1000", total_milliseconds <= 2000, "1sec-2sec",
total_milliseconds <= 5000, "2sec-5sec", true(), "5sec+")
| rename msgsource as API
| bucket _time span=1d
| eventstats avg(total_milliseconds) as AvgDur by _time API
| stats count by AvgDur _time API timebucket
| tojson output_field=api_time _time API AvgDur
| chart values(count) over api_time by timebucket
| addtotals
| spath input=api_time
| rename time as _time
| fields - api_time

Your query shows the correct results, but the fields are not in the order I want to display them. Any help fixing both queries would be appreciated.
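To clarify what I mean by order: I am hoping for roughly the same column layout as the table command at the end of my first query. If I have guessed the final field names of your query correctly (API from the rename, AvgDur from the eventstats, the bucket columns from the case, and Total from addtotals), I imagine it would be something like appending this at the end, but please correct me if that is not the right approach:

| table _time API AvgDur TXN_1000 "1sec-2sec" "2sec-5sec" "5sec+" Total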