Yes, this works... Thanks!
I see the idea:
1. create a combined column holding the category."#".duration value (or whichever values you want to combine)
2. do the calculations
3. split the category and duration back apart later on
In my case I also had to add a final stats command to:
1. re-combine any duplicated rows
2. finalise the stats
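Stripped right down, the combine-then-split trick is just three commands (shown here with the same field names as the full search below):

| eval domain=domain."#".duration_95pc
| xyseries domain, statusCategory, count
| rex field=domain "(?<domain>.+)#(?<duration_95pc>.+)"

The duration value rides along inside the split-by field, survives the xyseries pivot, and rex pulls it back out into its own field afterwards.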
index=mydata type=RESPONSE request_type=*
| eval duration=response_in_timestamp-(request_in_timestamp+request_out_timestamp)
| stats p95(duration) as duration_95pc count by domain, response_out_http_code, index
| eval tps = round(count/60)
| eval statusCategory=if(response_out_http_code>=200 AND response_out_http_code<300, "OK",
    if(response_out_http_code>=400 AND response_out_http_code<500, "ClientError",
    if(response_out_http_code>=500 AND response_out_http_code<600, "ServerError", "Other")))
| stats avg(duration_95pc) as duration_95pc sum(count) as count by domain, statusCategory, index
| eval domain=domain."#".duration_95pc
| xyseries domain, statusCategory, count
| fillnull value=0 ClientError ServerError OK Other
| rex field=domain "(?<domain>.+)#(?<duration_95pc>.+)"
| stats sum(ClientError) as ClientError, sum(ServerError) as ServerError, sum(OK) as OK, sum(Other) as Other, avg(duration_95pc) as duration_95pc by domain
| eval Total = ClientError + OK + ServerError + Other
| eval ServerError_%= round((ServerError/Total) * 100)
| eval ClientError_%= round((ClientError/Total) * 100)
| eval Success_%= round((OK/Total) * 100)
| eval Other_%= round((Other/Total) * 100)
But a couple of things are worth pointing out for any future readers.
My choice of field name 95pc_duration turned out to be a bad idea: eval would not work with it until I renamed it. The problem is that the name starts with a digit, so eval tries to parse it as a number rather than a field reference.
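For anyone who wants to keep a field name like that: eval can reference a field whose name starts with a digit if you wrap the name in single quotes, e.g.

| eval duration_95pc='95pc_duration'

Renaming (as I did) also works, but the quoting is handy when the field name comes from somewhere you don't control.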
I also needed to add a final stats command because I had some duplicated rows.