Each log event contains more than one transaction because we log mini-batches: every 2 minutes, a batch of transactions is written as a single event. Below is a sample.
In this case, how can I count the number of transactions, i.e., the number of Code / minCode values? If I do "timechart span=2m count", each log event (which contains multiple mini-batch transactions) counts as 1.
Please help me find the count of individual transactions.
Sample log event...
2021-05-11 21:36:33,634: {"level":"INFO","message":"COMMON_FIELDS - Code:1001 | Status:New | minCode:ABC"} {"level":"INFO","message":"COMMON_FIELDS - Code:1002 | Status:New | minCode:DEF"}{"level":"INFO","message":"COMMON_FIELDS - Code:1003 | Status:Modify | minCode:XYZ"}
2021-05-11 21:38:31,524: {"level":"INFO","message":"COMMON_FIELDS - Code:1011 | Status:New | minCode:RTY"} {"level":"INFO","message":"COMMON_FIELDS - Code:1012 | Status:New | minCode:HJK"}{"level":"INFO","message":"COMMON_FIELDS - Code:1013 | Status:Modify | minCode:VFR"}{"level":"INFO","message":"COMMON_FIELDS - Code:1014 | Status:New | minCode:KLO"}
The result I expect is something like this...
using ==> | timechart span=2m count
_time | count |
2021-05-11 21:36:00 | 3 |
2021-05-11 21:38:00 | 4 |
using ==> | timechart span=5m count
_time | count |
2021-05-11 21:35:00 | 7 |
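To illustrate the gap between what timechart counts and what is wanted, here is a small Python sketch (not Splunk itself, just a model of the behavior): counting whole events gives 1 per bucket, while counting the embedded JSON objects gives the expected 3 and 4.

```python
import re
from collections import Counter
from datetime import datetime

# The two sample events from the question: each raw event holds several
# concatenated JSON objects, one per transaction.
events = [
    '2021-05-11 21:36:33,634: {"level":"INFO","message":"COMMON_FIELDS - Code:1001 | Status:New | minCode:ABC"} {"level":"INFO","message":"COMMON_FIELDS - Code:1002 | Status:New | minCode:DEF"}{"level":"INFO","message":"COMMON_FIELDS - Code:1003 | Status:Modify | minCode:XYZ"}',
    '2021-05-11 21:38:31,524: {"level":"INFO","message":"COMMON_FIELDS - Code:1011 | Status:New | minCode:RTY"} {"level":"INFO","message":"COMMON_FIELDS - Code:1012 | Status:New | minCode:HJK"}{"level":"INFO","message":"COMMON_FIELDS - Code:1013 | Status:Modify | minCode:VFR"}{"level":"INFO","message":"COMMON_FIELDS - Code:1014 | Status:New | minCode:KLO"}',
]

per_event = Counter()  # what "timechart span=2m count" effectively sees
per_txn = Counter()    # what the question actually wants

for ev in events:
    # Timestamp is everything before the first ": " separator.
    ts = datetime.strptime(ev.split(": ", 1)[0], "%Y-%m-%d %H:%M:%S,%f")
    # Snap down to a 2-minute bucket, like span=2m.
    bucket = ts.replace(second=0, microsecond=0,
                        minute=ts.minute - ts.minute % 2)
    per_event[bucket] += 1
    # Each transaction begins with an opening JSON object.
    per_txn[bucket] += len(re.findall(r'\{"level"', ev))

print(sorted(per_event.values()))  # [1, 1]  -- one per event
print(sorted(per_txn.values()))    # [3, 4]  -- one per transaction
```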
There are three steps. (OK, four. Step 0 is to beg your developer to write multiple transactions as a conformant JSON array instead of just jamming them into one unstructured string.) First, transform the concatenated JSON into a conformant JSON array. (Your developer should have done this without Splunk; using rex this way is not robust.) Second, extract the individual JSON objects. (A combination of spath and mvexpand.) Third, extract Code and minCode. (kv, aka extract.)
| rex mode=sed "s/ {/ [{/ s/} *{/},{/g s/}$/}]/" ``` transform concatenated JSON into array ```
| eval _raw = replace(_raw, "^[^\[]+", "") ``` retain JSON only ```
| spath path={}
| mvexpand {}
| spath input={}
| rename _raw AS temp, message AS _raw
| kv pairdelim="|" kvdelim=":"
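For readers less familiar with SPL, the same three steps can be sketched in Python. This is only a model of what the pipeline above does, not a replacement for it; the regexes mirror the sed expressions in the rex command.

```python
import json
import re

raw = ('2021-05-11 21:36:33,634: '
       '{"level":"INFO","message":"COMMON_FIELDS - Code:1001 | Status:New | minCode:ABC"} '
       '{"level":"INFO","message":"COMMON_FIELDS - Code:1002 | Status:New | minCode:DEF"}'
       '{"level":"INFO","message":"COMMON_FIELDS - Code:1003 | Status:Modify | minCode:XYZ"}')

# Step 1: transform the concatenated objects into a conformant JSON array,
# mirroring the three sed expressions in the rex command.
raw = re.sub(r" \{", " [{", raw, count=1)  # s/ {/ [{/
raw = re.sub(r"\} *\{", "},{", raw)        # s/} *{/},{/g
raw = re.sub(r"\}$", "}]", raw)            # s/}$/}]/
raw = re.sub(r"^[^\[]+", "", raw)          # retain JSON array only

# Step 2: parse the array (spath + mvexpand in SPL).
objects = json.loads(raw)

# Step 3: key/value extraction from each message (kv in SPL).
rows = [dict(re.findall(r"(\w+):(\w+)", o["message"])) for o in objects]
print([r["Code"] for r in rows])     # ['1001', '1002', '1003']
print([r["minCode"] for r in rows])  # ['ABC', 'DEF', 'XYZ']
```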
Your sample data gives me
COMMON_FIELDS___Code | Status | _time | level | minCode |
1001 | New | 2021-05-11 21:36:33.634 | INFO | ABC |
1002 | New | 2021-05-11 21:36:33.634 | INFO | DEF |
1003 | Modify | 2021-05-11 21:36:33.634 | INFO | XYZ |
1011 | New | 2021-05-11 21:38:31.524 | INFO | RTY |
1012 | New | 2021-05-11 21:38:31.524 | INFO | HJK |
1013 | Modify | 2021-05-11 21:38:31.524 | INFO | VFR |
1014 | New | 2021-05-11 21:38:31.524 | INFO | KLO |
Below is a data emulation that you can play with and compare with real data.
| makeresults
| eval data = mvappend("2021-05-11 21:36:33,634: {\"level\":\"INFO\",\"message\":\"COMMON_FIELDS - Code:1001 | Status:New | minCode:ABC\"} {\"level\":\"INFO\",\"message\":\"COMMON_FIELDS - Code:1002 | Status:New | minCode:DEF\"}{\"level\":\"INFO\",\"message\":\"COMMON_FIELDS - Code:1003 | Status:Modify | minCode:XYZ\"}",
"2021-05-11 21:38:31,524: {\"level\":\"INFO\",\"message\":\"COMMON_FIELDS - Code:1011 | Status:New | minCode:RTY\"} {\"level\":\"INFO\",\"message\":\"COMMON_FIELDS - Code:1012 | Status:New | minCode:HJK\"}{\"level\":\"INFO\",\"message\":\"COMMON_FIELDS - Code:1013 | Status:Modify | minCode:VFR\"}{\"level\":\"INFO\",\"message\":\"COMMON_FIELDS - Code:1014 | Status:New | minCode:KLO\"}")
| mvexpand data
| eval _time = strptime(replace(data, ": .*", ""), "%F %H:%M:%S,%N")
| rename data AS _raw
``` data emulation above ```
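The timestamp extraction in the emulation works the same way outside Splunk; here is a quick Python check (Python's %f plays the role of Splunk's %N for the ",524" subsecond part). The message body is elided here.

```python
import re
from datetime import datetime

data = '2021-05-11 21:38:31,524: {"level":"INFO","message":"..."}'
# replace(data, ": .*", "") keeps only the timestamp prefix: the first
# ": " in the string is the one that follows the millisecond field.
ts_text = re.sub(r": .*", "", data)
# Splunk's "%F %H:%M:%S,%N" corresponds to this Python format string.
ts = datetime.strptime(ts_text, "%Y-%m-%d %H:%M:%S,%f")
print(ts.isoformat())  # 2021-05-11T21:38:31.524000
```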
Thanks Yuanliu.
I have requested that my developer write multiple transactions as a conformant JSON array, but it will take some time.
In the meantime, I'll try your recommendation and let you know.