Hi,
I have the following sample event data.
- For some reason, the timestamps have no sub-second precision.
- The original event data does not have the ts_SEQ field; I added it just for reference.
ts, ev_id, val, ts_SEQ
2018-6-17 08:00:01, A, 10, 1
2018-6-17 08:00:01, B, 0, 2
2018-6-17 08:00:01, C, 3, 3
2018-6-17 08:00:11, A, 20, 4
2018-6-17 08:00:11, B, 0, 5
2018-6-17 08:00:11, C, -1, 6
2018-6-17 08:00:20, A, 5, 7
2018-6-17 08:00:21, B, 0, 8
2018-6-17 08:00:21, C, 12, 9
What I want to do is extract transactions. It seems that the events A-B-C (ev_id) form one transaction group roughly every ten seconds. For example, transaction startswith=eval(ev_id="A") endswith=eval(ev_id="C") maxspan=2s could be applied.
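To illustrate the grouping I have in mind, here is a plain-Python sketch (not SPL; the maxspan=2s check is omitted for brevity). It walks the events in their original order and collects everything from an "A" event up to the next "C" event into one transaction:

```python
# Sample events in their original order: (ts, ev_id, val)
events = [
    ("2018-6-17 08:00:01", "A", 10),
    ("2018-6-17 08:00:01", "B", 0),
    ("2018-6-17 08:00:01", "C", 3),
    ("2018-6-17 08:00:11", "A", 20),
    ("2018-6-17 08:00:11", "B", 0),
    ("2018-6-17 08:00:11", "C", -1),
    ("2018-6-17 08:00:20", "A", 5),
    ("2018-6-17 08:00:21", "B", 0),
    ("2018-6-17 08:00:21", "C", 12),
]

transactions = []
current = None
for ev in events:
    ts, ev_id, val = ev
    if ev_id == "A":            # startswith: open a new transaction
        current = [ev]
    elif current is not None:
        current.append(ev)
        if ev_id == "C":        # endswith: close the transaction
            transactions.append(current)
            current = None

print(len(transactions))                      # 3
print([e[1] for e in transactions[0]])        # ['A', 'B', 'C']
```

This only works if the events are processed in their original sequence, which is exactly what the question below is about.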
When I indexed the above sample event data, at the Set Source Type step I chose Source Type=CSV and Timestamp Extraction=AUTO. The result of the SPL search source=... | table _time ts ev_id val ts_SEQ | sort _time is shown below.
Unfortunately, sorting by _time in ascending order destroys the original event sequence (ts_SEQ), which means the above-mentioned transaction extraction would be impossible. Currently, my workaround is to add the ts_SEQ value to the original data before indexing and to use ts_SEQ to keep the original event sequence.
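A small plain-Python sketch of why this workaround recovers the order: since the timestamps have only second precision, several events share the same _time, and ts_SEQ breaks the tie deterministically when used as a secondary sort key.

```python
# Events that share the same second, scrambled as they might come back
# from the index: (_time, ev_id, ts_SEQ)
rows = [
    ("2018-6-17 08:00:01", "C", 3),
    ("2018-6-17 08:00:01", "A", 1),
    ("2018-6-17 08:00:01", "B", 2),
]

# Sort by (_time, ts_SEQ): ts_SEQ restores the original order
# within each one-second bucket.
restored = sorted(rows, key=lambda r: (r[0], r[2]))

print([r[1] for r in restored])  # ['A', 'B', 'C']
```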
Question: How can I extract the timestamp from my sample data without destroying the original event sequence?
Thank you.