Hello,
I am trying to organize various types of events into single events.
Currently I have a transaction set up to capture particular types of ERRORS in our system logs.
But there's additional information outside of the transaction that I want to associate with a respective transaction.
To put it plainly, the layout would produce resultant events that look like this:
SET_RANGE1
SET_RANGE2
SET_RANGE3
TRANSACTION1
SET_RANGE4
TRANSACTION2
TRANSACTION3
TRANSACTION4
SET_RANGE5
SET_RANGE6
SET_RANGE7
TRANSACTION5
SET_RANGE8
But I want to group each TRANSACTION with the SET_RANGE that immediately precedes it, and associate them like this:
Event1 --> SET_RANGE3 , TRANSACTION1
Event2 --> SET_RANGE4 , TRANSACTION2
Event3 --> SET_RANGE4 , TRANSACTION3
Event4 --> SET_RANGE4 , TRANSACTION4
Event5 --> SET_RANGE7 , TRANSACTION5
As you can see, some SET_RANGE events are not needed. Most of the time there is exactly one SET_RANGE event prior to a given Transaction event; however, sometimes multiple Transaction events are associated with a single SET_RANGE event, as I've tried to demonstrate.
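To make the desired grouping concrete: the rule above amounts to carrying the most recent SET_RANGE forward and emitting a pair every time a TRANSACTION arrives (in Splunk this carry-forward is roughly what `streamstats` or `filldown` can do, but the logic itself is search-engine-neutral). Here is a minimal sketch in Python, assuming events arrive in time order; the event names mirror the example above and the function name is hypothetical:

```python
# Sketch of the grouping rule: pair each TRANSACTION with the most
# recent SET_RANGE seen before it. Event names mirror the example.
events = [
    "SET_RANGE1", "SET_RANGE2", "SET_RANGE3", "TRANSACTION1",
    "SET_RANGE4", "TRANSACTION2", "TRANSACTION3", "TRANSACTION4",
    "SET_RANGE5", "SET_RANGE6", "SET_RANGE7", "TRANSACTION5",
    "SET_RANGE8",
]

def pair_transactions(stream):
    """Carry the latest SET_RANGE forward and emit
    (set_range, transaction) for every TRANSACTION event.
    Each new SET_RANGE overwrites the previous one, so only the
    SET_RANGE immediately preceding a transaction survives."""
    last_set_range = None
    pairs = []
    for event in stream:
        if event.startswith("SET_RANGE"):
            last_set_range = event
        elif event.startswith("TRANSACTION"):
            pairs.append((last_set_range, event))
    return pairs

for set_range, txn in pair_transactions(events):
    print(set_range, txn)
# SET_RANGE3 TRANSACTION1
# SET_RANGE4 TRANSACTION2
# SET_RANGE4 TRANSACTION3
# SET_RANGE4 TRANSACTION4
# SET_RANGE7 TRANSACTION5
```

Note how trailing SET_RANGE events with no following transaction (SET_RANGE8 here) are simply dropped, matching the desired output.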
Do I need to use savedsearch somehow? Any help is much appreciated!
I'm a little confused with your example and the meaning of set_range... Maybe sample data would help?
But in my dealings with the transaction command, for my scenario I ended up using a join command to group my data and collected the result into a summary index, then ran the transaction within that new index. My search was: index=foo | join session_id [search user=west_coast] | collect index=west_coast_users
to get the specific data into the summary index, and I then ran: sourcetype=stash | transaction session_id keepevicted=1
(the keepevicted=1 was specific to my needs). In your case, if the SET_RANGE data are just searched events, you could pipe that SET_RANGE data into a summary index and run the transaction in there. BTW, both join and transaction are expensive commands.
I have reposted my question here (changed the question a little bit):
Can I close this question?
I do not think anybody will be able to help unless you give us both the raw events and the search that you are using.