Hi all,
Hoping someone can give some pointers how to solve this problem:
I run a transaction command over the last two weeks, which gives about 20,000 events, and for about 85 percent of them the transaction command combines the events perfectly.
However, for the remaining 15 percent there are still duplicates, meaning that the transaction command has not combined them properly.
I suspect this is due to memory limits in limits.conf; those could be increased, but it seems there should be smarter options.
For example, appending new events with a transaction command to an existing lookup, if that is possible.
Or perhaps there is a better way of combining the information without using transaction at all.
The difficulty with this dataset is that transactions can span the entire two weeks, so I cannot restrict maxspan; filtering on maxevents doesn't improve performance either, since transaction sizes vary a lot.
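To illustrate the first idea, the incremental pattern I have in mind would look roughly like this (the lookup name and the grouping/multivalue fields are placeholders, not my real ones): search only the newest day, append the previously stored lookup rows with inputlookup append=true, and re-merge by id before writing the lookup back:

index=<index> sourcetype=<sourcetype> earliest=-1d@d latest=@d
| stats values(SYSMODTIME) as SYSMODTIME by id
| inputlookup append=true <mylookup>.csv
| stats values(SYSMODTIME) as SYSMODTIME by id
| outputlookup <mylookup>.csv

I have no idea whether this scales well, or whether it loses information that transaction would have kept, so corrections are welcome.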
Cheers,
Roelof
the minimal search:
index= sourcetype= earliest=@d-14d
| fields ...
| transaction keeporphans=true keepevicted=true
| outputlookup .csv
This is the full minimal search ^
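For comparison, the transaction-free version of the same search that I was thinking about would be something along these lines (field names are again redacted/placeholders; in the real search the grouping field is the integer id mentioned below):

index= sourcetype= earliest=@d-14d
| fields ...
| stats values(SYSMODTIME) as SYSMODTIME earliest(_time) as earliest latest(_time) as latest by id
| outputlookup .csv

since stats should not run into the same memory limits as transaction, though I'm not sure it reproduces everything transaction gives me.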
Two example snippets from the correct dataset would be:
(the id value is deleted here, but it is just an integer on which the transaction is performed)
SYSMODTIME is a multivalue field, and there are a couple more multivalue fields in the complete dataset.