We have a batch runner that outputs a log file for each run. We would like to be able to get the duration of each log file so we can see how the batch times change when we make modifications.
Currently I am using:
source=/some/source* | transaction maxevents=-1 maxpause=-1 maxspan=1d keepevicted=t source | table duration, source
Unfortunately, because the log files are very large, this is incredibly slow and even causes Splunk to run out of memory. Is there a quicker way?
NOTE: The files do not have a consistent ending line to match on, but they all start with the same statement.
Maybe I'm not seeing the whole picture, but what about
... | stats min(_time) AS start max(_time) AS stop by source | eval dur = stop - start | ...
Building transactions requires some memory, as you've noted.
/Kristian
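For example, plugging in the source filter from the question (the field names start, stop, and duration here are just illustrative), the full search might look something like:

```
source=/some/source*
| stats min(_time) AS start max(_time) AS stop by source
| eval duration = stop - start
| table source, duration
```

Since stats is a streaming-friendly reporting command, it only has to track one min and one max per source rather than holding whole transactions in memory, which is why it scales much better than transaction on large files.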
Thanks, that's what I was looking for 🙂
The report as-is doesn't run fully, so I can't create a summary. Also, the table ends up with multiple durations per source, which doesn't make sense, so I think the transaction command is either broken or I am not using it properly.
Have you thought about summary indexing your search?
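A rough sketch of what that could look like, assuming a scheduled saved search with summary indexing enabled (the index name and field names are illustrative). The scheduled search uses the si-prefixed variant of stats so the partial results are written to the summary index:

```
source=/some/source* | sistats min(_time) AS start max(_time) AS stop by source
```

The reporting search then runs the ordinary stats version of the same search against the summary index, so it only has to scan the small pre-aggregated events instead of the raw logs:

```
index=summary | stats min(_time) AS start max(_time) AS stop by source | eval duration = stop - start
```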