How to calculate time from event beginning and end.


I would like to display the average time Oracle takes to perform a checkpoint. I have filtered the following lines out of the log file, which show the timestamps for both the beginning and the end of the checkpoint.

Note that the common link is the SCN number, so the example below would yield 5 seconds.

Mon Oct 24 15:06:58 2011 Completed checkpoint up to RBA [0x1cbf.2.10], SCN: 8494298653192

Mon Oct 24 15:06:53 2011 Beginning log switch checkpoint up to RBA [0x1cbf.2.10], SCN: 8494298653192

How would one go about calculating this?



Splunk Employee

There are several ways. First, make sure SCN is extracted as a field (I call it SCN below), and, of course, that the timestamps on each event are detected correctly. Then the most efficient approach is:

... | stats range(_time) as duration by SCN
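If SCN is not already being extracted at search time, one way to pull it out inline is with rex (a sketch; the regex assumes the exact log format shown in the question):

... | rex "SCN: (?<SCN>\d+)" | stats range(_time) as duration by SCN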

Or, more transparently:

... | stats min(_time) as start max(_time) as end by SCN | eval duration=end-start

Less efficiently, but more intuitively:

... | transaction SCN

as the transaction command automatically computes a duration field from the first and last event times in each transaction.
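Outside of Splunk, the min/max-per-SCN logic behind range(_time) can be sketched in plain Python against the two sample events from the question (the tuple layout and function name here are illustrative, not anything Splunk-specific):

```python
from datetime import datetime

# The two sample events from the question: (timestamp string, SCN).
events = [
    ("Mon Oct 24 15:06:58 2011", "8494298653192"),  # Completed checkpoint
    ("Mon Oct 24 15:06:53 2011", "8494298653192"),  # Beginning log switch checkpoint
]

def durations_by_scn(events):
    """Mimic `stats range(_time) as duration by SCN`:
    group timestamps by SCN, then take max - min per group, in seconds."""
    times = {}
    for ts, scn in events:
        t = datetime.strptime(ts, "%a %b %d %H:%M:%S %Y")
        times.setdefault(scn, []).append(t)
    return {scn: (max(ts) - min(ts)).total_seconds() for scn, ts in times.items()}

print(durations_by_scn(events))  # {'8494298653192': 5.0}
```

As with the stats version, events that share an SCN are grouped together, so any number of checkpoints interleaved in the log are handled independently.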