I have Nessus event data that looks like this:
4/7/11 5:26:42.000 PM 10.11.5.10 host_end Thu Apr 7 17:26:42 2011
4/7/11 5:23:58.000 PM 10.11.5.10 10.11.5 irdmi (8000/tcp) 33817
4/7/11 5:23:58.000 PM 10.11.5.10 10.11.5 irdmi (8000/tcp) 47830
4/7/11 5:19:22.000 PM 10.11.5.10 10.11.5 irdmi (8000/tcp) 10815
If I want all of the events for 10.11.5.10 to have the timestamp of the event that contains host_end, what's the best way to do that? I've tried using transaction to group the events and then rex and mvexpand to pull them back out again. The problem is that those field extractions get really expensive when you're dealing with the 50,000 events in a vulnerability scan.
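For reference, the transaction-based attempt looks roughly like this — the sourcetype name and the max_match rex trick are my guesses at the general shape, not the exact search:

```
sourcetype=nessus
| transaction host
| rex "host_end\s+(?<scan_end>.+)"
| eval _time = strptime(scan_end, "%a %b %d %H:%M:%S %Y")
| rex max_match=0 "(?<orig_event>[^\n]+)"
| mvexpand orig_event
```

The max_match=0 rex and the mvexpand are where the cost goes: every one of the merged lines has to be re-extracted and then re-expanded back into separate results.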
Any help would be appreciated.
I suppose I would ask why you ultimately want those timestamps to all be the same, i.e. what you are doing with this data that requires identical timestamps, and if it does, which field you are using and how. It's possible that this is an unnecessary intermediate step toward your final result.
Also, if the timestamps on the data are actually incorrect, it might be possible to set them correctly on input. That depends on how the data is grouped and ordered in the first place, though, e.g. whether it comes in files, whether events from different hosts and times share a file, whether individual events are interleaved, etc.
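To illustrate what "setting them on input" can and can't do: props.conf controls which part of each event the timestamp is parsed from, but it cannot copy one event's time onto another event. A minimal sketch, with the sourcetype stanza name assumed:

```
[nessus]
TIME_PREFIX = host_end\s+
MAX_TIMESTAMP_LOOKAHEAD = 30
TIME_FORMAT = %a %b %d %H:%M:%S %Y
```

Events where TIME_PREFIX doesn't match would fall back to Splunk's normal timestamp detection, so this only helps if the desired time appears inside every event you care about.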
I found one way to do it, though it isn't all that elegant. I searched the scan results for all of the host_start entries and then deduped them by host, which gives me the time of the most recent scan for each host. The host and _time fields are exported to a lookup table.
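A sketch of the lookup-building search, with the sourcetype and lookup file names assumed:

```
sourcetype=nessus host_start
| dedup host
| table host, _time
| outputlookup nessus_last_scan.csv
```

Because search results come back in reverse time order, dedup host keeps the most recent host_start event per host.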
I then search the scan results, do a lookup to get the most recent scan time for each host, and use "where" to filter out results whose _time is earlier than the scan time from the lookup table.
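And the filtering search, under the same assumed names — keeping _time >= scan_start is equivalent to filtering out events older than the most recent scan:

```
sourcetype=nessus
| lookup nessus_last_scan.csv host OUTPUT _time AS scan_start
| where _time >= scan_start
```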