We have a system that produces several GB of logs per day, but only a few MB of that contains information worth searching through in Splunk. What I would like to do is set up a scheduled search that saves just the events we care about to a separate index, to help improve search speeds.
I have been trying to use summary indexing for this, but I seem to lose all the _time data that goes with the events. Is there any way to save just the filtered events?
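For reference, this is roughly the kind of scheduled search I have in mind (the index, sourcetype, and field names here are just placeholders for our real ones):

```
index=app_logs sourcetype=myapp (log_level=ERROR OR log_level=WARN)
```

I save that as a scheduled search and enable summary indexing on it, pointing it at a dedicated summary index, expecting the matching raw events to land there with their original timestamps.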
I have given it a go, but the _time field doesn't get saved in the index I specify. Every event just defaults to the time I started the search. Is there something else I need to do?
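In case it helps, this is the general shape of what I tried with the collect command to write results into the target index (again, the index and field names are placeholders):

```
index=app_logs sourcetype=myapp log_level=ERROR
| collect index=filtered_logs
```

The events show up in filtered_logs, but all stamped with the search start time rather than each event's original _time.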