How do you ingest events of a particular time period from a file in Splunk?


I have a log file of about 400 MB in size. I don't want to ingest it completely; I just want a few events from a particular time period.
For example, I only want to ingest events from 11 AM to 12 PM; the rest I don't want to collect in Splunk.

Is it possible to do this? If yes, what would be the best practice for doing so?

Thanks in Advance!




The best practice would be to avoid writing events you don't want to collect to files in the first place, or to separate events into different files by type or usability and then forward to Splunk only the files that interest you.

If neither of these options is applicable for you, the next solution is to filter the events before they reach the Splunk indexes. A universal forwarder forwards everything it receives without any filtering, so you need either an intermediate heavy forwarder (HF) to do the filtering before events reach the indexer, or filtering on the indexer side before the data gets indexed.

Depending on your Splunk deployment, you can do it on either side, i.e. the HF or the indexer.

Here is documentation that explains how to use filtering: Discard specific events and keep the rest, or Keep specific events and discard the rest, depending on how much of the data you discard or accept. In your case, you have to filter on time, i.e. send only data from 11 AM to 12 PM. Refer to this example from @MuS, which explains How to index certain logs only during a certain time range.
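As a rough sketch of what that filtering looks like on the HF or indexer: the stanzas below route any event whose timestamp is not in the 11 AM hour to Splunk's nullQueue, so it is never indexed. The sourcetype name `my_app_log` and the assumption that each event starts with a `YYYY-MM-DD HH:MM:SS` timestamp are hypothetical; adjust the REGEX to match your actual timestamp format.

```ini
# props.conf -- hypothetical sourcetype; replace with your own
[my_app_log]
TRANSFORMS-droptime = drop_outside_window

# transforms.conf
[drop_outside_window]
# Assumes events begin with "YYYY-MM-DD HH:MM:SS".
# The negative lookahead matches (and discards) every event
# whose hour is NOT 11, keeping only 11:00:00-11:59:59.
REGEX = ^(?!\d{4}-\d{2}-\d{2} 11:)
DEST_KEY = queue
FORMAT = nullQueue
```

Note that transforms run at parse time, so this works on a heavy forwarder or an indexer, but not on a universal forwarder.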



Is the data local on the indexer or remote (on a forwarder)?
You can use "scripted inputs": with a shell script or Python, you can fetch the data and send it to Splunk.
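A scripted-input approach could look like the following sketch: pre-filter the file so only events in the 11:00-12:00 window survive, then point a Splunk monitor input (or a one-shot upload) at the filtered file. The fixed-offset timestamp parsing assumes each line begins with `YYYY-MM-DD HH:MM:SS`; that format, and the file paths, are assumptions for illustration.

```python
from datetime import time

# Assumed window: 11:00:00 (inclusive) to 12:00:00 (exclusive)
START = time(11, 0)
END = time(12, 0)

def in_window(line, start=START, end=END):
    """Return True if the line's timestamp falls inside [start, end).

    Assumes lines start with "YYYY-MM-DD HH:MM:SS"; characters 11-18
    are therefore the HH:MM:SS portion.
    """
    try:
        hh, mm, ss = line[11:19].split(":")
        t = time(int(hh), int(mm), int(ss))
    except ValueError:
        return False  # drop lines whose timestamp can't be parsed
    return start <= t < end

def filter_log(src_path, dst_path):
    """Copy only in-window events from src_path to dst_path."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        for line in src:
            if in_window(line):
                dst.write(line)
```

After running the filter you can ingest the result with, for example, `splunk add oneshot filtered.log`, so only the hour you care about ever reaches the index.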
