My log files are stored in nested folders of the following form:
The top level is the date (1/1/2012). The next level is the time (8:45:10.12). I am not seeing a way to handle this kind of folder structure. It seems like Splunk wants files to be in a single folder. Is there a way to deal with this structure?
I am not seeing a way to tell Splunk what file format to use based on the name of the log file. Is this possible?
I am brand new to Splunk and it looks like it does a LOT of stuff really elegantly. Just have some questions.
Thanks for your help!
Do the events in the logfiles themselves have a date/time stamp on each event? As long as it can derive the date/time from the events themselves, it doesn't care about your directory format and can recurse as far as needed in order to find the files. But, if you need to get the date itself from the name/path of the file, that does get a little trickier. We're going to need some additional information to help.
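For example, a single monitor stanza in inputs.conf will pick up files in all subdirectories beneath the path you give it. This is a minimal sketch; the path and sourcetype name here are hypothetical placeholders for your actual setup:

```ini
# inputs.conf -- path and sourcetype are examples, not your real values
# Splunk recurses into all subdirectories under the monitored path by default,
# so the date/time folder nesting does not need any special handling.
[monitor:///var/myapp/logs]
disabled = false
sourcetype = myapp_logs
```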
Splunk will recurse by default. For optimum results, the events in the file need a date and a time stamp on each event. As far as the "format" or "schema", you can apply a different sourcetype based on the name of the file, then apply different settings (timestamp recognition, field extraction) based on the sourcetype. This is probably best extracted out to a separate question.
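Assigning a sourcetype based on the file name is typically done in props.conf with a source-pattern stanza, and the timestamp settings then hang off that sourcetype. A rough sketch, assuming hypothetical file extensions and sourcetype names:

```ini
# props.conf -- patterns and names are illustrative, adjust to your files
# Route files by name to different sourcetypes:
[source::.../*.applog]
sourcetype = myapp_applog

[source::.../*.errlog]
sourcetype = myapp_errlog

# Then configure parsing per sourcetype, e.g. timestamp recognition:
[myapp_applog]
TIME_FORMAT = %Y-%m-%d %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 30
```

Field extractions can likewise be keyed to each sourcetype, so each file format gets its own parsing rules even though they all live under the same monitored directory tree.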
There is a timestamp in each file. I have the parsing logic set up for getting at the data for a particular file. Just not sure how to get the recursion to happen over the whole folder tree. Does this just happen by default?
Also, there is still my second question about applying different file-format schemas based on the file extension.
Thanks for your help,