We have a requirement to index a DFS folder containing many subfolders and files coming from different servers. The goal is to be able to run simple keyword-based searches against these logs. The logs vary in format, so we will not try to add knowledge by extracting fields. Our questions are:
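For context, here is roughly the kind of monitor input we have in mind. This is only a sketch of what we are considering; the UNC path, index name, sourcetype and the ignoreOlderThan value below are placeholders, not our real values:

    # inputs.conf -- monitor the DFS share recursively (path is a placeholder)
    [monitor://\\dfsroot\logs]
    index = dfs_logs
    sourcetype = dfs_raw
    recursive = true
    # skip files that have not changed recently (assumption: only recent logs are needed)
    ignoreOlderThan = 30d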
1) Can we configure Splunk to use the indexing time as the event timestamp? We do not want Splunk to try to calculate the timestamp from the event itself (see the props.conf sketch after this list).
2) Can we schedule when Splunk performs the indexing? We would like it to run once per hour; we don't need real-time ingestion.
3) Do you have any tips for indexing a large number of files and folders with many subfolders?
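For question 1, our understanding is that a props.conf setting along these lines would force Splunk to stamp events with the current (index) time instead of parsing a timestamp from the data; please correct us if that is wrong. The sourcetype name is a placeholder, and the line-merging setting is an assumption on our side since the formats vary:

    # props.conf -- use the current (index) time rather than extracting a timestamp
    [dfs_raw]
    DATETIME_CONFIG = CURRENT
    # treat each line as a separate event (assumption, since the log formats differ)
    SHOULD_LINEMERGE = false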
I know this is not really what Splunk is designed for, but the goal is to have a single tool for different kinds of searches, and Splunk is now THE tool for log analysis here. If it cannot handle this "simple" use case, people will not understand.
Many thanks for your input.