I have several old log files (5-year retention) which I need to ingest. I tried to ingest one of them via Add Data but I do not get any results. I have tried the same process again, this time with a *nix log, and it works fine.
When I upload it, Splunk defaults to the preprocess-winevt sourcetype, which is the same one I use (successfully) on a different Splunk server. On this server it just won't work.
I accomplished this same task using file monitoring on Splunk Enterprise installed on Windows Server 2012. I created an index for the year, put all the logs for that year into a single directory on the same server, and then created a new monitoring input - you should be able to accomplish this through the web interface if you so choose. Start small, with a single evtx or two, and then grow from there.
If your Splunk Enterprise instance is loaded on a Linux system, your mileage may vary.
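The monitoring input described above can also be defined directly in inputs.conf instead of through the web interface. A minimal sketch - the directory path and index name are placeholders for your own values; the sourcetype is the one mentioned earlier in this thread:

```
# Sketch of a file-monitor stanza in inputs.conf -- path and index are examples
[monitor://D:\old_logs\2019]
index = winevt_2019
sourcetype = preprocess-winevt
disabled = false
```

After adding the stanza, restart Splunk (or reload the input) and check the Files & Directories input list to confirm it picked up the directory.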
That's a great idea. I would like to be able to do it one by one, though, for specific reasons. If I wanted to do it your proposed way, I would have to divide the data somehow since, per year, I get over 2TB of log data.
I will probably do that for another type of environment I have. Thanks for this tip!
Since importing event logs is OS-dependent, only event logs matching the OS can be imported directly.
To capture the old format, you can either ingest it on the corresponding OS or convert the format before importing.
There are two conversion options: convert the old format to the new .evtx format, or convert it to text/CSV.
An example conversion command:
wevtutil epl e:\hoge_old.evt e:\hoge_new.evtx /lf:true
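For the text/CSV route mentioned above, PowerShell's Get-WinEvent can read a saved log file directly and export it. A sketch - the file paths and the selected fields are examples, not a fixed requirement:

```powershell
# Read a saved .evtx file and export a few common fields to CSV (example paths)
Get-WinEvent -Path 'e:\hoge_new.evtx' |
    Select-Object TimeCreated, Id, LevelDisplayName, ProviderName, Message |
    Export-Csv -Path 'e:\hoge_new.csv' -NoTypeInformation
```

Note that Get-WinEvent reads .evtx; for legacy .evt files, convert them first with the wevtutil command above.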
By old logs I mean not current. They are Win 7 logs (some Win XP). I am not trying to convert them into anything. Splunk already ingests Win 7 and XP logs via Universal Forwarders. The issue is that
I have logs stored in a different share which I would like to access on a case-by-case basis.
What I am trying to do is ingest them in a way that produces the same output as if they were being sent to the search head directly by a forwarder.
My environment has over 1k hosts. I collect 4 different audit logs per host per day. We are talking about 1,460,000 different log files! I cannot simply put them back, if that is what you are implying. I should be able to grab the log file, upload it to Splunk through the Add Data utility, and be done.
I know this method works because I have another setup (Dev environment) that works exactly that way. What I am trying to figure out is why it works in the Dev environment but won't work in my production environment.
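One way to narrow down a dev-vs-prod difference like this is to compare how the preprocess-winevt sourcetype actually resolves on each server, since a props.conf override in one environment would change Add Data behavior. A sketch using Splunk's btool - the installation path is an example:

```
# Show the effective props.conf settings for the sourcetype, annotated with the
# file each setting comes from -- run on both servers and diff the output
$SPLUNK_HOME/bin/splunk btool props list preprocess-winevt --debug
```

If the two outputs differ (e.g. different LINE_BREAKER, TIME_FORMAT, or TRANSFORMS settings), that difference is a likely culprit for the missing events.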