As a brand-new user, I'm attempting to add several log files as input to my Splunk installation. These are server log files of about 3 GB or more each. Whenever I attempt to "Upload and index a file", I get the message "Your entry was not saved. The following error was reported: server abort."
I have managed to upload some access logs, but it generally takes me multiple tries to do so. The access log files are about 85MB in size.
Any thoughts? We are using Splunk 4.3.
Try setting Splunk to read the directory, and not the file itself. My guess is it's trying to load the whole thing into memory or something.
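If you want to try that, here's a rough sketch from the CLI (the path, sourcetype, and index below are placeholders for your environment):

$SPLUNK_HOME/bin/splunk add monitor /var/log/myapp -sourcetype access_combined -index main

That sets up a monitor input, so Splunk tails the directory and indexes files as it reads them, instead of pushing a 3 GB file through the upload form in one request.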
Brian
@madmoravian:
"moving the file to the Splunk server allowed it to successfully save and process the log file."
Moving it to which directory path on the Splunk server? Thanks.
If that does not work, try splitting the file into smaller chunks and see if your server can index the files then. On *NIX, use the split command:
split -b 1000k largefile.big smallerfiles
You can also split it by lines if you know how many lines make up an event:
split -l 1000 largefile.big smallerfiles
This will then create smaller files named 'smallerfilesaa' through 'smallerfileszz'.
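Assuming Splunk is installed on the same *NIX box, one way to feed the chunks in afterwards is a oneshot load per file (the sourcetype here is just a guess, swap in your own):

for f in smallerfiles*; do
    "$SPLUNK_HOME/bin/splunk" add oneshot "$f" -sourcetype access_combined
done

add oneshot indexes each file once and doesn't keep watching it, which suits throwaway chunks like these.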
+1 to pointing Splunk at the directory instead of at the file.
Yes, moving the file to the Splunk server allowed it to successfully save and process the log file.
Let me know if this works, and we'll convert this to an answer.
Not yet, as the file is not on the Splunk server. I might move it over there and see what happens. Thanks for the suggestion.
Have you tried pointing Splunk at the directory instead of at the file itself?
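For reference, the equivalent inputs.conf stanza would look roughly like this (path, sourcetype, and index are placeholders), placed in $SPLUNK_HOME/etc/system/local/inputs.conf:

[monitor:///var/log/myapp]
sourcetype = access_combined
index = main
disabled = false

Restart Splunk after editing the file so the input takes effect.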
Brian