Hello Splunk Expert,
The situation:
I have a log file of around 10 MB generated from web application errors. The log file contains the sequence of errors that appeared on an online system; each time an error occurs, a number of lines is appended to the end of the file. The structure of each event is <Severity> followed by a Java stack trace.
The structure of the file at time c:
line 1: event a x ---
line 4: event b y ---
last line: event c z ---
The structure of the file at time d could be:
line 4: event b y ---
line 5: event c z ---
last line: event d d ---
Event a is rotated to another file only when the log file reaches 10 MB.
Alternatively, the log file could simply be:
line 1: event a x ---
line 4: event b y ---
line 5: event c z ---
last line: event d d ---
The requirement is to automate the monitoring of this file in Splunk.
The two solutions I intend to implement are:
1. Send only the delta changes to the Splunk server, by writing a shell script that extracts the delta every 5 minutes (the modifications made to the file in the last 5 minutes).
2. Send the entire file via FTP to the Splunk instance and let Splunk monitor the changes.
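For the first solution, the delta extraction could be sketched roughly as below (a minimal POSIX shell sketch; the log and state-file paths are placeholders, and actually shipping the output to the Splunk host is left out):

```shell
#!/bin/sh
# Delta-extraction sketch: remember the last-shipped byte offset in a
# state file and emit only the bytes appended since the previous run.
# Paths are placeholders -- adjust to the real log location.
LOG="${LOG:-/var/log/app/app_errors.log}"
STATE="${STATE:-/var/tmp/app_errors.offset}"

ship_delta() {
    last=$(cat "$STATE" 2>/dev/null || echo 0)
    size=$(wc -c < "$LOG")
    if [ "$size" -lt "$last" ]; then
        # The file shrank, so it was rotated: resend from the start.
        last=0
    fi
    # Print only the bytes after the stored offset (tail -c +N is 1-based).
    tail -c +"$((last + 1))" "$LOG"
    echo "$size" > "$STATE"
}
```

This would be run from cron every 5 minutes, with the output piped into whatever transport reaches the Splunk host. Note the rotation check only catches the case where the file shrinks; a rotation that leaves the file at exactly the same size would be missed.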
Note that using a universal forwarder is not an option: I have no permission to open port 9997 (or any other port) for the forwarder on the production environment, but I do have the ability to FTP this file from the production server to the Splunk instance.
The problem:
My preferred solution is the second, but I don't know how to monitor only the delta changes in that file. How will Splunk index only the new events and detect those lines without any duplication?
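If I understand Splunk's monitor input correctly, it already does delta tracking on its own: for each monitored file it keeps a checkpoint in the "fishbucket" (a CRC of the first 256 bytes of the file, controlled by initCrcLength, plus the byte offset it has read up to). As long as each FTP'd copy is a strict superset of the previous one (same head, new events appended), Splunk should resume from the stored offset and index only the new events. A sketch of the input, with a placeholder path and sourcetype:

```ini
# inputs.conf -- path and sourcetype are placeholders
[monitor:///opt/uploads/app_errors.log]
sourcetype = app_errors
disabled = false
```

The catch is rotation: once the head of the file (event a in the example above) rotates away, the initial CRC no longer matches, Splunk treats it as a brand-new file, and everything still in it would be re-indexed as duplicates. It also seems safer to FTP to a temporary name and then mv it into place, so Splunk never reads a half-transferred file.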
All ideas are welcome and appreciated.
Thanks,
Roy