Getting Data In

Monitoring the delta changes (new entries) in an error log

royimad
Builder

Hello Splunk Expert,

The situation:
I have a log file of around 10 MB generated from web application errors. This log file contains the sequence of errors that appeared on the online system. Each time an error appears on the system, a number of lines is added to the end of the file. The structure of each event is as follows: a header, followed by a < Severity >, followed by a Java stack trace.

The structure of the file at time c:

line 1:event a x ---
line 4:event b y ---
last line:event c z ---

The structure of the file at time d could be:
line 4:event b y ---
line 5:event c z ---
last line: event d d ---
Event a is rotated to another file only when the log reaches 10 MB.

Or the log file could simply be:
line 1:event a x ---
line 4:event b y ---
line 5:event c z ---
last line: event d d ---

The requirement is to automate the monitoring of this file into Splunk.

The two solutions I intend to implement are:
1- First solution: send only the delta changes to the Splunk server, by writing a shell script that extracts the delta changes every 5 minutes (the modifications made to the file in the last 5 minutes).
2- Second solution: send the entire file via FTP to the Splunk instance and let Splunk monitor the changes.
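The first solution's delta extraction can be sketched as a small POSIX shell function that keeps a byte-offset cursor in a state file and prints only what was appended since the last run. The function name, paths, and rotation heuristic below are my own assumptions, not part of the original setup:

```shell
# emit_delta LOGFILE STATEFILE (hypothetical helper, not a Splunk tool):
# prints only the bytes appended to LOGFILE since the previous call,
# tracking a byte-offset cursor in STATEFILE.
emit_delta() {
    log=$1
    state=$2

    last=0
    [ -f "$state" ] && last=$(cat "$state")

    size=$(wc -c < "$log")

    # If the file shrank, assume it was rotated and re-read from byte 0.
    [ "$size" -lt "$last" ] && last=0

    # tail -c +N prints from the Nth byte (1-indexed); head caps the
    # output at the size we measured, so lines written while we run
    # are left for the next cycle instead of being emitted twice.
    tail -c +"$((last + 1))" "$log" | head -c "$((size - last))"

    printf '%s\n' "$size" > "$state"
}
```

A cron entry calling this every 5 minutes and shipping its output would then implement the "modifications in the last 5 minutes" part of solution 1.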

Note that using a universal forwarder is not an option: I have no permission to open port 9997 (or any other port) for the forwarder on the production environment, but I do have the possibility of FTPing this file from the production server to the Splunk instance.

The problem:
My preferred solution is the second one, but I don't know how to monitor only the delta changes in that file. How will Splunk index only the new events and detect those lines without any duplication?
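For context on the second solution: Splunk's monitor input already tracks, per file, a checksum of the file's beginning plus a seek pointer (kept in the internal "fishbucket" index), so when lines are appended to an already-indexed file, only the appended part is indexed. A minimal monitor stanza, with an assumed destination path and sourcetype, might look like:

```ini
# inputs.conf on the Splunk instance; path and sourcetype are illustrative.
[monitor:///opt/incoming/errors.log]
sourcetype = web_app_errors
index = main
```

The catch with repeated full-file FTP transfers is that each transfer must append to (or byte-identically overwrite) the same file: if the beginning of the file changes, for example after rotation, the stored checksum no longer matches and Splunk treats it as a new file, re-indexing it from the start.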

All ideas are welcome and appreciated.

Thanks,
Roy


kristian_kolb
Ultra Champion

I think that you should try to:
a) have a forwarder on the machine. Even if it's a production machine, there are bound to be other agents running on it (antivirus, backup, Nagios, SCOM, etc.), so you have a case for arguing there.
b) as an alternative, try to reconfigure so that the log file is rotated every X minutes, and FTP it away to a Splunk machine (forwarder or indexer) that can process it.

/K
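Option b pairs naturally with Splunk's batch input on the receiving side: files dropped into a directory are indexed once and then deleted, which sidesteps the duplicate-detection problem, since each rotated chunk arrives and is consumed exactly once. The directory path and sourcetype below are assumptions:

```ini
# inputs.conf on the Splunk instance receiving the FTP'd, rotated chunks.
# move_policy = sinkhole makes Splunk delete each file after indexing it,
# so a chunk can never be indexed twice.
[batch:///opt/ftp_dropbox/errors]
move_policy = sinkhole
sourcetype = web_app_errors
index = main
```

This only works cleanly if each uploaded file contains its events exactly once, i.e. the rotation on the production side must cut the file at event boundaries rather than re-sending overlapping content.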


royimad
Builder

Thanks, Kris! This was very helpful.


kristian_kolb
Ultra Champion

Well, hmm... if you could make the file transfer append to an existing file, and only start on a new destination file when the source file has been rotated... but it sounds like there are things that could go wrong, and then the blame-shifting would start.

As for encryption, SSL, etc., there is built-in support for these concepts. You'll have to generate new SSL keys and configure Splunk to use them.

http://docs.splunk.com/Documentation/Splunk/5.0.1/Security/ConfigureSplunkforwardingtousesignedcerti...


royimad
Builder

a) is not an option for me at the moment; otherwise I would need a security proof of concept (encryption, SSL, etc.).

b) Rotating every X minutes could be part of the first solution. What about monitoring the delta changes from Splunk directly? Do you think this could be done?
