I have an app that is indexing data on a heavy forwarder. The text file has a mix of headers and data, the data containing timestamps. I'd like to exclude the headers before the heavy forwarder reads them and throws timestamp errors. Is there any way to do this? Here's what the data looks like.
statusdesclong time probeid responsetime status statusdesc
-------------- ---- ------- ------------ ------ ----------
www-ber 10/28/2013 17:24 34 874 up OK
www-ber 10/28/2013 17:23 64 1763 up OK
props.conf
[sourcetype_name_here]
TIME_PREFIX = \s
TIME_FORMAT = %m/%d/%Y %H:%M
SHOULD_LINEMERGE = false
TRANSFORMS-0_null_queue = nullq_header, nullq_dash
REPORT-0_field_kv = field_kv
transforms.conf
[nullq_header]
REGEX = statusdesclong
DEST_KEY = queue
FORMAT = nullQueue
[nullq_dash]
REGEX = ^\-\-\-\-
DEST_KEY = queue
FORMAT = nullQueue
[field_kv]
DELIMS = "\t"
FIELDS = statusdesclong, time, probeid, responsetime, status, statusdesc
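As a quick sanity check outside Splunk, you can try the two nullQueue patterns against the sample lines from the question. This is just an illustrative Python sketch (the regex strings and sample lines are taken from this thread; everything else is scaffolding):

```python
import re

# Sample lines from the monitored file (copied from the question)
lines = [
    "statusdesclong time probeid responsetime status statusdesc",
    "-------------- ---- ------- ------------ ------ ----------",
    "www-ber 10/28/2013 17:24 34 874 up OK",
]

# The two nullQueue patterns from transforms.conf above
nullq_header = re.compile(r"statusdesclong")
nullq_dash = re.compile(r"^\-\-\-\-")

for line in lines:
    # A line matching either pattern would be routed to nullQueue (dropped)
    dropped = bool(nullq_header.search(line) or nullq_dash.search(line))
    print(f"{'DROP' if dropped else 'KEEP'}: {line}")
```

Both header lines match and would be dropped, while the data rows match neither pattern and are kept.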
Ah, correction: TIME_PREFIX = \t
Well, it DID eliminate the headers from my data, thank you! However, I still see the errors in the splunkd logs, and it also seems that this may be delaying indexing of the data by 5-10 minutes. Here is the error I see. Is there any way to avoid it?
DateParserVerbose - Failed to parse timestamp. Defaulting to timestamp of previous event
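One likely explanation is that timestamp extraction runs earlier in the ingestion pipeline than the nullQueue transforms, so the header lines are still fed to the date parser even though they never reach the index. A rough Python sketch of what the parser faces on each line (the TIME_FORMAT is taken from the props.conf above; the fallback return value is an assumption that mirrors the log message, not Splunk's actual code):

```python
from datetime import datetime

TIME_FORMAT = "%m/%d/%Y %H:%M"  # from props.conf above

def try_parse(candidate):
    """Attempt to parse a timestamp; None stands in for 'fall back
    to the timestamp of the previous event', as the warning describes."""
    try:
        return datetime.strptime(candidate, TIME_FORMAT)
    except ValueError:
        return None

print(try_parse("10/28/2013 17:24"))  # data row: parses cleanly
print(try_parse("time"))              # header fragment: None -> would log the warning
```

So the warning is expected for the header lines themselves; the dropped events still produce the parse attempt before they are discarded.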
That is correct.
Thank you for your answer. I'm assuming this is done on the heavy forwarder, correct? I'll give it a shot there and let you know.
Did that help?