Getting Data In

Failed to parse timestamp on Heavy Forwarder

richnavis
Contributor

I have an app that is indexing data on a heavy forwarder. The text file is a mix of headers and data, and the data contains timestamps. I'd like to EXCLUDE the headers before the heavy forwarder reads them and throws timestamp errors. Is there any way to do this? Here's what the data looks like.

statusdesclong time probeid responsetime status statusdesc
-------------- ---- ------- ------------ ------ ----------
www-ber 10/28/2013 17:24 34 874 up OK
www-ber 10/28/2013 17:23 64 1763 up OK

0 Karma
1 Solution

ShaneNewman
Motivator

props.conf

[sourcetype_name_here]
TIME_PREFIX = \s
TIME_FORMAT = %m/%d/%Y %H:%M
SHOULD_LINEMERGE = false
TRANSFORMS-0_null_queue = nullq_header, nullq_dash
REPORT-0_field_kv = field_kv

transforms.conf

[nullq_header]
REGEX = statusdesclong
DEST_KEY = queue
FORMAT = nullQueue

[nullq_dash]
REGEX = ^\-\-\-\-
DEST_KEY = queue
FORMAT = nullQueue

[field_kv]
DELIMS = "\t"
FIELDS = statusdesclong, time, probeid, responsetime, status, statusdesc
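As a sanity check outside Splunk, the two nullQueue regexes can be tried against the sample lines. This is a hypothetical Python sketch, not part of the answer; Splunk applies a transform's REGEX as an unanchored match (like `re.search`), so `statusdesclong` matches anywhere in a line while `^\-\-\-\-` only matches at the start.

```python
import re

# Regexes from the nullq_header and nullq_dash transforms above
header_re = re.compile(r"statusdesclong")
dash_re = re.compile(r"^\-\-\-\-")

lines = [
    "statusdesclong time probeid responsetime status statusdesc",
    "-------------- ---- ------- ------------ ------ ----------",
    "www-ber 10/28/2013 17:24 34 874 up OK",
]

# Lines matching either regex are routed to nullQueue, i.e. dropped
dropped = [l for l in lines if header_re.search(l) or dash_re.search(l)]
print(len(dropped))  # → 2 (the header and dash lines; the data line survives)
```

If the regexes behave here, they should drop the same lines on the heavy forwarder.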


ShaneNewman
Motivator

Ah, TIME_PREFIX = \t

0 Karma

richnavis
Contributor

Well.. it DID eliminate the headers from my data, thank you! However, I still see the errors in the splunkd logs, and it also seems this may be delaying indexing of the data by 5-10 minutes. Here are the errors I see. Is there any way to avoid these?

DateParserVerbose - Failed to parse timestamp. Defaulting to timestamp of previous event
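For context, those warnings come from lines that carry no timestamp at all. A small Python sketch (assuming only that Python's `strptime` uses the same directive style as Splunk's TIME_FORMAT) shows which lines can and cannot be parsed:

```python
from datetime import datetime

# TIME_FORMAT from the props.conf above, strptime-style directives
TIME_FORMAT = "%m/%d/%Y %H:%M"

def parses(text):
    """Return True if the text matches TIME_FORMAT exactly."""
    try:
        datetime.strptime(text, TIME_FORMAT)
        return True
    except ValueError:
        return False

# The data rows carry a parseable timestamp...
ts = datetime.strptime("10/28/2013 17:24", TIME_FORMAT)
print(ts)  # 2013-10-28 17:24:00

# ...but header and separator lines do not, so Splunk logs
# "Failed to parse timestamp" and defaults those events to the
# previous event's timestamp.
print(parses("statusdesclong time probeid"))  # False
```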

0 Karma

ShaneNewman
Motivator

That is correct.

0 Karma

richnavis
Contributor

Thank you for your answer.. I'm assuming this is done on the heavy forwarder, correct? I will give it a shot there and let you know..

0 Karma

ShaneNewman
Motivator

Did that help?

0 Karma