Getting Data In

Failed to parse timestamp on Heavy Forwarder

richnavis
Contributor

I have an app that is indexing data on a Heavy Forwarder. The text file has a mix of headers and data, and the data contains timestamps. I'd like to exclude the headers before the Heavy Forwarder reads them and throws timestamp errors. Is there any way to do this? Here's what the data looks like.

statusdesclong time probeid responsetime status statusdesc

-------------- ---- ------- ------------ ------ ----------

www-ber 10/28/2013 17:24 34 874 up OK

www-ber 10/28/2013 17:23 64 1763 up OK

1 Solution

ShaneNewman
Motivator

props.conf

[sourcetype_name_here]
TIME_PREFIX = \s
TIME_FORMAT = %m/%d/%Y %H:%M
SHOULD_LINEMERGE = false
TRANSFORMS-0_null_queue = nullq_header, nullq_dash
REPORT-0_field_kv = field_kv

transforms.conf

[nullq_header]
REGEX = statusdesclong
DEST_KEY = queue
FORMAT = nullQueue

[nullq_dash]
REGEX = ^\-\-\-\-
DEST_KEY = queue
FORMAT = nullQueue

[field_kv]
DELIMS = "\t"
FIELDS = statusdesclong, time, probeid, responsetime, status, statusdesc
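
For these stanzas to apply, the sourcetype in the props.conf stanza header has to match whatever the input on the heavy forwarder assigns. A minimal inputs.conf sketch, assuming a hypothetical monitor path (the thread doesn't specify one) and keeping the same placeholder sourcetype name:

inputs.conf

[monitor:///opt/probes/probe_status.log]
# hypothetical path - substitute the actual file being monitored
sourcetype = sourcetype_name_here
disabled = false

With that in place, the sourcetype value and the [sourcetype_name_here] props.conf stanza line up.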

ShaneNewman
Motivator

Ah, TIME_PREFIX = \t
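
Folding that correction back into the answer above, the timestamp in the sample follows a tab-delimited field, so the revised stanza would look roughly like this (same placeholder sourcetype, only TIME_PREFIX changed):

[sourcetype_name_here]
TIME_PREFIX = \t
TIME_FORMAT = %m/%d/%Y %H:%M
SHOULD_LINEMERGE = false
TRANSFORMS-0_null_queue = nullq_header, nullq_dash
REPORT-0_field_kv = field_kv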


richnavis
Contributor

Well... it DID eliminate the headers from my data, thank you! However, I still see the errors in the splunkd logs. It also seems that this may be delaying indexing of the data by 5-10 minutes. Here are the errors I see. Is there any way to avoid these?

DateParserVerbose - Failed to parse timestamp. Defaulting to timestamp of previous event
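
These warnings are expected for the header and dashed lines: on a heavy forwarder, timestamp extraction runs before the nullQueue transforms route those lines away, so the date parser still sees them even though they never get indexed. If the log noise itself is the concern, an approach sometimes used (not discussed in this thread, so treat it as an assumption) is to lower the verbosity of that logging category in $SPLUNK_HOME/etc/log-local.cfg on the heavy forwarder and restart splunkd:

[splunkd]
# drop "Failed to parse timestamp" WARN messages; only ERROR and above are logged for this category
category.DateParserVerbose = ERROR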


ShaneNewman
Motivator

That is correct.


richnavis
Contributor

Thank you for your answer. I'm assuming this is done on the Heavy Forwarder, correct? I will give it a shot there and let you know.
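
For reference, since the heavy forwarder is doing the parsing, the stanzas above would typically go into an app's local directory there, and splunkd needs a restart to pick them up. A sketch, with the app name as a hypothetical placeholder:

$SPLUNK_HOME/etc/apps/<your_app>/local/props.conf
$SPLUNK_HOME/etc/apps/<your_app>/local/transforms.conf
$SPLUNK_HOME/bin/splunk restart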


ShaneNewman
Motivator

Did that help?
