Getting Data In

Failed to parse timestamp on Heavy Forwarder

richnavis
Contributor

I have an app that is indexing data on a Heavy Forwarder. The text file is a mix of headers and data rows, and the data rows contain timestamps. I'd like to EXCLUDE the headers before the Heavy Forwarder reads them and throws timestamp errors. Is there any way to do this? Here's what the data looks like.

statusdesclong   time              probeid  responsetime  status  statusdesc
--------------   ----              -------  ------------  ------  ----------
www-ber          10/28/2013 17:24  34       874           up      OK
www-ber          10/28/2013 17:23  64       1763          up      OK

0 Karma
1 Solution

ShaneNewman
Motivator

props.conf

[sourcetype_name_here]
TIME_PREFIX = \s
TIME_FORMAT = %m/%d/%Y %H:%M
SHOULD_LINEMERGE = false
TRANSFORMS-0_null_queue = nullq_header, nullq_dash
REPORT-0_field_kv = field_kv

transforms.conf

[nullq_header]
REGEX = statusdesclong
DEST_KEY = queue
FORMAT = nullQueue

[nullq_dash]
REGEX = ^\-\-\-\-
DEST_KEY = queue
FORMAT = nullQueue

[field_kv]
DELIMS = "\t"
FIELDS = statusdesclong, time, probeid, responsetime, status, statusdesc

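Outside of Splunk, a quick Python sketch can sanity-check that the regexes, delimiter, and TIME_FORMAT above actually line up with the sample data. This is only an illustration of the matching logic, not how Splunk executes props/transforms; the tab-delimited sample line is an assumption based on DELIMS = "\t".

```python
import re
from datetime import datetime

# Sample lines from the original post (event assumed tab-delimited per DELIMS).
header = "statusdesclong time probeid responsetime status statusdesc"
dashes = "-------------- ---- ------- ------------ ------ ----------"
event  = "www-ber\t10/28/2013 17:24\t34\t874\tup\tOK"

# [nullq_header] and [nullq_dash] would route these rows to nullQueue:
assert re.search(r"statusdesclong", header)   # header row matches
assert re.match(r"^\-\-\-\-", dashes)          # separator row matches

# [field_kv]: split on tab and pair with the FIELDS list.
names = ["statusdesclong", "time", "probeid",
         "responsetime", "status", "statusdesc"]
record = dict(zip(names, event.split("\t")))

# TIME_FORMAT = %m/%d/%Y %H:%M parses the time field cleanly.
ts = datetime.strptime(record["time"], "%m/%d/%Y %H:%M")
print(ts)  # 2013-10-28 17:24:00
```

If either assert fails on your real data, the corresponding transform stanza won't match it in Splunk either, which is a cheap way to debug before restarting the forwarder.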
ShaneNewman
Motivator

Ah, change that to TIME_PREFIX = \t — the timestamp follows a tab, not just any whitespace.
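To see why \t works where \s can misfire, here is a rough Python illustration of what TIME_PREFIX asks Splunk to do: find the prefix pattern, then try TIME_FORMAT starting right after it. This is a simplified sketch, not Splunk's actual parser, and the tab-delimited event is an assumption.

```python
import re
from datetime import datetime

event = "www-ber\t10/28/2013 17:24\t34\t874\tup\tOK"

# TIME_PREFIX = \t : anchor just past the first tab,
# which lands exactly at the start of the timestamp.
m = re.search(r"\t", event)
rest = event[m.end():]

# TIME_FORMAT = %m/%d/%Y %H:%M applied from that point.
ts = datetime.strptime(rest[:16], "%m/%d/%Y %H:%M")
print(ts)  # 2013-10-28 17:24:00
```

With \s the anchor could match a space inside some other field and leave the parser looking for a date in the wrong place; \t pins it to the column boundary.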

0 Karma

richnavis
Contributor

Well.. it DID eliminate the headers from my data, thank you! However, I still see the errors in the splunkd logs, and it also seems that this may be delaying indexing of the data by 5-10 minutes. Here is the error I see. Is there any way to avoid it?

DateParserVerbose - Failed to parse timestamp. Defaulting to timestamp of previous event

0 Karma

ShaneNewman
Motivator

That is correct.

0 Karma

richnavis
Contributor

Thank you for your answer.. I'm assuming this is done on the Heavy Forwarder, correct? I will give it a shot on the Heavy Forwarder and let you know..

0 Karma

ShaneNewman
Motivator

Did that help?

0 Karma