Getting Data In

Failed to parse timestamp on Heavy Forwarder

richnavis
Contributor

I have an app that is indexing data on a Heavy Forwarder. The text file has a mix of headers and data, with the data rows containing timestamps. I'd like to EXCLUDE the headers before the Heavy Forwarder reads the file and throws timestamp errors. Is there any way to do this? Here's what the data looks like.

statusdesclong time probeid responsetime status statusdesc

-------------- ---- ------- ------------ ------ ----------

www-ber 10/28/2013 17:24 34 874 up OK

www-ber 10/28/2013 17:23 64 1763 up OK

0 Karma
1 Solution

ShaneNewman
Motivator

props.conf

[sourcetype_name_here]
TIME_PREFIX = \s
TIME_FORMAT = %m/%d/%Y %H:%M
SHOULD_LINEMERGE = false
TRANSFORMS-0_null_queue = nullq_header, nullq_dash
REPORT-0_field_kv = field_kv

transforms.conf

[nullq_header]
REGEX = statusdesclong
DEST_KEY = queue
FORMAT = nullQueue

[nullq_dash]
REGEX = ^\-\-\-\-
DEST_KEY = queue
FORMAT = nullQueue

[field_kv]
DELIMS = "\t"
FIELDS = statusdesclong, time, probeid, responsetime, status, statusdesc
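
These stanzas only take effect if sourcetype_name_here matches the sourcetype assigned to the monitored file. A minimal inputs.conf sketch on the Heavy Forwarder might look something like this (the path and sourcetype name here are placeholders, not from the original post):

inputs.conf

# hypothetical monitor stanza -- replace the path and sourcetype with your own
[monitor:///opt/probe_data/probe_status.log]
sourcetype = sourcetype_name_here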


ShaneNewman
Motivator

Ah, TIME_PREFIX = \t

0 Karma
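
Putting that correction together with the earlier answer, the props.conf stanza would presumably end up looking like this (the sourcetype name is still a placeholder):

props.conf

[sourcetype_name_here]
TIME_PREFIX = \t
TIME_FORMAT = %m/%d/%Y %H:%M
SHOULD_LINEMERGE = false
TRANSFORMS-0_null_queue = nullq_header, nullq_dash
REPORT-0_field_kv = field_kv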

richnavis
Contributor

Well.. it DID eliminate the headers from my data, thank you! However, I still see the errors in the splunkd logs. It also seems that this may be delaying indexing of the data by 5-10 minutes. Here are the errors I see. Is there any way to avoid these?

DateParserVerbose - Failed to parse timestamp. Defaulting to timestamp of previous event

0 Karma

ShaneNewman
Motivator

That is correct.

0 Karma

richnavis
Contributor

Thank you for your answer.. I'm assuming this is done on the Heavy Forwarder, correct? I will give it a shot there and let you know..

0 Karma
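
For context: parse-time settings like these apply on the first full Splunk instance that handles the data, which in this setup is the Heavy Forwarder. A plausible layout there would be something like the following (the app name is a placeholder), and a restart of the forwarder is typically needed for parse-time changes to take effect:

$SPLUNK_HOME/etc/apps/your_app/local/props.conf
$SPLUNK_HOME/etc/apps/your_app/local/transforms.conf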

ShaneNewman
Motivator

Did that help?

0 Karma