Getting Data In

Using DATETIME_CONFIG=NONE with a TIME_FORMAT fails to extract dates

khevans
Path Finder

I'm having trouble parsing a log file with a format similar to this:

2019-07-08 14:03:59.335 INFO [Filename.java:91] Processing the following from Queue 
------------------------------------
Some text can go here

------------------------------------
2019-07-08 14:03:59.340 INFO [Filename.java:118] Received, will be ignored
2019-07-08 14:03:60.340 INFO [SomethingElse.java:118] Received, will be ignored

Note that each event always starts with this date format and can span several lines.

I am currently trying this as my sourcetype in props.conf, but it does not extract the date:

SHOULD_LINEMERGE=true
LINE_BREAKER=([\r\n]+)
NO_BINARY_CHECK=true
BREAK_ONLY_BEFORE=\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d{3}\s[\w+\s\[\w\.]+(\:\d+)?\]
# \d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} also fails
DATETIME_CONFIG=NONE
TIME_FORMAT=%Y-%m-%d %H:%M:%S
TIME_PREFIX=^

It gives me the "fails to parse timestamp, reverting to modtime" message and I don't know why. I am testing with a sample of only 3 lines, and each line matches that format, but none of the dates are extracted. Why is this?

1 Solution

Azeemering
Builder

DATETIME_CONFIG = NONE tells Splunk to skip timestamp extraction entirely, which is why you see the "reverting to modtime" message; remove that line. The following will work as a minimum. Make sure to include all 8 golden props.

[your_sourcetype]
SHOULD_LINEMERGE=false
LINE_BREAKER=([\n\s]*)\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\.\d{3}
NO_BINARY_CHECK=true
TIME_FORMAT=%Y-%m-%d %H:%M:%S.%3N
TIME_PREFIX=^
MAX_TIMESTAMP_LOOKAHEAD=23
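As a quick sanity check outside Splunk, you can confirm that this format string matches the sample timestamp. Note that Python's strptime uses %f for fractional seconds where Splunk's extension uses %3N; the sketch below only illustrates the equivalent format.

```python
from datetime import datetime

# Sample timestamp taken from the log above
ts = "2019-07-08 14:03:59.335"

# Splunk: %Y-%m-%d %H:%M:%S.%3N  ->  Python equivalent uses %f
dt = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S.%f")

# MAX_TIMESTAMP_LOOKAHEAD=23 covers exactly the 23 characters of the timestamp
print(len(ts))
print(dt.isoformat())
```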

You should always set the following 8:

TIME_PREFIX = regex of the text that leads up to the timestamp
MAX_TIMESTAMP_LOOKAHEAD = how many characters for the timestamp
TIME_FORMAT = strftime format of the timestamp
# For multiline events, SHOULD_LINEMERGE should always be set to false;
# breaking events with LINE_BREAKER instead is much faster
SHOULD_LINEMERGE = false
# Wherever the LINE_BREAKER regex matches, Splunk considers the start
# of the first capturing group to be the end of the previous event
# and considers the end of the first capturing group to be the start of the next event.
# Defaults to ([\r\n]+), meaning data is broken into an event for each line
LINE_BREAKER = regular expression for event breaks
TRUNCATE = 999999 (always a high number / not 0)
# The following attributes improve load balancing from a Universal Forwarder.
# Note that the EVENT_BREAKER settings apply to Splunk Universal Forwarder
# instances only (version 6.5.0 and later)
EVENT_BREAKER_ENABLE = true
EVENT_BREAKER = regular expression for event breaks
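To see how the first-capturing-group semantics of LINE_BREAKER play out on the sample log, here is a small Python sketch that emulates the behavior with re.split: the whitespace in the capturing group is consumed as the separator, and a lookahead keeps the timestamp at the start of the next event. This is an illustration only, not how Splunk is implemented internally.

```python
import re

# The sample log from the question
log = (
    "2019-07-08 14:03:59.335 INFO [Filename.java:91] Processing the following from Queue\n"
    "------------------------------------\n"
    "Some text can go here\n"
    "\n"
    "------------------------------------\n"
    "2019-07-08 14:03:59.340 INFO [Filename.java:118] Received, will be ignored\n"
    "2019-07-08 14:03:60.340 INFO [SomethingElse.java:118] Received, will be ignored\n"
)

# Emulate LINE_BREAKER=([\n\s]*)\d{4}-... : the whitespace captured by the
# group is discarded, and the timestamp (matched here by a lookahead so it is
# not consumed) begins the next event.
breaker = re.compile(r"[\n\s]*(?=\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\.\d{3})")
events = [e for e in breaker.split(log) if e]

for e in events:
    print(repr(e))
# Yields three events; the first spans multiple lines, including
# "Some text can go here".
```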


