Getting Data In

How to correct timestamp recognition that is skewed by output from class "java.util.logging.Logger"

Communicator

Hello Splunkers,

We have events coming in from our logs with this stamp right at the beginning of each event.
That is good...

Event Time Stamp
11/30/15:11:16 AM

Unfortunately, Splunk gets confused by the year and believes it is the start of the time
stamp. See below: the 15 becomes 3:00 PM. I think I just need to somehow get rid of the colon
after the year and get a space in there before it is read, and I think I will be good.

Would I need a props.conf with a SED statement to strip it out at index time? Any ideas to
support my theory would be greatly appreciated.

Splunk Output
11/30/15 3:11:16.000 PM
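For what it's worth, the misread can be reproduced with Python's strptime (a sketch of the interpretation Splunk appears to be making; Splunk's own parser is more lenient, but the format logic is similar):

```python
from datetime import datetime

# Splunk appears to read "11/30/15" as a complete month/day/year date
# and then interprets the trailing time in 12-hour form, which is why
# hour 15 surfaces as 3:11:16 PM.
dt = datetime.strptime("11/30/15 3:11:16 PM", "%m/%d/%y %I:%M:%S %p")
print(dt)  # 2015-11-30 15:11:16
```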

Thanks,
Daniel MacGillivray


Re: How to correct timestamp recognition that is skewed by output from class "java.util.logging.Logger"

Motivator

As I mentioned in a comment above, SEDCMD is evaluated after timestamp extraction, so you can't fix this via a transform. You can, however, explicitly tell Splunk what the time format is. (Details on How Indexing Works)

props.conf

[your_sourcetype]
TIME_PREFIX=^
TIME_FORMAT=%m/%d/%I:%M:%S %p
MAX_TIMESTAMP_LOOKAHEAD=17

This tells Splunk that the timestamp comes at the beginning of the event (TIME_PREFIX), it has the above strftime format, and it extends, at most, 17 characters into the event. Everything it needs to know to get the timestamp right.

This may still not work, as without a year it's not a valid timestamp, so Splunk may still do funny things with it. The real fix is to get your developers to log in a non-ridiculous format. (What Java devs have against ISO standard timestamps, I'll never figure out)
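The missing-year problem is easy to see with Python's strptime, whose directives mirror the strftime syntax Splunk uses (a sketch; Python's %I strictly requires 1-12, so %H is used here for the 24-hour value):

```python
from datetime import datetime

# With no %y in the format, the parser has no year to work with and
# falls back to a default (1900 in Python). Splunk has its own
# fallback heuristics, which is where the "funny things" come from.
dt = datetime.strptime("11/30/15:11:16", "%m/%d/%H:%M:%S")
print(dt)  # 1900-11-30 15:11:16
```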

I would also set a couple other things:

[your_sourcetype]
TIME_PREFIX=^
TIME_FORMAT=%m/%d/%I:%M:%S %p
MAX_TIMESTAMP_LOOKAHEAD=17
SHOULD_LINEMERGE=false
LINE_BREAKER=([\r\n]+)(?:\d{2}\/\d{2}\/\d{2}:\d{2}:\d{2})
TRUNCATE=999999

This further tells Splunk how to handle the incoming events. Specifically, we're telling Splunk where events begin and end explicitly, so it doesn't have to figure it out. (You'll appreciate this when it stops Splunk from doing bad things with stacktraces)
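The effect of that LINE_BREAKER can be sketched with Python's re module (the log lines below are made up for illustration; Splunk discards capture group 1 as the event boundary, which a lookahead emulates here):

```python
import re

# Hypothetical java.util.logging output: a stacktrace line that must
# stay glued to the event it belongs to.
raw = (
    "11/30/15:11:16 AM com.example.Foo doWork\n"
    "SEVERE: something broke\n"
    "\tat com.example.Foo.doWork(Foo.java:42)\n"
    "11/30/15:11:17 AM com.example.Foo doWork\n"
    "INFO: recovered"
)

# Break only where a newline is followed by the timestamp pattern,
# mirroring LINE_BREAKER=([\r\n]+)(?:\d{2}\/\d{2}\/\d{2}:\d{2}:\d{2})
events = re.split(r"[\r\n]+(?=\d{2}/\d{2}/\d{2}:\d{2}:\d{2})", raw)
print(len(events))  # 2 -- the stacktrace stays inside the first event
```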

Ideally, you should be setting all of these for every new sourcetype you ingest whenever possible.


Communicator

Thanks, Emiller42. All very helpful. I will try it in dev and let you know how it works out. Unfortunately, it is third-party software we are dealing with in this case. Good point on the ISO stamp. I hope to carry the torch and help others down the road like you have!


Motivator

Good luck! Depending on how the third-party app is set up, you may still be able to define the logging format. (It may just be defined in a config file that you can update.) Worth looking into.
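For example, if the app uses java.util.logging and reads a logging.properties file, the timestamp format can often be changed there without touching code. A sketch (the property name is standard JUL since Java 7; the file location depends on how the app launches the JVM):

```properties
# logging.properties -- switch SimpleFormatter to an ISO-like stamp:
# %1 = timestamp, %3 = logger name, %4 = level, %5 = message, %6 = throwable
java.util.logging.SimpleFormatter.format=%1$tF %1$tT %4$s %3$s: %5$s%6$s%n
```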


Communicator

Thanks, Emiller42. It is working now!! Good way to end the day!!


Communicator

P.S. I used your added LINE_BREAKER settings.


Esteemed Legend

SEDCMD can be used to change the raw data, but not before the raw data is used to determine the timestamp (my previous answer was in error).


Communicator

Thanks for jumping in on this; it was much appreciated! 🙂
