Getting Data In

TIME_FORMAT ignored

echalex
Builder

Hi,

I'm trying to configure timestamp recognition for a sourcetype, in order to keep Splunk from recognising other timestamps inside an event's raw data.

# props.conf:
[my_type]
TIME_PREFIX = ^[A-Za-z]{3}\s*
TIME_FORMAT = %b %d %H:%M:%S %Y
# Also tried:
# TIME_PREFIX = ^
# TIME_FORMAT = %a %b %d %H:%M:%S %Y
MAX_TIMESTAMP_LOOKAHEAD = 25
SHOULD_LINEMERGE = False
LINE_BREAKER = ([\n\r]+)(?=\w{3} \w{3} \d{1,2} \d{1,2}:\d{1,2}:\d{1,2} \d{4})
TRUNCATE = 999999
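As a quick offline sanity check (outside Splunk), the TIME_FORMAT string and a paren-balanced version of the LINE_BREAKER lookahead can be exercised in Python, since both use strptime-style codes and PCRE-like regexes. The sample strings are taken from the events further down; everything else is illustrative:

```python
import re
from datetime import datetime

# TIME_FORMAT uses strptime-style codes, so Python can check it directly.
ts = "Tue Aug 21 23:03:51 2012"
assert datetime.strptime(ts, "%a %b %d %H:%M:%S %Y") == datetime(2012, 8, 21, 23, 3, 51)

# LINE_BREAKER with the lookahead's closing ')' unescaped; the capture
# group is the text Splunk discards between events.
line_breaker = re.compile(
    r"([\n\r]+)(?=\w{3} \w{3} \d{1,2} \d{1,2}:\d{1,2}:\d{1,2} \d{4})"
)
log = ("Tue Aug 21 23:03:51 2012\nALTER SYSTEM ARCHIVE LOG\n"
       "Tue Aug 21 23:03:51 2012\nThread 1 cannot allocate new log, sequence 10216")
events = [part for part in line_breaker.split(log) if part.strip()]
print(len(events))  # 2: the breaker only fires before a "Tue Aug 21 ..." line
```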

Well, the actual timestamps do get recognised correctly, but I'm afraid Splunk also recognises other date strings, though not consistently. Sometimes these timestamps are part of an event and sometimes they trigger a new event.

Tue Aug 21 23:03:51 2012
ALTER SYSTEM ARCHIVE LOG
Tue Aug 21 23:03:51 2012
Thread 1 cannot allocate new log, sequence 10216
Private strand flush not complete
2012-07-22 17:49:47
Thread 1 advanced to log sequence 10215 (LGWR switch)
  Current log# 3 seq# 10215 mem# 0: /bla/bla/bla/bla
2012-07-22 23:01:51
Archived Log entry 10214 added for thread 1 sequence 10214 ID 0xb9999999 dest 1:
2012-07-22 23:03:51
ALTER SYSTEM ARCHIVE LOG
2012-07-22 23:03:51
Thread 1 cannot allocate new log, sequence 10216

So, in this sample, all of the July dates (in YYYY-mm-dd format) SHOULD be part of the event with the timestamp 'Tue Aug 21 23:03:51 2012', but this is not the case. One line does end up in that event, but the rest are split into separate events dated in July.

What am I missing? Whatever I do with TIME_PREFIX and TIME_FORMAT, the forwarder seems to completely ignore these.

Update
I tried the data preview feature, which parsed the events correctly. The resulting props.conf looks as follows:

[my_type]
MAX_TIMESTAMP_LOOKAHEAD = 25
NO_BINARY_CHECK = 1
TIME_FORMAT = %a %b %d %H:%M:%S %Y
TIME_PREFIX = ^
pulldown_type = 1

However, using this in the universal forwarder does not help me one bit... I get the same results.

1 Solution

echalex
Builder

Ok, I finally managed to find the answer myself. Unfortunately, the relevant bits are not explained in the props.conf documentation, but rather in the documentation about the pipeline.

Long story short: timestamp extraction happens in the parsing phase, and parsing is done on the indexer, not the forwarder. Unfortunately, this makes administration a bit more complicated and doesn't easily accommodate different timestamp formats used in different places. For example, log4j is quite common among Java applications, but there is no guarantee that its format is the same everywhere. (Quite the opposite, actually, since log4j puts heavy emphasis on customisation.)
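To make the placement concrete, here is a sketch of where the settings would go, assuming a universal forwarder sending to a single indexer (the sourcetype name and settings are just the ones from this thread):

# props.conf on the indexer (e.g. in $SPLUNK_HOME/etc/system/local/),
# not on the universal forwarder:
[my_type]
TIME_PREFIX = ^
TIME_FORMAT = %a %b %d %H:%M:%S %Y
MAX_TIMESTAMP_LOOKAHEAD = 25

The same stanza on a heavy forwarder would also work, since a heavy forwarder runs the parsing pipeline itself.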


bruceascot
Explorer

The following sample Splunk search converts a range of date formats to a common target format. At search time, _time can be rendered with several timeformat strings via the convert command, and the case() eval function picks the right one per sourcetype.

index=zip_logfiles

| convert timeformat="%A %e %B %Y" ctime(_time) AS formatOne
| convert timeformat="%e %B %Y" ctime(_time) AS formatTwo
| convert timeformat="%A %e %B" ctime(_time) AS formatThree
| convert timeformat="%A %e %Y" ctime(_time) AS formatFour
| eval my_date=case(sourcetype=="one", formatOne,
sourcetype=="two", formatTwo,
sourcetype=="three", formatThree,
sourcetype=="four", formatFour)
| stats sparkline count, sum(duration) as total_Durations by my_date
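The same try-several-formats idea can be sketched outside Splunk. This is purely illustrative, not Splunk's implementation; also note Python's strptime has no %e directive (space-padded day), so %d stands in for it here:

```python
from datetime import datetime

# Candidate formats, tried in order; mirrors the per-sourcetype formats above.
FORMATS = [
    "%A %d %B %Y",            # e.g. "Tuesday 21 August 2012"
    "%d %B %Y",               # e.g. "21 August 2012"
    "%a %b %d %H:%M:%S %Y",   # e.g. "Tue Aug 21 23:03:51 2012"
]

def parse_any(text):
    """Return the first successful parse, or None if no format matches."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(text, fmt)
        except ValueError:
            continue
    return None

print(parse_any("21 August 2012"))  # 2012-08-21 00:00:00
```

In Splunk itself the dispatch is done explicitly on sourcetype rather than by trial and error, which is cheaper and unambiguous when formats overlap.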



MarioM
Motivator

A heavy forwarder can do parsing, but a universal forwarder cannot, as per the docs: http://docs.splunk.com/Documentation/Splunk/latest/Deploy/Typesofforwarders#Forwarder_comparison
