Getting Data In

How can we set TIME_FORMAT in props.conf where the milliseconds vary in length?

ddrillic
Ultra Champion

We are trying to create a TIME_FORMAT where the milliseconds vary in length. Sometimes they are two digits and sometimes three. For example:

 2018-09-11 04:28:05.14
 2018-09-11 04:20:55.336

What can we do?

We have something like %Y-%m-%d %H:%M:%S.%2N, but that won't work when the milliseconds are three digits long.
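
For reference, a minimal sketch of the kind of props.conf stanza we mean (the sourcetype name is made up, and TIME_PREFIX = ^ assumes the timestamp sits at the start of each event):

[our_sourcetype]
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d %H:%M:%S.%2N
# As described above, %2N handles the two-digit events but not the three-digit ones.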

1 Solution

harsmarvania57
SplunkTrust

Hi @ddrillic,

You can use a custom datetime.xml for this kind of varying timestamp; see the documentation: http://docs.splunk.com/Documentation/SplunkCloud/latest/Data/Configuredatetimexml

If your sourcetype contains only those two date/time formats, then you can use the configuration below.

For testing purposes I created a datetime.xml in $SPLUNK_HOME/etc/apps/search/local/ with the following contents:

<!--   Version 4.0 -->

<!-- datetime.xml -->
<!-- This file contains the general formulas for parsing date/time formats. -->

<datetime>

<define name="test_1_date" extract="year, month, day">
        <text><![CDATA[(\d+)-(\d+)-(\d+)]]></text>
</define>

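<!-- The trailing (\d+) captures the subsecond digits whether there are two or three of them -->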
<define name="test_1_time" extract="hour, minute, second, subsecond">
        <text><![CDATA[-\d+-\d+\s(\d+):(\d+):(\d+)\.(\d+)]]></text>
</define>

<timePatterns>
      <use name="test_1_time"/>
</timePatterns>
<datePatterns>
      <use name="test_1_date"/>
</datePatterns>

</datetime>
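
To sanity-check those two regexes against your sample lines outside of Splunk, here is a quick Python sketch (not part of the Splunk configuration, just a hypothetical test):

import re

# Regexes copied from the datetime.xml defines above
date_re = re.compile(r"(\d+)-(\d+)-(\d+)")
time_re = re.compile(r"-\d+-\d+\s(\d+):(\d+):(\d+)\.(\d+)")

samples = [
    "2018-09-11 04:28:05.14",
    "2018-09-11 04:20:55.336",
]

for line in samples:
    year, month, day = date_re.search(line).groups()
    hour, minute, second, subsecond = time_re.search(line).groups()
    # The subsecond group is variable-width: "14" for the first line, "336" for the second
    print(year, month, day, hour, minute, second, subsecond)

Both lines match, with the subsecond group carrying two digits in the first case and three in the second.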

And a props.conf in $SPLUNK_HOME/etc/apps/search/local/ with the following contents:

[test_sourcetype]
DATETIME_CONFIG = /etc/apps/search/local/datetime.xml
NO_BINARY_CHECK = true
disabled = false
pulldown_type = true
MAX_TIMESTAMP_LOOKAHEAD = 23

I restarted Splunk and ingested the sample data, and it extracts the timestamps as 2018-09-11 04:28:05.140 (here Splunk interprets the two-digit subsecond .14 as 140 milliseconds, not 014) and 2018-09-11 04:20:55.336. MAX_TIMESTAMP_LOOKAHEAD = 23 covers the longest timestamp, 2018-09-11 04:20:55.336, which is 23 characters. You can give it a try in your test environment.

EDIT: Updated props.conf config (Added MAX_TIMESTAMP_LOOKAHEAD)


ddrillic
Ultra Champion

Gorgeous answer @harsmarvania57 - much appreciated.


ddrillic
Ultra Champion

Our Sales Engineer said -

-- I mean, the correct way is to fix the logs. Drifting time formats is pretty awful, and would usually indicate there should either be 2 log files or a problem in the code.

Otherwise, just set the TIME_PREFIX and let Splunk do the normal timestamp magic. Both should be automatically parsed, but test that first.
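
For completeness, a minimal sketch of that alternative (the sourcetype name is the same test one used above, and TIME_PREFIX = ^ is an assumption that the timestamp begins each event):

[test_sourcetype]
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 23
# No TIME_FORMAT, so Splunk's automatic timestamp recognition takes over.
# As noted above, both subsecond variants should be parsed automatically, but test first.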
