Getting Data In

Is there a configuration where I can set the DATETIME_FORMAT for Universal Forwarder?

gtonti
Explorer

I am using a Universal Forwarder to send data (log files) to Splunk.
My log files contain a timestamp at the beginning of each row. For example:

(07/09/2018 12:55:40) ;Info;........

The date/time should be read as 7 September 2018 12:55:40 (dd/MM/yyyy .....).
Splunk instead indexes the row as 9 July 2018 12:55:40 (MM/dd/yyyy ......).
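The ambiguity is easy to reproduce outside Splunk: TIME_FORMAT uses strptime-style format codes, so a quick Python check (a sketch, using the sample row above) shows the two competing readings of the same timestamp:

```python
from datetime import datetime

line = "(07/09/2018 12:55:40) ;Info;..."
ts = line[1:20]  # strip the leading "(" -> "07/09/2018 12:55:40"

# Intended (European) reading: day first -> 7 September 2018
intended = datetime.strptime(ts, "%d/%m/%Y %H:%M:%S")

# Splunk's default (US) reading: month first -> 9 July 2018
indexed = datetime.strptime(ts, "%m/%d/%Y %H:%M:%S")

print(intended.date())  # 2018-09-07
print(indexed.date())   # 2018-07-09
```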

Is there a configuration where I can set the DATETIME_FORMAT for the Universal Forwarder? (It looks to me like props.conf is not used by the UF.)

Thanks
Kind Regards

Gianluca

1 Solution

gjanders
SplunkTrust
Is there a configuration where I can set the DATETIME_FORMAT for the Universal Forwarder? (It looks to me like props.conf is not used by the UF.)

Unless you have set the force_local_processing option (not recommended for most circumstances) or are using indexed extractions, no timestamp parsing or line breaking is done by the universal forwarder. The first full Splunk Enterprise instance to receive the data (heavy forwarder or indexer) will do the parsing (sometimes nicknamed "cooking" the data).

Therefore your settings need to be on the indexer or heavy forwarder, not the universal forwarder.
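For example, the timestamp settings would go into a props.conf stanza on that indexer or heavy forwarder. A sketch, assuming the sourcetype name mysourcetype from the question:

```
# props.conf on the indexer or heavy forwarder (not the universal forwarder)
[mysourcetype]
SHOULD_LINEMERGE = false
TIME_PREFIX = \(
TIME_FORMAT = %d/%m/%Y %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 20
```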



gtonti
Explorer

I sent the data to a HF and there I made the config in props.conf. This worked fine.

Thanks
Bye


sudosplunk
Motivator

Hi,
Set the below in your props.conf on the UF. The same copy of props.conf should be pushed to the indexer(s) as well.

[your_sourcetype]
TIME_PREFIX = \(  ## Use '^' if you don't have '(' before date
TIME_FORMAT = %d/%m/%Y %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 20

gtonti
Explorer

Hi nittala_surya,

I tried, but it didn't change anything.
It looks to me that props.conf is not used by the Universal Forwarder.
In my Universal Forwarder installation, in the path
%SPLUNK_HOME%\etc\system\local, I have:

inputs.conf
[monitor://C:\xxxx]
disabled = false
index = yyyyyyy
sourcetype = mysourcetype
crcSalt =
whitelist = (?i).\xxxx\.\.*.(txt|csv|log)

props.conf
[mysourcetype]
NO_BINARY_CHECK = 1
SHOULD_LINEMERGE=false
pulldown_type = 1
TIME_FORMAT = %d/%m/%Y %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 20

Thanks
Gianluca


sudosplunk
Motivator

Can you add "TIME_PREFIX" and give it a try one more time? Also see if there is a conflict in configurations by running btool on your props: splunk btool props list --debug | grep "mysourcetype"


gtonti
Explorer

Hi nittala_surya,

I tried again (with TIME_PREFIX) but it didn't change anything.

Bye


sudosplunk
Motivator

The above configurations for timestamp extraction are pretty straightforward and should work as expected. Can you confirm whether line breaking is working as expected, i.e. are these lines being broken into single events starting with (07/09/2018 12:55:40) ;Info;........? If not, you have to configure LINE_BREAKER to something like below:

[your_sourcetype]
LINE_BREAKER = ([\r\n]+)\(\d{2}\/\d{2}\/\d{4}\s
SHOULD_LINEMERGE = false
TIME_PREFIX = \(
TIME_FORMAT = %d/%m/%Y %H:%M:%S
MAX_TIMESTAMP_LOOKAHEAD = 20
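Splunk evaluates LINE_BREAKER as a PCRE, but this particular pattern behaves the same under Python's re module, so it can be sanity-checked against sample rows before deploying. A sketch (the second log line is made up for illustration):

```python
import re

# Same pattern as the LINE_BREAKER above; group 1 is the text Splunk
# discards between events (the run of newlines).
breaker = re.compile(r"([\r\n]+)\(\d{2}/\d{2}/\d{4}\s")

raw = ("(07/09/2018 12:55:40) ;Info;first event\n"
       "(08/09/2018 09:00:01) ;Warn;second event")

m = breaker.search(raw)
print(m is not None)    # True: the pattern finds the event boundary
print(raw[:m.start()])  # first event: everything before the newline run
```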

gtonti
Explorer

Hi nittala_surya,

I sent the data to a HF and there I made the config in props.conf. This worked fine.

Thanks
Bye
