Getting Data In

How to get timestamp from incomplete time field

mblauw
Path Finder

Today I've been trying to index a logfile in which the time field only contains the hour. I tried several ways to import it correctly, but I can't seem to get it to work. Does anybody know how Splunk could index this logfile correctly?

Here is one line from the log. The first element is the date field (here: 2016-12-01), and the second is the hour field (here: 0).

2016-12-01;0;BIM03_R_RWSTI4143;BIM03;ARS;RWS BI-meetnet;BI Meetnet Geo3 ARA;Speed;lus;lane1;60.00;2017-01-23 22:00:00;100.00;0.00;0.00;0.00;95.00;5.00;0.00;0.00;0.00;0.00;0.00;0.00;96.67;3.33;0.00;0.00;0.00;0.00;0.00;0.00;0.00;0.00
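The line is semicolon-delimited, so the timestamp spans only the first two fields. A quick sketch (Python, my addition, using a truncated copy of the sample line) to confirm which fields Splunk would need to read:

```python
# Truncated copy of the sample log line; only the first fields matter here.
sample = "2016-12-01;0;BIM03_R_RWSTI4143;BIM03;ARS;RWS BI-meetnet"

fields = sample.split(";")
date_part, hour_part = fields[0], fields[1]
print(date_part, hour_part)  # 2016-12-01 0
```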


somesoni2
Revered Legend

Something like this worked for me for your sample data:

[<SOURCETYPE NAME>]
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)(?=\d{4}-\d{2}-\d{2};)
TIME_PREFIX = ^
TIME_FORMAT = %Y-%m-%d;%H
MAX_TIMESTAMP_LOOKAHEAD = 13
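Note that the sample date reads as year-month-day (ISO order), so the matching specifier is %Y-%m-%d;%H. TIME_FORMAT uses strptime-style codes, so you can sanity-check it outside Splunk; a minimal Python sketch (my addition, not part of the original answer):

```python
from datetime import datetime

# Parse the timestamp prefix "2016-12-01;0" (date field + hour field)
# with the same strptime-style specifiers used in TIME_FORMAT above.
ts = datetime.strptime("2016-12-01;0", "%Y-%m-%d;%H")
print(ts)  # 2016-12-01 00:00:00
```

MAX_TIMESTAMP_LOOKAHEAD = 13 covers the 12 characters of "2016-12-01;0" plus one more for two-digit hours such as "2016-12-01;23".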


mblauw
Path Finder

Thank you so much. I've been trying to figure out how this works for quite a while now. Do you have any documentation on how to construct such sourcetypes? (Especially how to build the LINE_BREAKER regex, and why TIME_PREFIX=^.)
