Hi,
I have a logfile which looks like this:
2018-12-06 02:53:18 * [13396] PASSED: ftp file X20181206025051227_XXXTracking.csv renamed to 20181206025051227_XXXTracking.csv
2018-12-06 02:53:18 * [13396] PASSED: ftp 20181206025051227_XXXTracking.csv -> company@ftp06.XXX-group.eu:out
My props.conf looks like this:
[spdh120]
TRANSFORMS = setnull-test,spdh120
TIME_PREFIX = ^
MAX_TIMESTAMP_LOOKAHEAD = 30
TIME_FORMAT = %Y-%m-%d %H:%M:%S
SHOULD_LINEMERGE = false
TRUNCATE = 0
EXTRACT-MESSAGE = \d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\s\*\s\[\d{5}\]\sPASSED:\sftp\s\d{17}(?<FILE>.+)\s->\s(?<RECEIVER>.+)@ftp06.gls-group.eu:out
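As a quick sanity check of the EXTRACT-MESSAGE pattern, here is a sketch that tests it against the second sample event using Python's `re` module. Note the assumptions: the domain is changed to `ftp06.XXX-group.eu` to match the sample events (the regex as posted says `ftp06.gls-group.eu`), the dots in the hostname are escaped, and the named groups use Python's `(?P<name>...)` syntax rather than PCRE's `(?<name>...)`:

```python
import re

# Hypothetical adaptation of the EXTRACT-MESSAGE regex for testing in Python.
# Domain adjusted to match the sample events; dots escaped; (?P<...>) syntax.
PATTERN = re.compile(
    r"\d{4}-\d{2}-\d{2}\s\d{2}:\d{2}:\d{2}\s\*\s\[\d{5}\]\sPASSED:\sftp\s"
    r"\d{17}(?P<FILE>.+)\s->\s(?P<RECEIVER>.+)@ftp06\.XXX-group\.eu:out"
)

# Second sample event from the question (the first one, "renamed to",
# has no "->" and therefore will not match this pattern).
sample = ("2018-12-06 02:53:18 * [13396] PASSED: ftp "
          "20181206025051227_XXXTracking.csv -> company@ftp06.XXX-group.eu:out")

match = PATTERN.search(sample)
if match:
    # \d{17} consumes the leading timestamp digits, so FILE captures
    # only the remainder of the filename after them.
    print(match.group("FILE"))
    print(match.group("RECEIVER"))
```

Running this shows that FILE captures only `_XXXTracking.csv` (the 17-digit prefix is consumed by `\d{17}`) and RECEIVER captures `company`.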
and my transforms.conf:
[spdh120]
DEST_KEY = queue
FORMAT = indexQueue
REGEX = @ftp06.gls-group.eu:out
But I still get this error in Splunk's internal log, and no data reaches my indexer:
12-06-2018 03:23:46.252 +0100 WARN DateParserVerbose - Failed to parse timestamp in first MAX_TIMESTAMP_LOOKAHEAD (30) characters of event. Defaulting to timestamp of previous event (Thu Dec 6 03:23:17 2018). Context: source=/e/logs/spdh120_20181206.log|host=udts|spdh120
Can anyone help me and tell me what I configured wrong?
There are two problems: the timestamp parsing failure, and the fact that no data from that logfile makes it into Splunk.
Thx for your help
Problem solved. I had a typo:
I had written DEST_Key instead of DEST_KEY. After I corrected it, everything worked.
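For reference, a sketch of the corrected transforms.conf stanza (stanza name, FORMAT, and REGEX unchanged from the question; only the key name is fixed):

```ini
[spdh120]
# Attribute names in .conf files are case-sensitive:
# it must be DEST_KEY, not DEST_Key.
DEST_KEY = queue
FORMAT = indexQueue
REGEX = @ftp06.gls-group.eu:out
```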
Perhaps it's just a typo in the question, but the TIME_FORMAT string has two spaces between the date and the time, whereas the sample events have a single space. That's enough of a difference to prevent parsing.
Really interesting @richgalloway. Is there a way around hard-coding the space(s) in the TIME_FORMAT field?
Regrettably not. TIME_FORMAT is not a regex string, so we can't use something like '\s+'. It's literal characters, except for the metacharacters used by strptime().
Thank you @richgalloway.
I still have two spaces in my props.conf and it works with them.
Hi,
I configured it this way because we have other entries where this works. I first tried it with only one space between the date and the time, but got the same error.