Hello,
Thanks for your answer; however, it does not seem to be working.
I have been experimenting with a small subset of altered data.
This is the test data I am trying to import:
ABE 1900 01 05 19 00 0.00 -3.000 102.000 7.0
ABE 1900 01 11 09 07 0.00 -5.000 148.000 7.0
UTSU 1900 01 18 07 46 0.00 44.500 148.500 6.7
ABE 1900 01 20 06 33 0.00 20.000 -105.000 7.3
UTSU 1900 01 31 19 22 0.00 48.000 146.000 7.5
The fields are tab-separated.
According to the Splunk import interface, there is no timestamp available.
So I am trying to import the file and assign a timestamp to each event, extracted from the separate date fields: year, month, day, ...
I managed to transform the date fields into one field with any desired layout. For example:
[fixEarthquakeDates]
REGEX = (.{3,4})([\t]{1})([\d]{4})([\t]{1})([\d]{2})([\t]{1})([\d]{2})([\t]{1})([\d]{2})([\t]{1})([\d]{2})(.+)
FORMAT = $1$2$3-$5-$7 $9:$11:00.00 $12
DEST_KEY = _raw
SOURCE_KEY = _raw
This transform formats the data as follows:
ABE 1900-01-05 19:00:00.00 0.00 -3.000 102.000 7.0
ABE 1900-01-11 09:07:00.00 0.00 -5.000 148.000 7.0
UTSU 1900-01-18 07:46:00.00 0.00 44.500 148.500 6.7
ABE 1900-01-20 06:33:00.00 0.00 20.000 -105.000 7.3
Any other format can be achieved with the regex transformation (I tried several alternatives).
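For anyone who wants to check the transform outside Splunk: the [fixEarthquakeDates] REGEX and FORMAT above can be replicated with Python's re module (a local sketch only; Splunk uses $n in FORMAT, which maps to \g<n> backreferences in Python):

```python
import re

# Same regex as the [fixEarthquakeDates] transform in transforms.conf.
PATTERN = re.compile(
    r"(.{3,4})([\t]{1})([\d]{4})([\t]{1})([\d]{2})([\t]{1})"
    r"([\d]{2})([\t]{1})([\d]{2})([\t]{1})([\d]{2})(.+)"
)
# Same as FORMAT = $1$2$3-$5-$7 $9:$11:00.00 $12, with $n written as \g<n>.
REPLACEMENT = r"\g<1>\g<2>\g<3>-\g<5>-\g<7> \g<9>:\g<11>:00.00 \g<12>"

def fix_dates(line: str) -> str:
    """Apply the same rewrite the transform performs on _raw."""
    return PATTERN.sub(REPLACEMENT, line)

# One of the tab-separated test lines from above.
example = "ABE\t1900\t01\t05\t19\t00\t0.00\t-3.000\t102.000\t7.0"
print(fix_dates(example))
# → ABE	1900-01-05 19:00:00.00 	0.00	-3.000	102.000	7.0
```

This confirms the regex and FORMAT produce the reformatted lines shown above, so the transform itself is not the problem.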
The error I keep getting is:
Could not use strptime to parse timestamp from "1900 01 05 19 00 0.00 -3.000 102.000 7.0".
Failed to parse timestamp. Defaulting to file modtime.
Current settings:
MAX_TIMESTAMP_LOOKAHEAD=0
NO_BINARY_CHECK=true
SHOULD_LINEMERGE=false
TIME_FORMAT=%Y %m %d %H %M %S.%2N
TIME_PREFIX=\t
TRANSFORMS-timestamp=fixEarthquakeDates
TZ=UTC
disabled=false
pulldown_type=true
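Note that the format string itself seems fine: the error message's string parses with an equivalent strptime outside Splunk (a rough local check; Python's strptime has no equivalent of Splunk's %2N subsecond directive, so only the date/time prefix is tested, and spaces are used as shown in the error message, whereas the actual file is tab-separated):

```python
from datetime import datetime

# The string from the error message, minus the "ABE" host field.
raw = "1900 01 05 19 00 0.00 -3.000 102.000 7.0"

# TIME_FORMAT was "%Y %m %d %H %M %S.%2N"; Python lacks %N, so check
# the year/month/day/hour/minute prefix only.
prefix = " ".join(raw.split()[:5])          # "1900 01 05 19 00"
parsed = datetime.strptime(prefix, "%Y %m %d %H %M")
print(parsed)
# → 1900-01-05 19:00:00
```
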
I have been experimenting with numerous alternatives for TIME_FORMAT, to no effect.
Apparently the transformation happens after the timestamp parsing?
Can someone confirm this, or point me in the right direction?
Regards,
Ken.