I have an HTTP Event Collector (HEC) input collecting JSON data sent by a syslog forwarder. The syslog-ng message body looks like:
body("{ \"source\": \"${.splunk.source}\",
\"event\": ${MSG}
}")
I can see the message and the proper source in my indexer, but timestamp extraction is the problem. Because there is often a delay between when a log is generated and when syslog receives it, I want to use a field in the message as the event timestamp. A message looks like this:
{"data":"stuff","time_stamp":"2022-05-10 17:14:23Z","value1":"more_stuff"}
So, I create a props.conf on my indexer cluster that looks like:
[my_sourcetype]
DATETIME_CONFIG =
MAX_TIMESTAMP_LOOKAHEAD = 30
TIME_PREFIX = time_stamp\":\"
TIME_FORMAT = %Y-%m-%d %h:%M:%S%Z
TZ = UTC
I've confirmed the sourcetype is correct: it's the one I define in the HEC input, and they match. But for the life of me, I can't get Splunk to find the time. I've tried looking for errors with this search:
index=_internal (log_level=WARN OR log_level=ERROR) "timestamp"
(found in an older community post, so thanks to the author), but I find nothing. I also tried playing in the UI and creating a new sourcetype (the Add Data widget). In that UI, my props.conf settings appear to work, but for some reason on the cluster they don't.
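In case it's useful, this is roughly how I've been comparing what Splunk assigns to _time against the raw field (the index name here is a placeholder for mine):
index=my_index sourcetype=my_sourcetype
| eval indexed_time=strftime(_time, "%Y-%m-%d %H:%M:%SZ")
| table _time indexed_time time_stamp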
Are there any other troubleshooting steps I can follow? Am I missing something that might help this work better?
Is your syslog forwarder a universal forwarder or a heavy forwarder (a Splunk Enterprise instance acting as a forwarder)? If it's a heavy forwarder, events are parsed there, so timestamp settings in props.conf need to live on that forwarder rather than only on the indexer cluster.
Also, make sure you're using %H instead of %h for the hour in TIME_FORMAT (in strptime, %h is the abbreviated month name, not the hour), i.e.:
TIME_FORMAT = %Y-%m-%d %H:%M:%S%Z
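So, keeping the sourcetype name and TZ from your post, the stanza I'd try is (a sketch, not something I've tested against your data):
[my_sourcetype]
TIME_PREFIX = time_stamp\":\"
MAX_TIMESTAMP_LOOKAHEAD = 30
TIME_FORMAT = %Y-%m-%d %H:%M:%S%Z
TZ = UTC
For troubleshooting, timestamp parsing problems usually show up under the DateParserVerbose component in splunkd.log, so a narrower search than the one you posted would be something like:
index=_internal sourcetype=splunkd component=DateParserVerbose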