Getting Data In

Specify a timestamp field from indexed RESTful JSON

rjlohan
Explorer

Hi,

I am using the REST Modular Input to query some RESTful services. I am having trouble defining the timestamp for this data. It seems that the _time field is being populated with the periodic query time, rather than from the input JSON. I have a sample of the JSON below, and in this case, I want to use the "startTime" field as the event timestamp.

{"id":"7cf85251-d8e0-11e4-9181-0050568e590c","businessKey":"17032015-3AEA-4069-A2DD-SSSS00000006","processDefinitionId":"b8777c40-d832-11e4-968b-0050568e590c","startTime":"2015-04-02T13:32:20","endTime":"2015-04-02T13:44:37","durationInMillis":737088,"startUserId":null,"startActivityId":"EAID_38D5905A_8B84_4c4a_92FD_34D66315F33C","deleteReason":null,"superProcessInstanceId":null,"caseInstanceId":null}
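For reference, the "startTime" value is plain ISO-8601 with no timezone offset. A quick Python sketch (using a trimmed copy of the sample above) shows it parses cleanly with the strptime pattern %Y-%m-%dT%H:%M:%S, so the data itself is not the problem:

```python
import json
from datetime import datetime

# Trimmed copy of the sample event returned by the REST endpoint.
event = json.loads(
    '{"id":"7cf85251-d8e0-11e4-9181-0050568e590c",'
    '"startTime":"2015-04-02T13:32:20","endTime":"2015-04-02T13:44:37"}'
)

# The desired event timestamp: ISO-8601, no timezone, matching
# the strptime pattern %Y-%m-%dT%H:%M:%S.
ts = datetime.strptime(event["startTime"], "%Y-%m-%dT%H:%M:%S")
print(ts)  # 2015-04-02 13:32:20
```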

This data comes from a periodic query, so there's a lot of duplication (which I eliminate at search time with "dedup id").

My inputs.conf for this input looks like:

[rest://my input name]
auth_type = none
endpoint = http://{server}:{port}/path/to/service
http_method = GET
index_error_response_codes = 0
polling_interval = 120
response_type = json
sourcetype = _json
streaming_request = 0
index = my_index
TIME_PREFIX = "startTime"
MAX_TIMESTAMP_LOOKAHEAD = 200 

Neither of the final two settings seems to affect the assigned timestamp, and my log is showing errors for this data input:

05-19-2015 08:47:43.870 +1000 WARN  DateParserVerbose - Failed to parse timestamp. Defaulting to timestamp of previous event (Wed Apr 15 00:04:44 2015). Context: source::rest://my input name|host::{server}|_json|245739566646
05-19-2015 08:49:44.164 +1000 WARN  DateParserVerbose - A possible timestamp match (Tue Jan 06 19:40:31 2009) is outside of the acceptable time window. If this timestamp is correct, consider adjusting MAX_DAYS_AGO and MAX_DAYS_HENCE. Context: source::rest://my input name|host::{server}|_json|245739566646

I guess it's picking up some other substring (besides the "startTime" value) and failing to parse that, so defaulting back to the query time.

Any ideas on how to solve this issue?

NOTE: This is a 3rd party system I am querying, so I can't manipulate the response JSON.

1 Solution

Damien_Dallimor
Ultra Champion

Well, you are not using TIME_PREFIX and MAX_TIMESTAMP_LOOKAHEAD correctly. These are parameters of props.conf, not inputs.conf.

Example :

inputs.conf

[rest://my input name]
 auth_type = none
 endpoint = http://{server}:{port}/path/to/service
 http_method = GET
 index_error_response_codes = 0
 polling_interval = 120
 response_type = json
 sourcetype = my_sourcetype
 streaming_request = 0
 index = my_index

props.conf

[my_sourcetype]
TIME_PREFIX = startTime":"
MAX_TIMESTAMP_LOOKAHEAD = 200 
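To see why this works: TIME_PREFIX is a regular expression matched against the raw event text, and Splunk then looks for a timestamp within MAX_TIMESTAMP_LOOKAHEAD characters after the match. Here is a rough Python emulation of that behaviour (an illustrative sketch only, not Splunk's actual parser):

```python
import re
from datetime import datetime

# Raw event text as indexed (trimmed copy of the sample JSON).
raw = ('{"id":"7cf85251-d8e0-11e4-9181-0050568e590c",'
       '"startTime":"2015-04-02T13:32:20","endTime":"2015-04-02T13:44:37"}')

# Emulate TIME_PREFIX: locate the prefix regex in the raw text, then
# examine at most MAX_TIMESTAMP_LOOKAHEAD characters after it.
TIME_PREFIX = re.compile(r'startTime":"')
MAX_TIMESTAMP_LOOKAHEAD = 200

m = TIME_PREFIX.search(raw)
window = raw[m.end():m.end() + MAX_TIMESTAMP_LOOKAHEAD]

# The timestamp is the leading ISO-8601 portion of that window.
ts_match = re.match(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}", window)
ts = datetime.strptime(ts_match.group(), "%Y-%m-%dT%H:%M:%S")
print(ts)  # 2015-04-02 13:32:20
```

Optionally, you can also pin the format explicitly in props.conf with TIME_FORMAT = %Y-%m-%dT%H:%M:%S, which removes any ambiguity in how the matched text is interpreted.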



rjlohan
Explorer

Thanks again, Damien, I missed that completely. Still getting my head around Splunk! This change seems to work.
