Per: http://docs.splunk.com/Documentation/Storm/Storm/User/Sourcesandsourcetypes
I've tried sending JSON events to Splunk Cloud using all of the following JSON-y sourcetypes, but none of them result in an accurate timestamp being extracted from my event (instead, Splunk Cloud falls back to the event's index time):
json_predefined_timestamp
With field: "timestamp": "2014-11-04T20:45:43.000"
json_auto_timestamp
With field: "created": 1415133943 or "time": 1415133943
All to no avail...
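For illustration, the two event shapes described above can be generated like so (a sketch; the field names and values are the ones from the sourcetype docs and my events, the 'taco' field is just filler):

```ruby
require 'json'

# json_predefined_timestamp expects an ISO-8601-style 'timestamp' string:
predefined = { 'timestamp' => '2014-11-04T20:45:43.000', 'taco' => 'delicious' }

# json_auto_timestamp looks for an epoch-seconds field such as 'created' or 'time':
auto = { 'created' => 1415133943, 'taco' => 'delicious' }

puts JSON.generate(predefined)
puts JSON.generate(auto)
```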
Figured it out...
Per: http://answers.splunk.com/answers/180585/json-time-not-always-extracted.html
"timestamp" must be the first field of the JSON.
If you're JSON-ifying a Ruby Hash, this can be accomplished via:
require 'date'
require 'json'

my_hash = { 'taco' => 'delicious' }

# Convert the epoch timestamp to a Splunk JSON-compatible 'timestamp' string:
new_timestamp = DateTime.strptime('1415133943', '%s').strftime('%Y-%m-%dT%H:%M:%S.%3N')

# Merge the 'timestamp' field into the beginning of the Hash
# (Ruby Hashes preserve insertion order, so it serializes first):
my_hash = { 'timestamp' => new_timestamp }.merge(my_hash)

# Print the Hash as a JSON String:
puts JSON.generate(my_hash)
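As a quick sanity check (a sketch reusing the same values): the merge direction determines the serialization order, which is exactly why the 'timestamp' Hash has to be the receiver of the merge:

```ruby
require 'date'
require 'json'

new_timestamp = DateTime.strptime('1415133943', '%s').strftime('%Y-%m-%dT%H:%M:%S.%3N')
my_hash = { 'taco' => 'delicious' }

# Ruby Hashes preserve insertion order, so the merge direction matters:
good = { 'timestamp' => new_timestamp }.merge(my_hash)
bad  = my_hash.merge('timestamp' => new_timestamp)

puts JSON.generate(good)  # 'timestamp' serializes first
puts JSON.generate(bad)   # 'timestamp' serializes last, which Splunk may miss
```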
@martin_mueller: I did that deliberately. Thanks though.
Not sure if relevant, but you're talking about Splunk Cloud while linking to Splunk Storm documentation - those are two very different products.