Getting Data In

Trouble extracting time in JSON

ericlarsen
Path Finder

I have a JSON log file that I'm attempting to ingest (Splunk 6.6.5). The events parse correctly, but the epoch time in each event isn't being used as the event timestamp; Splunk is using the file's modified date instead.

Here's a sample record and my props config (which lives on the Indexers):

{"time":1531405028,"name":"PSIKD01.BOOT","appl":"@ABCVDIF","server":"SERVER1","user":"LSRVID","HandleCount":792,"KernelModeTime":66875000,"OtherOperationCount":18498,"OtherTransferCount":630163,"PageFaults":320216,"PageFileUsage":1349924,"PrivatePageCount":1382322176,"ReadOperationCount":36716,"ReadTransferCount":38844376,"ThreadCount":34,"UserModeTime":363281250,"VirtualSize":2207380942848,"WorkingSetSize":672907264,"WriteOperationCount":205,"WriteTransferCount":63855}

[apm_json]
KV_MODE = none
INDEXED_EXTRACTIONS = json
TIME_PREFIX = "time":
TIME_FORMAT = %s
SHOULD_LINEMERGE = false
TRUNCATE = 100000
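
For what it's worth, the epoch value in the sample record does look like a valid seconds-since-epoch timestamp, which is what %s expects. A quick sanity check from a shell (assuming GNU date):

date -u -d @1531405028
Thu Jul 12 14:17:08 UTC 2018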

Any help would be appreciated. Thanks!

0 Karma

cpetterborg
SplunkTrust

Everything in the config looks good. Have you checked whether anything is overriding that configuration and causing the date-parsing problem? Use btool to see what Splunk is actually using for the configs:

splunk btool props list --debug | less

Then search for apm_json and check whether the settings for that sourcetype match the config above.
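
You can also narrow the output to just that stanza (assuming it is named apm_json, as in your post):

splunk btool props list apm_json --debug

The --debug output shows which .conf file each setting comes from, so you can see whether another file is taking precedence over yours.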

0 Karma

ericlarsen
Path Finder

I had already run btool against props and confirmed that my sourcetype is active.

0 Karma