Getting Data In

Multiple events and multiple key-value pairs (one being a timestamp) in one JSON file

104K
Engager

Hi

I have a series of two key-value pairs (a timestamp and another key) in one JSON file, which looks like the sample below:

{"k":"host|something","rd":[{"n":1347028874805368,"v":1},{"n":1347028874910007,"v":5},{"n":1347028874912282,"v":5},{"n":1347039314560733,"v":1},{"n":1347039314665657,"v":5},
... {"n":1347443694173854,"v":5}]}

My question is: how do I make the "n" value act as the event timestamp and keep "v" as the field value? I am guessing it has something to do with transforms.conf...

Any help will be greatly appreciated! Thank you in advance!

1 Solution

daniel_splunk
Splunk Employee

| spath | rename rd{}.n AS file_time | rename rd{}.v AS file_count | eval x=mvzip(file_time,file_count) | mvexpand x | eval x=split(x,",") | eval file_time=mvindex(x,0) | eval file_count=mvindex(x,1) | eval file_time=file_time/1000000 | convert timeformat="%Y:%m:%d:%H:%M:%S" ctime(file_time) | table file_time, file_count
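
For anyone who wants to try this without indexing the file first, here is a self-contained version of the same search that fakes one sample event with makeresults (just a sketch; the sample values are copied from the question, trimmed to two array elements):

| makeresults
| eval _raw="{\"k\":\"host|something\",\"rd\":[{\"n\":1347028874805368,\"v\":1},{\"n\":1347028874910007,\"v\":5}]}"
| spath
| rename rd{}.n AS file_time, rd{}.v AS file_count
| eval x=mvzip(file_time, file_count)
| mvexpand x
| eval x=split(x, ",")
| eval file_time=mvindex(x, 0), file_count=mvindex(x, 1)
| eval file_time=file_time/1000000
| convert timeformat="%Y:%m:%d:%H:%M:%S" ctime(file_time)
| table file_time, file_count

The mvzip/mvexpand pair is what keeps each "n" matched with its own "v": mvzip glues the two multivalue fields together element by element, mvexpand fans each pair out into its own row, and dividing by 1000000 turns the microsecond epoch into seconds so convert ctime() can format it.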

104K
Engager

Thank you. Works like a charm. Is there any way I can do the event breaking at indexing time by the way?
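
(Not part of the accepted answer, just a rough sketch in case it is useful: index-time breaking would normally be configured in props.conf on the indexer or heavy forwarder. The sourcetype name, regexes, and the microsecond TIME_FORMAT below are assumptions to verify against the real file.)

# props.conf (sourcetype name is hypothetical)
[json_nv_sample]
SHOULD_LINEMERGE = false
# start a new event before each {"n":...,"v":...} element
LINE_BREAKER = ([,\[])\{"n"
TIME_PREFIX = \{"n":
# "n" looks like an epoch in microseconds (16 digits)
TIME_FORMAT = %s%6N
MAX_TIMESTAMP_LOOKAHEAD = 20

With something like that, each {"n":...,"v":...} pair becomes its own event with "n" as its timestamp, although the leading {"k":...,"rd":[ fragment ends up as a stray first event that you would probably want to filter out.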
