Getting Data In

Problem with JSON and duplicate events

jarrebola
Explorer

Hi, I have this data indexed. As you can see, there is only one of each monitored_element_id.

{"monitored_jobs":[{"monitored_element_id":3954,"work_order_part_id":3954,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-24T11:01:54.564Z"}},{"monitored_element_id":3953,"work_order_part_id":3953,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-24T10:30:12.101Z"}},{"monitored_element_id":3952,"work_order_part_id":3952,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-24T10:26:30.573Z"}},{"monitored_element_id":2958,"work_order_part_id":2960,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-23T10:01:04.399Z"}},{"monitored_element_id":2957,"work_order_part_id":2959,"quantity":5,"priority":4,"status":{"value":"EXECUTING","changed_at":"2019-05-23T07:47:07.137Z"}},{"monitored_element_id":2956,"work_order_part_id":2958,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-23T07:48:04.393Z"}},{"monitored_element_id":2955,"work_order_part_id":2957,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-23T06:43:16.003Z"}},{"monitored_element_id":2954,"work_order_part_id":2956,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-22T18:16:56.819Z"}},{"monitored_element_id":2953,"work_order_part_id":2955,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-22T18:04:46.263Z"}},{"monitored_element_id":2952,"work_order_part_id":2954,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-22T17:55:09.574Z"}},{"monitored_element_id":1953,"work_order_part_id":1953,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-17T13:00:52.043Z"}},{"monitored_element_id":1952,"work_order_part_id":1952,"quantity":5,"priority":4,"status":{"value":"EXECUTING","changed_at":"2019-05-17T12:52:18.216Z"}},{"monitored_element_id":962,"work_order_part_id":981,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-14T07:54:52.097Z"}},{"monitored_element_id":961,"work_order_part_id":980,"quantity":5,"priority":4,"status":{"value":"EXECUTING","changed_at":"2019-05-14T07:34:55.927Z"}},{"monitored_element_id":960,"work_order_part_id":979,"quantity":1,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-14T07:33:12.064Z"}},{"monitored_element_id":959,"work_order_part_id":978,"quantity":1,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-14T07:14:52.501Z"}},{"monitored_element_id":958,"work_order_part_id":977,"quantity":1,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-14T07:10:36.661Z"}},{"monitored_element_id":957,"work_order_part_id":976,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-10T10:46:34.518Z"}},{"monitored_element_id":956,"work_order_part_id":975,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-06T10:48:21.080Z"}},{"monitored_element_id":955,"work_order_part_id":974,"quantity":5,"priority":4,"status":{"value":"HALTED","changed_at":"2019-05-06T10:44:31.145Z"}}],"page":{"size":20,"total_elements":50,"total_pages":3,"number":1}}

But if I run this query:
index="gd_idsp_k8s" source="/var/log/containers/pcc-jobs.log" | spath | table by monitored_jobs{}.monitored_element_id, monitored_jobs{}.quantity,monitored_jobs{}.status.changed_at ,monitored_jobs{}.status.value|dedup monitored_jobs{}.monitored_element_id

all the values come out duplicated and I don't know how to solve this; dedup is not doing anything:

[screenshot of the table output showing duplicated rows]


poete
Builder

Hello @jarrebola,

Please find below a working query:

| makeresults 
| eval _raw="{\"monitored_jobs\":[{\"monitored_element_id\":3954,\"work_order_part_id\":3954,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-24T11:01:54.564Z\"}},{\"monitored_element_id\":3953,\"work_order_part_id\":3953,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-24T10:30:12.101Z\"}},{\"monitored_element_id\":3952,\"work_order_part_id\":3952,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-24T10:26:30.573Z\"}},{\"monitored_element_id\":2958,\"work_order_part_id\":2960,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-23T10:01:04.399Z\"}},{\"monitored_element_id\":2957,\"work_order_part_id\":2959,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"EXECUTING\",\"changed_at\":\"2019-05-23T07:47:07.137Z\"}},{\"monitored_element_id\":2956,\"work_order_part_id\":2958,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-23T07:48:04.393Z\"}},{\"monitored_element_id\":2955,\"work_order_part_id\":2957,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-23T06:43:16.003Z\"}},{\"monitored_element_id\":2954,\"work_order_part_id\":2956,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-22T18:16:56.819Z\"}},{\"monitored_element_id\":2953,\"work_order_part_id\":2955,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-22T18:04:46.263Z\"}},{\"monitored_element_id\":2952,\"work_order_part_id\":2954,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-22T17:55:09.574Z\"}},{\"monitored_element_id\":1953,\"work_order_part_id\":1953,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-17T13:00:52.043Z\"}},{\"monitored_element_id\":1952,\"work_order_part_id\":1952,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"EXECUTING\",\"changed_at\":\"2019-05-17T12:52:18.216Z\"}},{\"monitored_element_id\":962,\"work_order_part_id\":981,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-14T07:54:52.097Z\"}},{\"monitored_element_id\":961,\"work_order_part_id\":980,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"EXECUTING\",\"changed_at\":\"2019-05-14T07:34:55.927Z\"}},{\"monitored_element_id\":960,\"work_order_part_id\":979,\"quantity\":1,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-14T07:33:12.064Z\"}},{\"monitored_element_id\":959,\"work_order_part_id\":978,\"quantity\":1,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-14T07:14:52.501Z\"}},{\"monitored_element_id\":958,\"work_order_part_id\":977,\"quantity\":1,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-14T07:10:36.661Z\"}},{\"monitored_element_id\":957,\"work_order_part_id\":976,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-10T10:46:34.518Z\"}},{\"monitored_element_id\":956,\"work_order_part_id\":975,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-06T10:48:21.080Z\"}},{\"monitored_element_id\":955,\"work_order_part_id\":974,\"quantity\":5,\"priority\":4,\"status\":{\"value\":\"HALTED\",\"changed_at\":\"2019-05-06T10:44:31.145Z\"}}],\"page\":{\"size\":20,\"total_elements\":50,\"total_pages\":3,\"number\":1}}"
| spath
| table monitored_jobs{}.monitored_element_id, monitored_jobs{}.quantity, monitored_jobs{}.status.changed_at, monitored_jobs{}.status.value
| dedup monitored_jobs{}.monitored_element_id
| rename monitored_jobs{}.monitored_element_id as monitored_element_id, monitored_jobs{}.quantity as quantity, monitored_jobs{}.status.changed_at as changed_at, monitored_jobs{}.status.value as value
| eval x=mvzip(monitored_element_id, mvzip(quantity, mvzip(changed_at, value)))
| table x
| mvexpand x
| eval x=split(x,",")
| eval monitored_element_id=mvindex(x,0)
| eval quantity=mvindex(x,1)
| eval changed_at=mvindex(x,2)
| eval value=mvindex(x,3)
| table monitored_element_id,quantity,changed_at,value
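
To apply the same approach directly to the indexed data, here is a minimal sketch that reuses the index and source from the question (untested against your environment). The key point is that all 20 jobs live in a single event's multivalue fields, so dedup on that event cannot separate them; mvzip stitches the parallel values together and mvexpand turns them into one row per job, after which dedup operates on single values:

index="gd_idsp_k8s" source="/var/log/containers/pcc-jobs.log"
| spath
| rename monitored_jobs{}.monitored_element_id as monitored_element_id, monitored_jobs{}.quantity as quantity, monitored_jobs{}.status.changed_at as changed_at, monitored_jobs{}.status.value as value
| eval x=mvzip(monitored_element_id, mvzip(quantity, mvzip(changed_at, value)))
| mvexpand x
| eval monitored_element_id=mvindex(split(x,","),0), quantity=mvindex(split(x,","),1), changed_at=mvindex(split(x,","),2), value=mvindex(split(x,","),3)
| dedup monitored_element_id
| table monitored_element_id, quantity, changed_at, value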

richgalloway
SplunkTrust

Maybe I'm missing something, but I see 20 different monitored_element_id fields in that event.

BTW, "by" is not part of the table command syntax. It's treated as a field name.

---
If this reply helps you, Karma would be appreciated.