Getting Data In

Extract JSON fields

falcon
Observer

I have multiple fields under the interesting fields section named field1, field2, field3, and so on. Each of these fields contains a JSON key–value pair. I also have other fields that are already properly extracted and do not follow this field1, field2 pattern.

How can I extract values from these field1, field2, etc. fields? For example, if field1 contains {"username": "testUser"}, how can I filter for the value "testUser"? The raw data looks like well-formed JSON logs.


gcusello
SplunkTrust

Hi @falcon ,

first use INDEXED_EXTRACTIONS = JSON in the sourcetype definition, and then use the spath command, as @ITWhisperer said (for more information see https://help.splunk.com/en/splunk-enterprise/search/spl-search-reference/9.0/search-commands/spath ).
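A minimal props.conf sketch of the setting Giuseppe mentions; the sourcetype name here is hypothetical and would need to match your own data input:

```
# props.conf (on the instance that parses the data at index time)
[my_json_sourcetype]
INDEXED_EXTRACTIONS = JSON
```

Note that this is an index-time setting and only applies to events whose entire raw text is JSON.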

Ciao.

Giuseppe


PickleRick
SplunkTrust

Wait.

1. Indexed extractions have nothing to do with the spath command. And as I understand the question, only parts of the original event form JSON structures, not the whole raw event. In that case, indexed extractions won't work anyway.

2. As a rule of thumb, indexed extractions shouldn't be used unless there is no other way.

isoutamo
SplunkTrust

It should work, as already said.

If it doesn't, then maybe your JSON event is too long, or the data is something JSON-like but not valid JSON.

Please give some more information to us if needed.

ITWhisperer
SplunkTrust

You can use spath, for example:

| spath input=field1
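To connect this to the original question: assuming field1 contains {"username": "testUser"}, spath extracts username as a search-time field, which can then be filtered on. This search is an illustrative sketch, not tested against your data:

```
| spath input=field1
| search username="testUser"
```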