Splunk Search

How to extract fields from a complex JSON file?

SplunkDash
Motivator

Hello,

I have complex JSON events ingested as *.log files. I am having trouble extracting fields from these files/events. Any help on how to extract key-value pairs from these events would be highly appreciated. One sample event is given below. Thank you so much.

 

2022-07-15 12:44:03 - {
    "type" : "TEST",
    "r/o" : false,
    "booting" : false,
    "version" : "6.2.7.TS",
    "user" : "DS",
    "domainUUID" : null,
    "access" : "NATIVE",
    "remote-address" : "localhost",
    "success" : true,
    "ops" : [{
        "address" : [
            { "subsystem" : "datasources" },
            { "data-source" : "mode_tp" }
        ],

        "address" : [
            { "cservice" : "management" },
            { "access" : "identity" }
        ],
        "DSdomain" : "TESTDomain"
    },
    {
        "address" : [
            { "cservice" : "management" },

        {
            "operation" : "add",
            "address" : [
                { "subsystem" : "finit" },
                { "bucket" : "TEST" },
                { "clocal" : "passivation" },
                { "store" : "file" }
            ],
            "passivation" : true,
            "purge" : false
        },
        {
            "operation" : "add",
            "address" : [
                { "subsystem" : "finit" },
                { "bucket" : "TEST" }
            ],
            "module" : "dshibernate"
        },
        {
            "operation" : "add",
            "address" : [
                { "subsystem" : "finit" },
                { "bucket" : "hibernate" },
                { "clocal" : "entity" }
            ]
        },
        {
            "operation" : "add",
            "address" : [
                { "subsystem" : "finit" },
                { "bucket" : "hibernate" },
                { "clocal" : "entity" },
                { "component" : "transaction" }
            ],
            "model" : "DSTEST"
        },
        {
            "operation" : "add",
            "address" : [
                { "subsystem" : "infit" },
                { "bucket" : "hibernate" },
                { "clocal" : "entity" },
                { "memory" : "object" }
            ],
            "size" : 210000
        },

        {
            "operation" : "add",
            "address" : [
                { "subsystem" : "DS" },
                { "workplace" : "default" },
                { "running-spin" : "default" }
            ],
            "Test-threads" : 45,
            "queue-length" : 60,
            "max-threads" : 70,
            "keepalive-time" : {
                "time" : 20,
                "unit" : "SECONDS"
            }
        },
        {
            "operation" : "add",
            "address" : [
                { "subsystem" : "DS" },
                { "workplace" : "default" },
                { "long-running-threads" : "default" }
            ],
            "Test-threads" : 45,
            "queue-length" : 70,
            "max-threads" : 70,
            "keepalive-time" : {
                "time" : 20,
                "unit" : "SECONDS"
            }
        },

    }]
}

1 Solution

gcusello
SplunkTrust

Hi @SplunkDash,

I suggest you try spath, because regexes are a very hard way to extract these fields; in fact, you would have to create a separate extraction for each field, e.g. the following:

| rex "\"workplace\" : \"(?<workplace>[^\"]+)\""
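
For comparison, a minimal spath sketch (the field name json below is just an illustrative choice; the leading "2022-07-15 12:44:03 - " timestamp is moved out of the way first so spath sees valid JSON):

| rex field=_raw "^\d+-\d+-\d+ \d+:\d+:\d+ - (?<json>[\S\s]+)"
| spath input=json

spath then creates a field for every key in the JSON, including the nested ones (e.g. ops{}.address{}.subsystem), with no per-field regex.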

Ciao.

Giuseppe


gcusello
SplunkTrust

Hi @SplunkDash,

you don't need to extract inline; you could also save the extraction for reuse, but try to use spath.
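
If the goal is to have the JSON fields extracted automatically at search time, one possible sketch is a props.conf entry for the sourcetype (the sourcetype name json_app_log is hypothetical, and this assumes every event starts with the same "timestamp - " prefix, which is stripped at index time so the remaining body is valid JSON):

[json_app_log]
# index time: remove the leading "2022-07-15 12:44:03 - " prefix so the event body is pure JSON
SEDCMD-strip_ts = s/^\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2} - //
# search time: auto-extract all JSON keys without rex or manual extractions
KV_MODE = json

Note that SEDCMD changes what gets indexed, so it only affects data ingested after the change.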

Ciao.

Giuseppe


SplunkDash
Motivator

Hello @gcusello,

How can I save the extraction for later use without defining it as an inline extraction, a transformation, or a macro?


gcusello
SplunkTrust

Hi @SplunkDash,

save it as a new field extraction.
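
Concretely, regex-based saved extractions end up as EXTRACT entries in props.conf (which is what the UI's inline option writes); a sketch for one field, reusing the hypothetical json_app_log sourcetype and the workplace key from the sample event:

[json_app_log]
# one saved inline-style extraction per field you need
EXTRACT-workplace = "workplace" : "(?<workplace>[^"]+)"

This is the per-field approach spath avoids, but it is the form the Field extractions UI can store.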

Ciao.

Giuseppe


SplunkDash
Motivator

@gcusello 

Thank you so much again. From the web interface (UI), I can see only two ways to save a field extraction: INLINE and Transformations. If I use +Extract New Fields or Extract Fields, it takes me to the INLINE (regular expression) or Field Transformations (delimiters) option. Nowhere does it allow me to use spath. Is there anything I am missing? Thank you so much again.


SplunkDash
Motivator

Hello @gcusello,

Thank you so much again. From the web interface (UI), I can see only two ways to save a field extraction: INLINE or Transformations. If I use the option at the bottom of the leftmost column (where the fields are listed), "+Extract New Fields", it still does not let me use a spath option like the following. Is there anything I am missing here? Please guide me if possible. Thank you!

| rex field=_raw "\d+-\d+-\d+ \d+:\d+:\d+ - (?<_raw>[\S\s]+)"
| spath
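
Since the Extract New Fields UI only stores regex (inline) or delimiter-based extractions, a spath search cannot be saved there; one workaround sketch is to save the two commands above as a search macro (the macro name extract_json_event and the macros.conf stanza below are just one possible setup, not something from this thread):

[extract_json_event]
definition = rex field=_raw "\d+-\d+-\d+ \d+:\d+:\d+ - (?<_raw>[\S\s]+)" | spath

It can then be reused in any search as ... | `extract_json_event`.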

 

 
