
Split a nested JSON array with key/value pairs

shakSplunk
Path Finder

Hi all,

I'm trying to manually upload the following JSON file into Splunk Enterprise; however, it's producing one event instead of four, one for each timestamp.

{
    "Rows": [
        {
            "timestamp": "03-06-2021 13:52:34",
            "Region": "rcc",
            "Hostname": "lx206",
            "Version": "123",
            "Environment": "E"
        },
        {
            "timestamp": "03-06-2021 13:52:33",
            "Region": "rcc",
            "Hostname": "lx206",
            "Version": "123",
            "Environment": "E"
        },
        {
            "timestamp": "03-06-2021 13:52:32",
            "Region": "rcc",
            "Hostname": "lx206",
            "Version": "123",
            "Environment": "S"
        },
        {
            "timestamp": "03-06-2021 13:52:31",
            "Region": "rcc",
            "Hostname": "lx206",
            "Version": "123",
            "Catridge": "UPP",
            "CatridgeType": "Product",
            "Environment": "S"
        }
    ]
}

The following is my props.conf file:

[simpleOutputVersion2]
DATETIME_CONFIG = 
INDEXED_EXTRACTIONS = json
KV_MODE = none
LINE_BREAKER = ([\r\n]+)
NO_BINARY_CHECK = true
category = Structured
description = JavaScript Object Notation format. For more information, visit http://json.org/
disabled = false
pulldown_type = true
TIMESTAMP_FIELDS = Rows{}.timestamp
TIME_FORMAT = %d-%m-%Y %H:%M:%S

The JSON input is a file, not an API response. Also, are there considerations I should make for truncation, such as adding a TRUNCATE = 0 setting as well?

Any help would be highly appreciated.

1 Solution

kamlesh_vaghela
SplunkTrust

@shakSplunk 

Can you please try this?

 

[Test123]
CHARSET=AUTO
LINE_BREAKER=}(,){\"timestamp\"
NO_BINARY_CHECK=true
SEDCMD-a=s/{"Rows": \[//g
SEDCMD-b=s/\]}//g
SHOULD_LINEMERGE=false

 


Thanks
KV
▄︻̷̿┻̿═━一

If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.


shakSplunk
Path Finder

Hi @kamlesh_vaghela ,

Sorry, one more request: do you mind explaining how this actually works?


kamlesh_vaghela
SplunkTrust

@shakSplunk 

In simple words: we just removed the unwanted characters from the incoming event and provided a line breaker. At that point each event is valid JSON with its own timestamp, and that is enough for Splunk's built-in timestamp-mapping capability in our use case. 😛
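To see the effect outside Splunk, here is a rough Python sketch of what those settings do. The sample string is abbreviated to two rows, and the lookaround regex only approximates how Splunk consumes the captured comma in LINE_BREAKER:

import json
import re

# Raw, single-line JSON as Splunk receives it (abbreviated to two rows)
raw = ('{"Rows": [{"timestamp": "03-06-2021 13:52:34","Region": "rcc"},'
       '{"timestamp": "03-06-2021 13:52:33","Region": "rcc"}]}')

# SEDCMD-a strips the leading {"Rows": [ wrapper ...
stripped = re.sub(r'{"Rows": \[', '', raw)
# ... and SEDCMD-b strips the trailing ]}
stripped = re.sub(r'\]}', '', stripped)

# LINE_BREAKER=}(,){\"timestamp\" breaks the stream at the comma between
# }, and {"timestamp", discarding only the captured comma; the lookarounds
# simulate that, so } and {"timestamp" stay with their own events
events = re.split(r'(?<=}),(?={"timestamp")', stripped)

for event in events:
    print(json.loads(event))  # each piece is now standalone, valid JSON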

For more information, please refer to the link below.

https://wiki.splunk.com/Community:HowIndexingWorks

If my answer resolved your issue, please accept it for the community.

In case any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.

Thanks
KV
▄︻̷̿┻̿═━一


shakSplunk
Path Finder

Thanks for the explanation! One more question @kamlesh_vaghela 

What are the impacts if this is a large JSON file? I've seen that I may potentially have to use the TRUNCATE = 0 config line. Is that true, and is it the correct solution for a large character count in the JSON file?
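That is, would it look something like this added to the stanza (just my guess at the syntax)?

[Test123]
# TRUNCATE caps event size in bytes (default 10000); 0 removes the cap
TRUNCATE = 0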


shakSplunk
Path Finder

Hi @kamlesh_vaghela 

Thanks for the help; unfortunately, however, it didn't work on my side.

What I did was go to the Splunk Enterprise web UI, select Settings > Source Types, and then edit my source type with the suggested settings. The remaining fields were filled in by default when I went to click Save.

I've gone through the web UI because when I edit the props.conf file located in Splunk/etc/system/local and save it, the change isn't reflected in the Splunk web instance even when I refresh the page; however, when I make an update through the web UI, it is reflected in the file.



kamlesh_vaghela
SplunkTrust

@shakSplunk 

The new configuration will only apply to newly incoming data. (Also note that edits made directly to props.conf on disk generally require a Splunk restart before they take effect, which may be why your change wasn't reflected in the web UI.)

So, to validate this configuration, here's what I suggest: create an A.json file with sample events and upload it to an index on your local instance. Please go through the link below for the steps.

 

https://docs.splunk.com/Documentation/Splunk/8.2.0/SearchTutorial/GetthetutorialdataintoSplunk
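For example, you can one-shot the file from the CLI (the sourcetype and index names below are just examples):

$SPLUNK_HOME/bin/splunk add oneshot /tmp/A.json -sourcetype Test123 -index main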

 

One more thing: FYI, I have tried it on the _raw event below.

 

{"Rows": [{"timestamp": "03-06-2021 13:52:34","Region": "rcc","Hostname": "lx206","Version": "123","Environment": "E"},{"timestamp": "03-06-2021 13:52:33","Region": "rcc","Hostname": "lx206","Version": "123","Environment": "E"},{"timestamp": "03-06-2021 13:52:32","Region": "rcc","Hostname": "lx206","Version": "123","Environment": "S"},{"timestamp": "03-06-2021 13:52:31","Region": "rcc","Hostname": "lx206","Version": "123","Catridge": "UPP","CatridgeType": "Product","Environment": "S"}]}

 

Please let me know if you have multiline events or any other type of event.

 

Thanks
KV
▄︻̷̿┻̿═━一

If any of my replies helps you solve the problem or gain knowledge, an upvote would be appreciated.

 


shakSplunk
Path Finder

Thank you so much @kamlesh_vaghela, I realised the problem was that the input file was pretty-printed JSON with line breaks and not in raw (single-line) form. That's why it was giving Splunk issues. Now it's all working, thanks!
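In case it helps anyone else, this is roughly how the file can be flattened before uploading (file names are just examples):

import json

# Re-serialise pretty-printed JSON as the single-line "raw" form
with open("pretty.json") as f:         # example input file
    data = json.load(f)

with open("flat.json", "w") as f:      # example output file
    json.dump(data, f, separators=(",", ":"))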
