Getting Data In

Converting a large JSON response into smaller events and combining objects

david_rose
Communicator

I have a large JSON response that I need to split into smaller events, as well as merge data from different points within the JSON into each resulting event. The JSON responses can be millions of lines long, hence the need to split them into events.

Here is a small sample of a response. The real data has thousands of ip_address values:

{
    "policy_name": "1_External_SCAN_All_External_Subnets",
    "policy_type": "external",
    "fast_scan": false,
    "execution_date": 1413114210,
    "execution_time": 20220,
    "scan_result": "completed",
    "scan_summary": {
        "skipped": 0,
        "partial": 0,
        "scanned": 0,
        "rollover": 0
    },
    "errors": [],
    "total_hosts": 2144,
    "vulnerable_hosts": [
        {
            "ip_address": "127.0.0.1",
            "vulns": [
                {
                    "title": "TLS 1.0 & TLS 1.1 Weak Encryption Protocol",
                    "risk_level": "Medium",
                    "service_protocol": "TCP",
                    "service_port": 443
                },
                {
                    "title": "SSL - Server Supports Weak SSL Ciphers",
                    "risk_level": "Medium",
                    "service_protocol": "TCP",
                    "service_port": 443
                },
                {
                    "title": "CVE-2016-2183 - DES, Triple DES - Sweet32 Issue",
                    "risk_level": "Medium",
                    "service_protocol": "TCP",
                    "service_port": 443
                },
                {
                    "title": "SSL - Certificate Hostname Discrepancy",
                    "risk_level": "Medium",
                    "service_protocol": "TCP",
                    "service_port": 443
                },
                {
                    "title": "CVE-2013-2566 - RC4 - Plaintext-Recovery Issue",
                    "risk_level": "Medium",
                    "service_protocol": "TCP",
                    "service_port": 443
                },
                {
                    "title": "Web Service is Running",
                    "risk_level": "Low",
                    "service_protocol": "TCP",
                    "service_port": 80
                },
                {
                    "title": "Web Service is Running",
                    "risk_level": "Low",
                    "service_protocol": "TCP",
                    "service_port": 443
                },
                {
                    "title": "SSL Protocol - BEAST Attack - Server-Side Mitigation",
                    "risk_level": "Low",
                    "service_protocol": "TCP",
                    "service_port": 443
                },
                {
                    "title": "3DES Detected",
                    "risk_level": "Low",
                    "service_protocol": "TCP",
                    "service_port": 443
                },
                {
                    "title": "Web Service is Running",
                    "risk_level": "Low",
                    "service_protocol": "TCP",
                    "service_port": 8045
                }
            ]
        }
    ],
    "total_vulns": 7171,
    "vulns_summary": {
        "Critical": 0,
        "High": 296,
        "Low": 4345,
        "Medium": 2530,
        "Unclassified": 0,
        "Urgent": 0
    }
}

Each vulns entry needs to be a separate event. I also need the ip_address, policy_name, and execution_date added to each event. I can split the events fine, but I have no idea how to attach the three extra fields to the events.

Sample event structure would look like:

"policy_name": "1_External_SCAN_All_External_Subnets"
"execution_date": 1413114210
"ip_address": "127.0.0.1"
"title": "TLS 1.0 & TLS 1.1 Weak Encryption Protocol",
"risk_level": "Medium",
"service_protocol": "TCP",
"service_port": 443