Splunk Search

How do I auto-extract fields from a JSON event?

comcordriro
Explorer

Hi there, after much searching and testing I feel I'm stuck, or even unsure whether what I want is possible.

What I want

I have _json data indexed. Each event is one long array. I want Splunk to automatically make key:value pairs per array element. So far, Splunk gives me all the values of a field in one multivalue list instead of one single value, and it seems Splunk can't correlate the fields across the array elements either.

I want to use fields so I can do simple searches, like making a table of the "internal" "website_url"s and their status ("up" or "down").

 

Example event

{"data":[{"id":"1234567","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234567","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234562","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456","123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234563","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234564","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":[],"status":"up","tags":["internal"],"uptime":100},{"id":"1234567","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234562","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234560","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234562","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234568","paused":false,"name":"adyen","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":["external"],"uptime":100},{"id":"1234567","paused":false,"name":"paynl","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":["external"],"uptime":100},{"id":"1234562","paused":false,"name":"trustpay","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":["external"],"uptime":100},{"id":"1234563","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":["external"],"uptime":100},{"id":"1234566","paused":false,"name":"spryng","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":["external","sms 
gateway"],"uptime":100},{"id":"1234568","paused":false,"name":"messagebird","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":["external"],"uptime":100},{"id":"1234567","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234563","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":["external"],"uptime":100},{"id":"1234564","paused":false,"name":"mitek","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":["external"],"uptime":100},{"id":"1234566","paused":false,"name":"bitstamp","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":[external"],"uptime":100},{"id":"1234560","paused":false,"name":"kraken","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":3600,"contact_groups":[],"status":"up","tags":["external"],"uptime":100},{"id":"1234569","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100},{"id":"1234567","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":[],"uptime":100},{"id":"1234567","paused":false,"name":"Blox login","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":300,"contact_groups":["123456"],"status":"up","tags":[],"uptime":100},{"id":"1234567","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":[],"uptime":100},{"id":"1234564","paused":false,"name":"https:\/\/some.random.url\/with\/random\/string","website_url":"https:\/\/some.random.url\/with\/random\/string","test_type":"HTTP","check_rate":60,"contact_groups":["123456"],"status":"up","tags":["internal"],"uptime":100}],"metadata":{"page":1,"per_page":25,"page_count":2,"total_count":26}}

 

How far I got

 

source="/opt/splunk/etc/apps/randomname/bin/statuscake_api.sh"
| spath output=id path=data{}.id
| spath output=url path=data{}.website_url
| spath output=status path=data{}.status
| search id=8179640
| table id, url, status

However, it shows a table of all the array values, not just the one specific 'id' I specified in the search part: | search id=<idnumber>
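One search-time pattern that might get around this is to pull each array element out as raw JSON, expand it into its own result, and only then extract the fields, so that the id filter matches a single row. A sketch along those lines, using the same source and example id as above (field names like website_url and status are taken from the sample event):

source="/opt/splunk/etc/apps/randomname/bin/statuscake_api.sh"
| spath path=data{} output=item
| mvexpand item
| spath input=item
| search id=8179640
| table id, website_url, status

Here spath with path=data{} returns every element of the data array as one multivalue field, mvexpand turns that into one result per element, and the second spath extracts id, website_url, status and the other keys from each individual element.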

 

Screenshot

 

1 Solution

comcordriro
Explorer

Guess I used the wrong commands.

Solution:

 

 

earliest=-1min source="/opt/splunk/etc/apps/appname/bin/scriptname.sh"
| stats values(data{}.status) as status, latest(data{}.uptime) as uptime by data{}.website_url
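For the table of "internal" checks and their status that the question asks for, a variation that expands the array into one result per check before stats might also work. This is only a sketch; the tags{} field and the other field names are assumed from the sample event:

earliest=-1min source="/opt/splunk/etc/apps/appname/bin/scriptname.sh"
| spath path=data{} output=item
| mvexpand item
| spath input=item
| rename "tags{}" as tag
| search tag="internal"
| stats latest(status) as status, latest(uptime) as uptime by website_url

Expanding each array element into its own result before stats keeps status and uptime tied to the website_url they belong to, instead of one multivalue list per event.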

 

 



comcordriro
Explorer

Okay, because editing is not working, here is the screenshot that should have been attached.

Here you can see that Splunk returns one row with all the ids instead of one row with one specific id.

 

The dirty way I could get what I want is line breaking during index time, so that each ID becomes its own event, followed by a transform and field extraction. But I feel that's not the proper route: so many people use JSON input that I can't imagine everyone should have to do such an extensive setup for each JSON input.
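For reference, the index-time route would look roughly like the props.conf sketch below (untested, with a hypothetical sourcetype name): the LINE_BREAKER capture group drops the comma between array elements so each {...} object becomes its own event, and the SEDCMDs strip the surrounding {"data":[ ... ],"metadata":...} wrapper. The search-time spath/mvexpand approach above avoids having to change indexing at all, though.

# props.conf on the indexer or heavy forwarder (hypothetical sourcetype name)
[statuscake:json]
SHOULD_LINEMERGE = false
# break between array elements; only the comma in the capture group is discarded
LINE_BREAKER = }(,)\{"id"
# strip the array wrapper from the first and last element
SEDCMD-strip_head = s/^\{"data":\[//
SEDCMD-strip_tail = s/\],"metadata":.*$//
# the elements carry no timestamp, so stamp each event with the current time
DATETIME_CONFIG = CURRENT
# extract the JSON keys as search-time fields
KV_MODE = json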
