Getting Data In

How to ingest a selection of JSON fields?

PeraltaRH
Explorer

I have a dump.json file that collects events in JSON format:
{"key":"value","key":"value","key":"value","key":"value"....}

I have no problem processing it; however, each line has 400 keys and I only need 30 of them in Splunk.

How can I tell the Universal Forwarder to send only those 30 fields to my indexers?
Ingesting all 400 fields consumes a lot of resources and license.


isoutamo
SplunkTrust
Hi

With a UF, I suppose the easiest way is to modify the producer so that it writes only the needed fields into that JSON file. I suppose there is some program which continuously writes this file, one event per line?
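
If you cannot change the producer itself, a small pre-processing script that rewrites the file before the UF monitors it does the same job. A rough Python sketch, assuming one JSON event per line (the file names and the keys in KEEP_KEYS are only examples, not from your data):

#!/usr/bin/env python3
# Filter a newline-delimited JSON dump down to the fields you actually need,
# so the UF only ever sees the reduced events.
import json

KEEP_KEYS = {"host", "timestamp", "status"}  # put your ~30 wanted field names here

with open("dump.json") as src, open("dump_filtered.json", "w") as dst:
    for line in src:
        line = line.strip()
        if not line:
            continue
        event = json.loads(line)
        # keep only the wanted keys; the other ~370 never reach the forwarder
        filtered = {k: v for k, v in event.items() if k in KEEP_KEYS}
        dst.write(json.dumps(filtered) + "\n")

Then point the UF's monitor input at dump_filtered.json instead of dump.json.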

With an HF, you could use props and transforms to get rid of the unwanted values.
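
For example, a SEDCMD in props.conf on the HF can strip unwanted key/value pairs from _raw before indexing, which also reduces license usage. A minimal sketch only; the sourcetype and key names are illustrative, and with ~370 fields to drop you would need a pattern per group of keys:

# props.conf on the heavy forwarder (illustrative sourcetype name)
[my_json_dump]
# remove one unwanted key/value pair from _raw before it is indexed
SEDCMD-drop_unwanted = s/"unwanted_key":"[^"]*",?//g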

r. Ismo