Splunk Search

Difficulty extracting fields

jcioffari
Explorer

I have events that look like this, and I am using the field extractor:

 

"timestamp": "2020-12-09T18:05:03.6664112Z",

"scopeType": "organization", "scopeDisplayName": "1D (Organization)",

"scopeId": "920941ec-025f-4d4c-9944-e7d357de7d94",

"actionId": "Deleted",

"data": {

"ProjectName": "ATI Libs",

"RepoId": "eb1e2a37-0833-462a-b3e6-031aa1d1f006",

"RepoName": "libs-01"

},

I tried to extract fields using both the delimiter option (":") and regex. When I use a delimiter of ",", it creates the first field, 'timestamp', correctly, but then lumps everything after that into a single field. When I try to use regex to extract a field, for example by highlighting the value "ATI Libs", I get this error:

"The extraction failed. If you are extracting multiple fields, try removing one or more fields. Start with extractions that are embedded within longer text strings."

Please advise, thanks.


somesoni2
Revered Legend

You might need a field transform to handle this field extraction. Have a read of this Splunk documentation:

https://docs.splunk.com/Documentation/Splunk/8.1.0/Knowledge/Exampleconfigurationsusingfieldtransfor...
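For illustration, here is a minimal sketch of what a search-time REPORT extraction could look like, assuming you control the sourcetype (the sourcetype and stanza names below are placeholders, and the regex assumes the keys appear in the order shown in the sample event):

props.conf:

# "your_audit_sourcetype" is a placeholder; use the actual sourcetype of these events
[your_audit_sourcetype]
REPORT-audit_fields = audit_field_extraction

transforms.conf:

# Captures the three values under "data"; assumes they always appear in this order
[audit_field_extraction]
REGEX = "ProjectName":\s*"([^"]+)",\s*"RepoId":\s*"([^"]+)",\s*"RepoName":\s*"([^"]+)"
FORMAT = ProjectName::$1 RepoId::$2 RepoName::$3

If the complete event is valid JSON, setting KV_MODE = json on the sourcetype stanza in props.conf is often enough on its own.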

 


ITWhisperer
SplunkTrust

This looks a bit like JSON. Is this part of a larger event? Could you use spath to extract the fields? Also, are you using the max_match=0 option in your rex command to extract multiple fields?
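Roughly something like this, assuming the whole event is valid JSON in _raw (the paths come from the sample above, and json_key / json_value are just illustrative field names):

| spath path=data.ProjectName output=ProjectName
| spath path=data.RepoName output=RepoName
| rex max_match=0 "\"(?<json_key>\w+)\"\s*:\s*\"(?<json_value>[^\"]*)\""

The spath calls pull out individual values by path, while rex with max_match=0 returns every key/value pair as multivalue fields.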


jcioffari
Explorer

I've tried spath, but I'm not seeing the fields getting extracted properly.
