Splunk Search

extract fields

afeng
New Member
Oct 22 14:20:45 10.5.0.200 DNAC {"version":"1.0.0","instanceId":"20fd8163-4ca8-424b-a5a9-1e4018372abb","eventId":"AUDIT_LOG_EVENT","namespace":"AUDIT_LOG","name":"AUDIT_LOG","description":"Executing command terminal width 0\nconfig t\nFailed to fetch the preview commands.\n","type":"AUDIT_LOG","category":"INFO","domain":"Audit","subDomain":"","severity":1,"source":"NA","timestamp":1729606845043,"details":{"requestPayloadDescriptor":"terminal width 0\nconfig t\nFailed to fetch the preview commands.\n","requestPayload":"\n"},"ciscoDnaEventLink":null,"note":null,"tntId":"630db6e989269c11640abd49","context":null,"userId":"system","i18n":null,"eventHierarchy":{"hierarchy":"20fd8163-4ca8-424b-a5a9-1e4018372abb","hierarchyDelimiter":"."},"message":null,"messageParams":null,"additionalDetails":{"eventMetadata":{"auditLogMetadata":{"type":"CLI","version":"1.0.0"}}},"parentInstanceId":"9dde297d-845e-40d0-aeb0-a11e141f95b5","network":{"siteId":"","deviceId":"10.7.140.2"},"isSimulated":false,"startTime":1729606845055,"dnacIP":"10.5.0.200","tenantId":"SYS0"}

How do I extract the ":"-separated fields?


yuanliu
SplunkTrust

(Note: when giving sample data, please use the code box.)  Your log mixes plain text with structured JSON, so the first task is to extract the JSON portion with rex, then extract fields from it using spath.

 

| rex "DNAC (?<json_msg>{.+})"
| spath input=json_msg

 

The description field from your sample data will then contain this value:

Executing command terminal width 0 config t Failed to fetch the preview commands.
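If you only need that one field, spath also accepts a path argument, so you don't have to extract every JSON key (a sketch building on the search above):

```
| rex "DNAC (?<json_msg>{.+})"
| spath input=json_msg path=description
```

This creates just the description field, which is cheaper on wide events like this one.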

Here is an emulation of your sample data.  Play with it and compare with real data

 

| makeresults
| eval _raw = "Oct 22 14:20:45 10.5.0.200 DNAC {\"version\":\"1.0.0\",\"instanceId\":\"20fd8163-4ca8-424b-a5a9-1e4018372abb\",\"eventId\":\"AUDIT_LOG_EVENT\",\"namespace\":\"AUDIT_LOG\",\"name\":\"AUDIT_LOG\",\"description\":\"Executing command terminal width 0\\nconfig t\\nFailed to fetch the preview commands.\\n\",\"type\":\"AUDIT_LOG\",\"category\":\"INFO\",\"domain\":\"Audit\",\"subDomain\":\"\",\"severity\":1,\"source\":\"NA\",\"timestamp\":1729606845043,\"details\":{\"requestPayloadDescriptor\":\"terminal width 0\\nconfig t\\nFailed to fetch the preview commands.\\n\",\"requestPayload\":\"\\n\"},\"ciscoDnaEventLink\":null,\"note\":null,\"tntId\":\"630db6e989269c11640abd49\",\"context\":null,\"userId\":\"system\",\"i18n\":null,\"eventHierarchy\":{\"hierarchy\":\"20fd8163-4ca8-424b-a5a9-1e4018372abb\",\"hierarchyDelimiter\":\".\"},\"message\":null,\"messageParams\":null,\"additionalDetails\":{\"eventMetadata\":{\"auditLogMetadata\":{\"type\":\"CLI\",\"version\":\"1.0.0\"}}},\"parentInstanceId\":\"9dde297d-845e-40d0-aeb0-a11e141f95b5\",\"network\":{\"siteId\":\"\",\"deviceId\":\"10.7.140.2\"},\"isSimulated\":false,\"startTime\":1729606845055,\"dnacIP\":\"10.5.0.200\",\"tenantId\":\"SYS0\"}"
``` data emulation above ```
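To compare with real data, append the extraction to the emulation and table a few fields (the field list here is just an illustrative pick; the dotted names are how spath flattens nested JSON):

```
| rex "DNAC (?<json_msg>{.+})"
| spath input=json_msg
| table description details.requestPayloadDescriptor userId network.deviceId
```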

 

 


afeng
New Member

I want to extract the 'description' field. It can be for the new messages.


inventsekar
SplunkTrust

Hi @afeng Do you want to extract fields from the already ingested/existing logs in the Splunk indexer (search time),

or

from new logs yet to be ingested into Splunk (are you using any add-ons or TAs? Are you using a UF and/or HF?)

 

thanks and best regards,
Sekar

PS - If this or any post helped you in any way, pls consider upvoting, thanks for reading !

afeng
New Member

Extracting from the new messages is fine. I tried 'Extract New Fields', but it was not easy to get working.
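If the wizard is awkward, an alternative that covers both new and existing events is a search-time extraction defined in props.conf on the search head. This is only a sketch: the sourcetype name dnac:syslog is an assumption (use your own), and it relies on calculated fields (EVAL-) being applied after field extractions (EXTRACT-), so json_msg is already available:

```
# props.conf -- sourcetype name is an assumption, substitute your own
[dnac:syslog]
# Pull the JSON blob that follows the "DNAC" token
EXTRACT-json_msg = DNAC\s+(?<json_msg>\{.+\})
# Calculated field: the spath() eval function reads the extracted json_msg
EVAL-description = spath(json_msg, "description")
```

With this in place, description is available in search without running rex/spath in every query.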
