Splunk Search

How to do field extraction with rex?

Sangamesh
Explorer

I need to extract the value between >>>>||  || and the text after >>>>||  ||, referring to the sample below. The output should be like:

the value between >>>>||1407|| should be temp=1407

the text after >>>>||1407|| should be message=[POD CleanUp] File deleted from POD : /dfgd/dfgdfgdfg.dat

Here is the sample log:

{"source":"fdgdfdfg","log":"2023-08-21 04:07:12.400 INFO 42 --- [dfgdf] c.j.t.f.dgf.dfgd.dgf : >>>>||1407|| [POD CleanUp] File deleted from POD : /dfgd/dfgdfgdfg.dat","host":"xx-ret353.svr.gg.fghs.net","tags":["_dateparsefailure"],"@version":"1","Kubernetes.pod":"gkp-xcs-services-black-prd-67986d784-b6c5j","s_sourcetype":"tyu","@timestamp":"2023-08-21T08:07:28.420Z","Kubernetes.namespace":"80578d64606-56-fyt-ty-prod","appId":"1235","app_id":"2345","log_file":"/app/logs/app.log","Kubernetes.node":"sd-1564sw32b0f.svr.us.sdf.net"}

@ITWhisperer

 


ITWhisperer
SplunkTrust

This looks like JSON, so assuming you have already extracted the log field, try this:

| rex field=log ">>>>\|\|(?<temp>[^\|]+)\|\|\s(?<message>.+)"
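If you want to sanity-check this pattern outside Splunk, the same regex can be exercised in Python's `re` module (a quick sketch; Splunk's `(?<name>)` capture groups become `(?P<name>)` in Python):

```python
import re

# The "log" value from the sample event posted in the question.
log = ("2023-08-21 04:07:12.400 INFO 42 --- [dfgdf] c.j.t.f.dgf.dfgd.dgf : "
       ">>>>||1407|| [POD CleanUp] File deleted from POD : /dfgd/dfgdfgdfg.dat")

# Same pattern as the rex above: literal >>>>||, the temp value up to the
# next |, literal ||, one whitespace, then the rest as the message.
pattern = re.compile(r">>>>\|\|(?P<temp>[^|]+)\|\|\s(?P<message>.+)")
m = pattern.search(log)
print(m.group("temp"))     # 1407
print(m.group("message"))  # [POD CleanUp] File deleted from POD : /dfgd/dfgdfgdfg.dat
```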

Sangamesh
Explorer

I see a couple of logs starting with this log format too: <><><><>||1407||

Could you please provide the rex expression to go with the solution already provided, @ITWhisperer 


Sangamesh
Explorer

@ITWhisperer Will you be able to provide the rex for the below log format too?

<><><><>||1407||


ITWhisperer
SplunkTrust
| rex field=log "\<>\<>\<>\<>\|\|(?<temp>[^\|]+)\|\|\s(?<message>.+)"
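The same check works for this variant in Python's `re` (note that < and > need no escaping there). The thread only shows the `<><><><>||1407||` prefix, so the message text in this sample line is invented purely for illustration:

```python
import re

# Hypothetical sample line: only the "<><><><>||1407||" prefix appears in
# the thread; the trailing message here is made up for the demo.
log = "<><><><>||1407|| [POD CleanUp] File deleted from POD : /dfgd/dfgdfgdfg.dat"

pattern = re.compile(r"<><><><>\|\|(?P<temp>[^|]+)\|\|\s(?P<message>.+)")
m = pattern.search(log)
print(m.group("temp"))  # 1407
```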

Sangamesh
Explorer

@ITWhisperer The rex expression you provided is not fetching the values.


ITWhisperer
SplunkTrust

Please share the complete event which is not working for you (anonymised of course). Please use a code block </> so the formatting and special characters are preserved.


Sangamesh
Explorer

How can I extract the timestamp and Kubernetes.pod fields too, along with the solutions already provided?

@ITWhisperer 


ITWhisperer
SplunkTrust

Given that this looks like JSON, you should either already have these fields if you have ingested the log correctly, or you could use spath to extract them. If you want to continue with rex for these fields, try this:

| rex "timestamp\":\"(?<timestamp>[^\"]+)"
| rex "Kubernetes.pod\":\"(?<kubernetes_pod>[^\"]+)"
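These two patterns can also be sanity-checked in Python's `re` against the raw event string, the way rex sees _raw when the JSON has not been parsed (a sketch using a trimmed version of the sample event):

```python
import re

# Shortened raw event from the question, treated as a plain string.
raw = ('{"@timestamp":"2023-08-21T08:07:28.420Z",'
       '"Kubernetes.pod":"gkp-xcs-services-black-prd-67986d784-b6c5j"}')

# Anchor on the field name, then capture everything up to the closing quote.
ts = re.search(r'timestamp":"(?P<timestamp>[^"]+)', raw)
pod = re.search(r'Kubernetes.pod":"(?P<kubernetes_pod>[^"]+)', raw)
print(ts.group("timestamp"))        # 2023-08-21T08:07:28.420Z
print(pod.group("kubernetes_pod"))  # gkp-xcs-services-black-prd-67986d784-b6c5j
```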

isoutamo
SplunkTrust

Hi

One way to do it is to use separate rex expressions. Then it is not dependent on the order of those values in your log message. If you can be sure that the order is always the same, you can combine them into one rex. As you have JSON (based on your examples), you could also use the extract/kv command to extract those fields as JSON.

 

...
| rex "timestamp\":\"(?<timestamp>\d{4}-\d\d-\d\dT\d\d:\d\d:\d\d\.\d{3}[^\"]+)"
| rex "Kubernetes\.pod\":\"(?<kubernetes_pod>[^\"]+)"
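The stricter timestamp shape can be checked the same way in Python's `re` (a sketch against a trimmed raw event; the anchor on the field name means only the ISO-8601 @timestamp value matches, not the space-separated timestamp inside the log text):

```python
import re

# Trimmed raw event from the question, as a plain string.
raw = ('{"log":"2023-08-21 04:07:12.400 INFO ... >>>>||1407|| ...",'
       '"@timestamp":"2023-08-21T08:07:28.420Z",'
       '"Kubernetes.pod":"gkp-xcs-services-black-prd-67986d784-b6c5j"}')

ts = re.search(
    r'timestamp":"(?P<timestamp>\d{4}-\d\d-\d\dT\d\d:\d\d:\d\d\.\d{3}[^"]+)', raw)
pod = re.search(r'Kubernetes\.pod":"(?P<kubernetes_pod>[^"]+)', raw)
print(ts.group("timestamp"))        # 2023-08-21T08:07:28.420Z
print(pod.group("kubernetes_pod"))  # gkp-xcs-services-black-prd-67986d784-b6c5j
```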

 

r. Ismo 

Added missed ) for 1st one.


Sangamesh
Explorer

@isoutamo The solution you provided is not extracting the timestamp and Kubernetes.pod.


isoutamo
SplunkTrust

Based on your example, this should work after I fixed/added the missing ) in the timestamp part.
