Getting Data In

How can I extract fields depending on the type of event?

evinasco08
Explorer

Hi,

If I have logs like these, with different types of data in the same sourcetype:

"<134>Nov 23 21:23:17 NSX-edge-7-0 loadbalancer[2196]: [default]: 154545"
 

"<4>Nov 23 21:06:47 NSX-edge-7-0 firewall[]: [default]: ACCEPT"

How can I extract the value after "[default]: " without extracting null values?

For example, if in the first event I create a field called "FIELDA=154545", I don't want the second event's value ("ACCEPT") to end up in FIELDA; instead I need a second field called "FIELDB=ACCEPT".

I hope I've made myself understood.

Regards,


bowesmana
SplunkTrust

These rex statements will do it

| rex field=x "\[default]: (?<FIELDA>\d+)"
| rex field=x "\[default]: (?<FIELDB>[A-Z]+)"

but they are simplistic: the first (FIELDA) matches only one or more digits, and the second (FIELDB) matches only uppercase characters A-Z.
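If you want to sanity-check these patterns outside Splunk, here is a quick Python sketch using the standard `re` module. The regexes are the same as in the rex statements above, except that Splunk's `(?<name>...)` named-group syntax becomes Python's `(?P<name>...)`; the sample events are the two from the question.

```python
import re

# The two sample events from the question
events = [
    "<134>Nov 23 21:23:17 NSX-edge-7-0 loadbalancer[2196]: [default]: 154545",
    "<4>Nov 23 21:06:47 NSX-edge-7-0 firewall[]: [default]: ACCEPT",
]

# Same patterns as the two rex statements, in Python named-group syntax
field_a = re.compile(r"\[default]: (?P<FIELDA>\d+)")     # digits only
field_b = re.compile(r"\[default]: (?P<FIELDB>[A-Z]+)")  # uppercase letters only

for event in events:
    m = field_a.search(event)
    if m:
        print("FIELDA =", m.group("FIELDA"))
    m = field_b.search(event)
    if m:
        print("FIELDB =", m.group("FIELDB"))
```

Each event matches exactly one of the two patterns, so FIELDA is only created for the numeric event and FIELDB only for the uppercase one.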

 


evinasco08
Explorer

Thanks for your help.

So, what if the values in FIELDA are not always digits?


bowesmana
SplunkTrust

You need to define the rules and then you can define the regex.

What are the different permutations of the values that follow [default]?


evinasco08
Explorer

Hi,

There are four event types; I'll share these examples:

<134>Nov 24 14:09:52 NSX-edge-7-0 loadbalancer[2196]: [default]: 192.168.0.12:53184 [24/Nov/2022:14:09:52.006] CMP_RP_virtualserver CMP_RP_Pool/cmp_rp1_member 1/0/12 3132 -- 4/4/3/3/0 0/0
<28>Nov 24 14:09:00 NSX-edge-7-0 config[]: [default]: WARN :: C_UTILS :: File /var/db/networkmonitor/monitor_status.dat not exist
<30>Nov 24 14:09:00 NSX-edge-7-0 config[]: [default]: INFO :: loadbalancer stats :: member stats.pool:CMP_RP_Pool,member:cmp_rp2_member,ip:172.xx.xx.x,port:xxx,status:1,vip:CMP_RP_virtualserver
<4>Nov 24 14:09:56 NSX-edge-7-0 firewall[]: [default]: ACCEPT_131091IN= OUT=vNic_0 src=10.2.xx.xx DST=172.xx.xx.xx LEN=62 TOS=0x00 PREC=0x00 TTL=63 ID=31370 DF PROTO=UDP SPT=xx818 DPT=xx LEN=xx

 

As you can see, the values before [default] differ, and they could serve as a differentiator:

loadbalancer[2196]

config[]

firewall[]

Could I use those to define the rules?

 

 


bowesmana
SplunkTrust

If those are the differentiators, then you could do this, assuming your data is in the _raw field:

| rex "loadbalancer.*\[default\]: (?<lb_field>.*)"
| rex "config.*\[default\]: (?<cnf_field>.*)"
| rex "firewall.*\[default\]: (?<fw_field>.*)"

That would create three different field names.
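To illustrate the three-regex approach outside Splunk, here is a Python sketch with the same patterns against (truncated) versions of the four... well, three of the sample events, one per differentiator. Each event matches exactly one pattern, so each gets exactly one of the three field names.

```python
import re

# Truncated versions of the sample events (one per differentiator)
events = [
    "<134>Nov 24 14:09:52 NSX-edge-7-0 loadbalancer[2196]: [default]: 192.168.0.12:53184 [24/Nov/2022:14:09:52.006] ...",
    "<28>Nov 24 14:09:00 NSX-edge-7-0 config[]: [default]: WARN :: C_UTILS :: File not exist ...",
    "<4>Nov 24 14:09:56 NSX-edge-7-0 firewall[]: [default]: ACCEPT_131091IN= OUT=vNic_0 ...",
]

# Same three patterns as the rex statements above
patterns = {
    "lb_field":  re.compile(r"loadbalancer.*\[default\]: (.*)"),
    "cnf_field": re.compile(r"config.*\[default\]: (.*)"),
    "fw_field":  re.compile(r"firewall.*\[default\]: (.*)"),
}

for event in events:
    # Only the pattern whose keyword appears before [default] matches
    fields = {name: m.group(1)
              for name, p in patterns.items()
              if (m := p.search(event))}
    print(fields)
```

One caveat worth noting: these patterns match their keyword anywhere in the event, so an event whose message body happens to contain e.g. "loadbalancer" followed by another "[default]: " could match more than one pattern; anchoring on the NSX-edge prefix (as in the next variant) avoids that.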

You could also do it a little bit differently with

| rex "(?<fn>(firewall|loadbalancer|config)).*\[default\]: (?<val>.*)"
| eval {fn}=val
| fields - fn val

That is, a single rex line extracts the name/value pair, and the eval then creates a field named after the text captured in the first part of the row.

Another more flexible alternative is this

| rex "NSX-edge-\d+-\d+\s+(?<fn>[^\[]*).*\[default\]: (?<val>.*)"
| eval {fn}=val
| fields - fn val

where it will now take ANY category of name from the text after the NSX message up to the opening [ and then the value after [default]
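The same flexible pattern can be checked in Python; the dict built below stands in for what Splunk's `eval {fn}=val` does (a field whose *name* is the captured category). Only the named-group syntax changes from `(?<fn>...)` to Python's `(?P<fn>...)`.

```python
import re

# Same pattern as the flexible rex above, Python named-group syntax
pattern = re.compile(r"NSX-edge-\d+-\d+\s+(?P<fn>[^\[]*).*\[default\]: (?P<val>.*)")

# Truncated versions of the sample events
events = [
    "<134>Nov 24 14:09:52 NSX-edge-7-0 loadbalancer[2196]: [default]: 192.168.0.12:53184 ...",
    "<28>Nov 24 14:09:00 NSX-edge-7-0 config[]: [default]: WARN :: C_UTILS :: ...",
    "<4>Nov 24 14:09:56 NSX-edge-7-0 firewall[]: [default]: ACCEPT_131091IN= OUT=vNic_0 ...",
]

for event in events:
    m = pattern.search(event)
    if m:
        # Mimics | eval {fn}=val - the category becomes the field name
        print({m.group("fn"): m.group("val")})
```

Because `[^\[]*` stops at the first opening bracket after the NSX-edge prefix, the category is captured cleanly whether the bracket contains a PID (loadbalancer[2196]) or is empty (config[], firewall[]).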


evinasco08
Explorer

Thanks for your help. Do you know how I could integrate this through transforms.conf rules?


bowesmana
SplunkTrust

Create a field transformation with this

FORMAT = $1::$2
REGEX = NSX-edge-\d+-\d+\s+([^\[]*).*\[default\]: (.*)

and then a field extraction using that transformation

Note that this looks for the fixed text 'NSX-edge' followed by the digit pattern, but it will extract the field names loadbalancer, config and firewall, with the associated value following [default].
