Splunk Cloud Platform

Filtering out logs from particular sourcetypes

anandhalagaras1
Communicator

Hi Team,

We are ingesting Palo Alto firewall logs into Splunk from our syslog server, which we have therefore also set up as a heavy forwarder.

On the syslog heavy forwarder we have installed the "Splunk_TA_paloalto" add-on and configured the input with sourcetype "pan:firewall", so that the TA segregates the data into different sourcetypes such as "pan:hipmatch", "pan:userid", "pan:system", "pan:traffic" & "pan:threat".
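For reference, the input on the syslog heavy forwarder is just a file monitor assigned the pan:firewall sourcetype; a minimal sketch is below (the monitor path and index here are placeholders for illustration, not our actual values):

inputs.conf
[monitor:///var/log/remote/pan]
sourcetype = pan:firewall
index = abc
disabled = false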

 

Now we want to filter out the ingestion of a few of these sourcetypes, namely "pan:hipmatch", "pan:userid" & "pan:system". How can we filter those logs out before they are ingested into Splunk?

Where do I need to place the props and transforms, and what should they contain to filter the logs of those particular sourcetypes?

Kindly help with this request.

 

1 Solution

scelikok
SplunkTrust

Hi @anandhalagaras1,

You should place the props and transforms configs below on the heavy forwarder. You can play with the regex to filter other events too.

props.conf
[pan:firewall]
TRANSFORMS-filter = setnull

transforms.conf
[setnull]
REGEX = ^[^,]+,[^,]+,[^,]+,(?:HIPMATCH|SYSTEM|USERID),
DEST_KEY = queue
FORMAT = nullQueue
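
As an example of playing with the regex, here is a sketch of the same filter that also drops CONFIG events; the event names match the fourth comma-separated field of the PAN-OS log, and CONFIG is included purely for illustration:

transforms.conf
[setnull]
REGEX = ^(?:[^,]+,){3}(?:HIPMATCH|SYSTEM|USERID|CONFIG),
DEST_KEY = queue
FORMAT = nullQueue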

 

If this reply helps you, an upvote and "Accept as Solution" would be appreciated.


scelikok
SplunkTrust

Hi @anandhalagaras1,

If you are ingesting logs into two different indexes, you should be using two inputs. You can use the same config that I proposed, only changing the props stanza to key on the source instead.

As a sample, assuming you are getting the PAN logs on the syslog server via a file monitor on /log/syslog/pan:

props.conf
[source::/log/syslog/pan]
TRANSFORMS-filter = setnull

transforms.conf
[setnull]
REGEX = ^[^,]+,[^,]+,[^,]+,(?:HIPMATCH|SYSTEM|USERID),
DEST_KEY = queue
FORMAT = nullQueue
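
For example, if the two indexes are fed by two separate file monitors, the inputs could look roughly like this (the paths are assumptions for illustration; abc and def are the two index names in question). The filtering props stanza would then only reference the source that feeds index abc:

inputs.conf
[monitor:///log/syslog/pan_abc]
sourcetype = pan:firewall
index = abc

[monitor:///log/syslog/pan_def]
sourcetype = pan:firewall
index = def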

 

If this reply helps you, an upvote and "Accept as Solution" would be appreciated.

anandhalagaras1
Communicator

@scelikok ,

 

Thank you for your help. I have now resolved the issue on my end, and the logs from these sourcetypes are no longer getting ingested into Splunk.


anandhalagaras1
Communicator

@scelikok ,

As you mentioned, I have applied the filter based on sourcetype "pan:firewall" and placed it on our heavy forwarder server, but I can still see logs getting ingested into Splunk for pan:userid, pan:hipmatch & pan:system. Is this anything related to case sensitivity?

As per the "Splunk_TA_paloalto" add-on, the following is defined in its props.conf:

[pan:firewall]
category = Network & Security
description = Syslog from Palo Alto Networks Next-generation Firewall
pulldown_type = true
SHOULD_LINEMERGE = false
TIME_PREFIX = ^(?:[^,]*,){6}
MAX_TIMESTAMP_LOOKAHEAD = 32
TRANSFORMS-sourcetype = pan_threat, pan_traffic, pan_system, pan_config, pan_hipmatch, pan_correlation, pan_userid, pan_traps4
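
For context, pan_system, pan_userid, pan_hipmatch and the others listed above are sourcetype-renaming transforms; they look roughly like the sketch below (illustrative only, the exact regexes in Splunk_TA_paloalto will differ). They key off the same fourth comma-separated field that the nullQueue filter matches:

transforms.conf (from the add-on, simplified)
[pan_system]
REGEX = ^(?:[^,]*,){3}SYSTEM,
DEST_KEY = MetaData:Sourcetype
FORMAT = sourcetype::pan:system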

 

So kindly help me out with filtering those logs before they reach Splunk.

props.conf
[pan:firewall]
TRANSFORMS-filter = setnull

transforms.conf
[setnull]
REGEX = ^[^,]+,[^,]+,[^,]+,(?:HIPMATCH|SYSTEM|USERID),
DEST_KEY = queue
FORMAT = nullQueue

 


anandhalagaras1
Communicator

@scelikok ,

When I checked on my end, I saw that the source already had a props.conf stanza configured, i.e. with the timezone (TZ). So I removed the props I had created for the sourcetype, added the filter to the existing source stanza, and created the transforms.conf as mentioned.

props.conf:
[source::/var/log/remote/0000-Today/logfile/hostname]
TZ = CST6CDT
TRANSFORMS-filter = setnull220


transforms.conf:
[setnull220]
REGEX = ^[^,]+,[^,]+,[^,]+,(?:HIPMATCH|SYSTEM|USERID),
DEST_KEY = queue
FORMAT = nullQueue


But I can still see logs getting ingested into Splunk with the sourcetypes pan:userid, pan:system and pan:hipmatch.

The extracted values HIPMATCH, SYSTEM and USERID appear under the field "type".

So kindly let me know where the exact error is & how I can stop those logs from being ingested into Splunk.


anandhalagaras1
Communicator

@scelikok ,

I have also tried the regexes mentioned below, but the logs are still getting ingested into Splunk.

props.conf:
[source::/var/log/remote/0000-Today/logfile/hostname]
TZ = CST6CDT
TRANSFORMS-filter = setnull220

transforms.conf:
[setnull220]
REGEX = (?=USERID|SYSTEM|HIPMATCH)
DEST_KEY = queue
FORMAT = nullQueue

I also tried REGEX = (USERID|SYSTEM|HIPMATCH), but the data is still not getting filtered out before ingestion. So kindly help me with the same.


anandhalagaras1
Communicator

Hi All,

I tried the props and transforms with sourcetype "pan:firewall" on the syslog HF server, but it didn't work. So I have now tried directly with sourcetype "pan:userid" using the following stanza, but the logs are still getting ingested into Splunk.

I want to get them filtered out before ingestion so that we can save some license usage, but unfortunately it is not working.

This is the latest props and transforms that I used on the syslog HF server for filtering.

props.conf:

[pan:userid]
TRANSFORMS-filter = setnull_case07


transforms.conf
[setnull_case07]
REGEX = ^[^,]+,[^,]+,[^,]+,USERID,
DEST_KEY = queue
FORMAT = nullQueue

Could anyone kindly help with my request?


anandhalagaras1
Communicator

@scelikok 

Thank you for your prompt response. However, we have configured the pan:firewall sourcetype for two indexes:

Index= abc & def

So I want to filter the logs only from index=abc, sourcetype=pan:firewall; after the sourcetype segregation there, I want to filter out the HIPMATCH, SYSTEM and USERID events.

 

So kindly help on the same.

 
