
Blocking messages because license is exceeded (Sidewinder Firewall)

Michael_Schyma1
Contributor

The message below is an example of the events coming through from our Sidewinder firewalls (debug messages). I am trying to filter out these messages by type=t_debug, because it appears in 100% of them. Does anyone have any suggestions or ideas on how to get these events blocked?

  Nov  8 08:26:07 web.com Nov  8 08:26:06 iwalld auditd: date="Nov  8 13:26:06 2012 UTC",fac=f_acld,area=a_server,type=t_debug,pri=p_minor,pid=2978,ruid=0,euid=0,pgid=2978,logid=0,cmd=acld,domain=Acld,edomain=Acld,hostname=web.com,information="+|acld|DEBUG|MINOR|ACLD|SERVER=DNS unable to resolve 57.220.179.124: Unknown resolver error"

Within my transforms.conf, I created a stanza to send the data to the null queue:

[setnullSidewinderFirewall]
REGEX = type=t_debug
DEST_KEY = queue
FORMAT = nullQueue

Within my props.conf:

[SidewinderFirewall]
TRANSFORMS-null = setnullSidewinderFirewall
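
In case the sourcetype on these events is not literally SidewinderFirewall, props.conf also accepts host- or source-keyed stanzas; a minimal sketch, assuming web.com (taken from the sample event above) is the reporting host:

# props.conf -- host-keyed alternative; "web.com" is an assumption from the sample event
[host::web.com]
TRANSFORMS-null = setnullSidewinderFirewall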

Thank you, guys.


alacercogitatus
SplunkTrust

The syntax looks basically correct; I'm wondering if the regex isn't matching correctly.

What happens when you set REGEX = t.debug?

Also, I believe this requires a restart to take effect.

Ref:
http://docs.splunk.com/Documentation/Splunk/5.0/Deploy/Routeandfilterdatad#Filter_event_data_and_sen...
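
A quick way to confirm the pattern matches the raw events at search time, before worrying about the index-time config (the index name here is a placeholder you would need to adjust), is something like:

index=<your_index> sourcetype=SidewinderFirewall
| regex _raw="type=t_debug"
| head 10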


Michael_Schyma1
Contributor

I have restarted Splunk, and when I make that change to the regex, it still does not stop the events from coming through.
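
One thing that may be worth ruling out: nullQueue routing is applied where the data is parsed (an indexer or heavy forwarder, not a universal forwarder), so the props.conf and transforms.conf changes have to live on that instance. A sketch of checking with btool whether the stanzas are actually being read there (the stanza names below assume the configuration shown above):

$SPLUNK_HOME/bin/splunk btool props list SidewinderFirewall --debug
$SPLUNK_HOME/bin/splunk btool transforms list setnullSidewinderFirewall --debug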
