Getting Data In

How to ignore imported data

pansplunktest
New Member

Hi,

I am using an external data source named "firewall" and I want to ignore data like

"Apr  2 16:06:15 firewall device_id=abcde1234  [Root]system-critical-00033: Src IP session limit! From a.b.c.d to i.j.k.l, proto 1 (zone Trust int  ethernet0/3). Occurred 2 times. (2013-04-02 16:06:14)"

which contains "From a.b.c.d".

I tried to configure "props.conf":

[source::firewall]
TRANSFORMS-null = setnull

[setnull]
REGEX = From\sa.b.c.d
DEST_KEY = queue
FORMAT = nullQueue

But it shows the warning:

Possible typo in stanza [setnull] in /opt/splunk/etc/system/local/props.conf, line 5: REGEX  =  From\sa.b.c.d
Possible typo in stanza [setnull] in /opt/splunk/etc/system/local/props.conf, line 6: DEST_KEY  =  queue
Possible typo in stanza [setnull] in /opt/splunk/etc/system/local/props.conf, line 7: FORMAT  =  nullQueue

How should I modify the config to fix this? Thanks in advance.


pansplunktest
New Member

Thanks. I was missing the transforms.conf file.


Ayn
Legend

The setnull transform should go into transforms.conf, not props.conf. Read the docs, here: http://docs.splunk.com/Documentation/Splunk/5.0.2/Deploy/Routeandfilterdatad#Discard_specific_events...
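In other words, the split would look something like this (a sketch using the placeholder source name and IP from the question; note the dots in the regex are escaped so they match literal dots rather than any character):

```
# props.conf — only references the transform by name
[source::firewall]
TRANSFORMS-null = setnull

# transforms.conf — defines the transform itself
[setnull]
REGEX = From\sa\.b\.c\.d
DEST_KEY = queue
FORMAT = nullQueue
```

The "Possible typo" warnings appeared because REGEX, DEST_KEY, and FORMAT are not recognized settings in props.conf; once the [setnull] stanza is moved to transforms.conf, they are valid. Restart Splunk (or reload the config) on the parsing tier for the change to take effect.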
