Splunk Enterprise

Routing events to nullQueue via transforms.conf does not work

NoSpaces
Contributor

Have a nice day, everyone!
I came across some unexpected behavior while trying to route some unwanted events to the nullQueue.
I have a sourcetype named 'exch_file_trans-front-recv'.
Events for this sourcetype are ingested by a universal forwarder with the settings below:

props.conf


[exch_file_trans-front-recv]
ANNOTATE_PUNCT = false
FIELD_HEADER_REGEX = ^#Fields:\s+(.*)
SHOULD_LINEMERGE = false
INDEXED_EXTRACTIONS = csv
TIMESTAMP_FIELDS = date_time
BREAK_ONLY_BEFORE_DATE = true
MAX_TIMESTAMP_LOOKAHEAD = 24
initCrcLength = 256
TRANSFORMS-no_column_headers = no_column_headers



transforms.conf


[no_column_headers]
REGEX = ^#.*
DEST_KEY = queue
FORMAT = nullQueue



This sourcetype contains some events that I want to drop before indexing. Here is an example:


2024-08-22T12:58:31.274Z,Sever01\Domain Infrastructure Sever01,08DCC212EB386972,6,172.25.57.26:25,172.21.255.8:29635,-,,Local



So, I'm interested in dropping events that match the pattern '...172.21.255.8:....,'.
To do that, I created the following settings on the indexer cluster tier:

props.conf


[exch_file_trans-front-recv]
TRANSFORMS-remove_trash = exch_file_trans-front-recv_rt0



transforms.conf


[exch_file_trans-front-recv_rt0]
REGEX = ^.*?,.*?,.*?,.*?,.*?,172.21.255.8:\d+,
DEST_KEY = queue
FORMAT = nullQueue



After applying this configuration across the indexer cluster, I still see new events matching this pattern being indexed.
What am I doing wrong?
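As a side note, the regex itself is not the obstacle here. A quick standalone check (using Python's re module, which is close to, though not identical to, Splunk's PCRE engine) shows that the indexer-side pattern does match the sample event:

```python
import re

# The indexer-side transform regex, exactly as in transforms.conf above.
pattern = r"^.*?,.*?,.*?,.*?,.*?,172.21.255.8:\d+,"

sample = (
    "2024-08-22T12:58:31.274Z,Sever01\\Domain Infrastructure Sever01,"
    "08DCC212EB386972,6,172.25.57.26:25,172.21.255.8:29635,-,,Local"
)

# The pattern matches, so the filtering failure lies elsewhere
# (as the accepted answer below explains: the data arrives already parsed).
print(bool(re.search(pattern, sample)))  # True
```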


PickleRick
SplunkTrust

1. Where do you put those props/transforms?

2. Do you use indexed extractions?


NoSpaces
Contributor

1. The first pair of props/transforms is on the universal forwarder.
The second pair is deployed on the indexer cluster tier.

2. Yes, I use indexed extractions.


PickleRick
SplunkTrust

Ah, right. I missed the configs from the UF. My bad; I could have explained this sooner.

When you use indexed extractions, the data is sent from the UF already parsed, and it is not processed again by downstream components (with a possible exception of index-time actions). I suppose you want to get rid of the header line(s). For that, you should use the structured-data header extraction parameters from https://docs.splunk.com/Documentation/Splunk/Latest/Admin/Propsconf#Structured_Data_Header_Extractio... instead, in particular PREAMBLE_REGEX or FIELD_HEADER_REGEX.
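As a rough sketch only (parameter names come from the structured-data section of the props.conf docs linked above; the exact regexes would need adapting to the real files), such filtering would live in the UF-side props.conf rather than on the indexers, e.g.:

```
[exch_file_trans-front-recv]
INDEXED_EXTRACTIONS = csv
# Assumption: unwanted preamble lines start with '#' but are not the
# '#Fields:' header line itself.
PREAMBLE_REGEX = ^#(?!Fields:).*
FIELD_HEADER_REGEX = ^#Fields:\s+(.*)
```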


NoSpaces
Contributor

Thank you for your helpful comment. I suspected that my configuration didn't work because of indexed extractions, but I hadn't had time to check and wasn't sure about it 😃
As for the preamble, I tested the settings you mentioned a couple of times, but each time they worked worse than the nullQueue approach.
Maybe I just wasn't attentive enough.


richgalloway
SplunkTrust

If you just want to filter on "*172.21.255.8*" then why do you have all that extra stuff in the regex?

Try this simpler version:

REGEX = ,172\.21\.255\.8:\d+,
---
If this reply helps you, Karma would be appreciated.
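A quick check (Python re standing in for Splunk's PCRE) confirms the simpler pattern matches the sample event from the question just as well:

```python
import re

sample = (
    "2024-08-22T12:58:31.274Z,Sever01\\Domain Infrastructure Sever01,"
    "08DCC212EB386972,6,172.25.57.26:25,172.21.255.8:29635,-,,Local"
)

# The suggested simpler pattern, with the dots properly escaped.
simple = r",172\.21\.255\.8:\d+,"
print(bool(re.search(simple, sample)))  # True
```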

NoSpaces
Contributor

Thank you for the suggestion. I will try this simpler approach.
But I wanted to avoid a possible situation where the pattern appears in a different position.
For example, if the pattern appeared after the sixth or seventh comma, that would not be a case I want to filter.
I'm not sure this situation can really occur, but I don't know how to check 😃
