Splunk Search

How can I filter logs containing a specific string in the username field, so that they won't be sent from the indexer to the heavy forwarder, using transforms.conf?

pavanae
Builder

I have a filter applied in transforms.conf as follows:

[send_to_heavy_forwarder]
CAN_OPTIMIZE = True
CLEAN_KEYS = True
DEFAULT_VALUE = 
DEST_KEY = _TCP_ROUTING
FORMAT = heavy_forwarder
KEEP_EMPTY_VALS = False
LOOKAHEAD = 4096
MATCH_LIMIT = 100000
MV_ADD = False
RECURSION_LIMIT = 1000
REGEX = (logtype::ABC.*id::IDB-28123.*username::((?!-TEST).)*$)
SOURCE_KEY = _meta
WRITE_META = False

All I'm trying to do here is filter which logs get sent, so that a log is forwarded only if the following conditions are satisfied:

logtype=ABC, id=IDB-28123, and the username value doesn't end with TEST

This is not working, but it does work if I remove the username part from the regex.

I suspect it's failing because of the negative lookahead I applied. If so, how can I filter out those test-user logs?

Any help would be great.

Check below for more details :-

Logs in Indexer :-

    pppppppppppp logtype::ABC id::IDB-28123  pppppppp username::ppppppp
    pppppppppppp logtype::ABC id::IDB-28123 pppppppp username::qqq
    pppppppppppp logtype::ABC id::IDB-28123 pppppppp username::rrTEST

Regex should skip sending the below logs :-

    pppppppppppp logtype::ABC id::IDB-28123 pppppppp username::rrTEST

Regex should send the below logs :-

    pppppppppppp logtype::ABC id::IDB-28123  pppppppp username::ppppppp
    pppppppppppp logtype::ABC id::IDB-28123 pppppppp username::qqq
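The behavior can be reproduced outside Splunk. The sketch below uses Python's `re` module (Splunk's REGEX option is PCRE, but the two engines agree on this pattern) to show that the original pattern still matches the `rrTEST` event, so that event is still routed:

```python
import re

# The original REGEX from the transforms.conf stanza above. The tempered
# token ((?!-TEST).)* only refuses usernames containing "-TEST" (with the
# hyphen), so a username ending in plain "TEST" slips through.
pattern = re.compile(r"logtype::ABC.*id::IDB-28123.*username::((?!-TEST).)*$")

events = [
    "pppppppppppp logtype::ABC id::IDB-28123  pppppppp username::ppppppp",
    "pppppppppppp logtype::ABC id::IDB-28123 pppppppp username::qqq",
    "pppppppppppp logtype::ABC id::IDB-28123 pppppppp username::rrTEST",
]

# A match means the event is routed to the heavy forwarder.
for event in events:
    print(bool(pattern.search(event)))  # True, True, True -- rrTEST is NOT excluded
```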
1 Solution

rmjharris
Path Finder

It's just the "-" in your negative lookahead. Currently you will only discard usernames containing "-TEST", and there is no "-" in your example (username::rrTEST). This should work:

REGEX = (logtype::ABC.*id::IDB-28123.*username::((?!TEST).)*$)

Edit: I don't know why two of the "*" weren't showing up.
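The fix can be checked the same way with Python's `re` (a quick sketch; the event strings are made-up stand-ins for the poster's logs):

```python
import re

# Corrected pattern from the answer: with the "-" removed, any username
# containing "TEST" makes the tempered walk ((?!TEST).)* fail before $,
# so the event no longer matches and is NOT routed.
pattern = re.compile(r"logtype::ABC.*id::IDB-28123.*username::((?!TEST).)*$")

ok_event   = "xxxx logtype::ABC id::IDB-28123 xxxx username::qqq"
test_event = "xxxx logtype::ABC id::IDB-28123 xxxx username::rrTEST"

print(bool(pattern.search(ok_event)))    # True  -> routed to the heavy forwarder
print(bool(pattern.search(test_event)))  # False -> skipped
```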



pavanae
Builder

Thanks @rmjharris. What if some username values have a "-" before TEST, like below?

username=abcd-TEST

Does your answer still work for that example too?


rmjharris
Path Finder

It will work if "TEST" (case-sensitive) appears anywhere in the username. Any of these will be filtered:
user-TEST
userTEST
TESTuser
usTESTer
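This can be verified with a small sketch using Python's `re` (equivalent to PCRE for this pattern): every username containing the case-sensitive substring "TEST" fails the tempered walk, so the event would not be routed, while a normal username still matches.

```python
import re

# Just the username part of the corrected pattern.
pattern = re.compile(r"username::((?!TEST).)*$")

names = ["user-TEST", "userTEST", "TESTuser", "usTESTer"]
results = {n: bool(pattern.search(f"id::IDB-28123 username::{n}")) for n in names}

print(results)                                              # all False -> filtered out
print(bool(pattern.search("id::IDB-28123 username::bob")))  # True -> still sent
```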


gcusello
SplunkTrust

Hi,
if you could share an example of your logs, it would be easier for me to check the regex to filter them!

Anyway, in the REGEX option you have to insert the exact regex for filtering your logs, so if your logs are something like these:

pppppppppppp logtype::ABC.123 pppppppid::IDB-28123.ppppp  pppppppp username::ppppppp
pppppppppppp logtype::ABC.123 qqqqqid::IDB-28123.qqqq pppppppp username::qqq
pppppppppppp logtype::ABC.123 rrrrrrid::IDB-28123.rrr pppppppp username::rrTEST

You could use a regex like this one (which you can test at https://regex101.com/r/D5HhNZ/1):

logtype::[^ ]*\s+\w+id::IDB-28123.\w+\s\w+\s+username::\w+TEST

which matches only the last event.

Bye.
Giuseppe


pavanae
Builder

Thanks for the response @gcusello. Here I want to skip the logs that have the string "TEST" at the end of the username field, and the regex you provided does just the opposite: in your example it should select all the remaining logs, not the one whose username ends with "TEST".


harsmarvania57
Ultra Champion

Hi,

Is it possible for you to provide some sample raw data (please mask any sensitive data)?
