I need to filter events when they contain an id from a defined set.
I know that Heavy Forwarders can filter events based on a regex, but since my list of identifiers changes each day I will need to frequently update the configuration file containing the regex and then restart the forwarder to pick up the change.
Is there a more dynamic way to filter events or is using a regex the only option?
One approach could be the following: create a file named filter.txt that contains all your filter strings, one per line. Next, define a new data input (Data inputs > Files & directories) for that file. Here, each line consists of a filter string:
(make sure the whole file is being monitored as one event)
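A sketch of what that input might look like. The path /opt/filters/filter.txt and the sourcetype name filter_list are placeholders, and the props.conf settings are one common way to keep Splunk from splitting the file into per-line events (a break-before regex that never matches):

```
# inputs.conf -- path and sourcetype name are examples only
[monitor:///opt/filters/filter.txt]
sourcetype = filter_list

# props.conf -- merge all lines of the file into a single event
[filter_list]
SHOULD_LINEMERGE = true
BREAK_ONLY_BEFORE_DATE = false
BREAK_ONLY_BEFORE = ^ZZZZ_NEVER_MATCHES$
```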
After this you could use join to filter, e.g. like this:
index="your_source_index" | join your_filter_field [search index="your_metaData_index" | head 1 | rex max_match=0 "(?<your_filter_field>.+)" | mvexpand your_filter_field | table your_filter_field]
I am also quite certain that inputlookup will work (it does). You have to add a new lookup file and a lookup definition, and you have to update that lookup file on your server in some way. A search could look like this:
index="your_source_index" | join your_filter_field [ inputlookup your_lookup | rename csv_header AS your_filter_field]
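For completeness, a sketch of the lookup setup behind that search. The stanza name your_lookup and the file name your_lookup.csv are placeholders; the CSV lives in a lookups directory of your app:

```
# transforms.conf -- register the CSV as a lookup
[your_lookup]
filename = your_lookup.csv

# your_lookup.csv (in the app's lookups directory)
# first line is the header referenced as csv_header in the search above
csv_header
id_001
id_002
```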
The latter solution has the drawback that you need access to the lookup file on the server in order to update it.
Thanks for the suggestion - however this is applying filtering at search time. For performance reasons I need to filter out events so they are not sent to the index.
One little note: you can reload configuration changes made to transforms.conf by running the following search in Splunk Web: "| extract reload=t"
(source: documentation of transforms.conf).
Since you are writing:
"(...) then restart the forwarder to pick up the change."
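Since the thread keeps coming back to the regex approach on the Heavy Forwarder, here is a sketch of what that configuration typically looks like. The sourcetype name your_sourcetype and the transform name drop_by_id are placeholders; the REGEX line is the part that would need updating whenever the id list changes:

```
# props.conf on the Heavy Forwarder -- sourcetype is a placeholder
[your_sourcetype]
TRANSFORMS-filter_ids = drop_by_id

# transforms.conf -- events matching any listed id are routed to the
# nullQueue and therefore never reach the indexers
[drop_by_id]
REGEX = id_001|id_002|id_003
DEST_KEY = queue
FORMAT = nullQueue
```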