Getting Data In

Filtering (discarding) logs using a Heavy Forwarder: regex filter fails after transforms reload

fahmed11
Explorer

I'm using an on-prem Heavy Forwarder to filter out some noisy logs coming in via syslog (the HF is installed on the syslog server). The logs are then forwarded to our Splunk Cloud instances.

I configured inputs.conf, props.conf, and transforms.conf, with a regex in the transform that routes the unwanted events to nullQueue so they get dropped. I reloaded the transforms using the "refresh" URL below (without restarting the entire splunkd service, as described here). This was working perfectly, as expected.

http://your-heavy-forwarder-splunk-server:8000/en-GB/debug/refresh
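
Roughly, the setup looks like this (the monitor path, sourcetype, stanza names, and regex below are placeholders, not my real values):

inputs.conf on the HF:

[monitor:///var/log/devices/*.log]
sourcetype = my_syslog_sourcetype

props.conf on the HF:

[my_syslog_sourcetype]
TRANSFORMS-drop_noise = drop_noisy_events

transforms.conf on the HF:

# Events matching REGEX are routed to nullQueue, i.e. discarded before forwarding
[drop_noisy_events]
REGEX = <pattern matching the unwanted events>
DEST_KEY = queue
FORMAT = nullQueue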

I recently made a change to drop some more logs from a different file, so this time the changes went into a different set of inputs.conf, props.conf, and transforms.conf files than the first time. I used the same method to reload the transforms. As soon as I did that, the previous log filter stopped working for about 10 to 30 minutes and tons of garbage started flowing into our Splunk Cloud account (see the crazy bump shown below).

[Screenshot: fahmed11_0-1617112670910.png — ingestion volume spike in Splunk Cloud after the transforms reload]

After a while it stopped on its own, and the new filter now works as expected too (I'm so confused). However, as you can imagine, a huge burst of logs flowing into Splunk Cloud every time we want to discard some defeats the purpose of the whole exercise.

I want to understand if this is a known issue and if there is a way around it.
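
In case it's relevant: after each refresh I've also been checking on the HF that the merged configuration actually contains the new transform, using btool (the sourcetype and stanza names below are placeholders for my real ones):

$SPLUNK_HOME/bin/splunk btool props list my_syslog_sourcetype --debug
$SPLUNK_HOME/bin/splunk btool transforms list drop_noisy_events --debug

(btool shows the on-disk merged configuration and which file each setting comes from.)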

 

 
