Getting Data In

Can you help me send events to null queue from a farm of heavy forwarders (syslog servers)?

iatwal
Path Finder

I have these types of logs coming into Splunk today from 3 heavy forwarders (syslog servers) via inputs.conf apps I've deployed from a deployer.

Sep 27 07:11:08 hq1acptrvra1202.me.com ea_tomcat: env=ACPT  profile=claymore
Sep 27 07:11:08 hq1acptrvra1202.me.com ea_tomcat: env=ACPT  profile=claymore  (nmon) CMD (/etc/nmon-logger/bin/nmon_helper.sh /etc/nmon-logger /var/log/nmon-logger >> /var/log/nmon-logger/nmon_collect.log 2>&1)

I want to send all events with "nmon" in them to the null queue. I created an app to push props/transforms to the heavy forwarders, and for consistency I sent the same app to our cluster of indexers. The nmon logs are still coming in. What are we missing?

Source:

/vcaclog/ACPT/broker-fad-api/hq1acptrvra0775.me.com/ea_tomcat.log

Every segment after /vcaclog/ can be dynamic.

props.conf

[source::/vcaclog/*]
TRANSFORMS-null= setnull-test

transforms.conf

[setnull-test]
REGEX = (?m)(nmon)
DEST_KEY = queue
FORMAT = nullQueue
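
(Side note: one way to confirm the settings are actually being picked up on a heavy forwarder is btool; the paths below assume a default $SPLUNK_HOME and that the app is deployed under etc/apps.)

$SPLUNK_HOME/bin/splunk btool props list --debug | grep -A2 vcaclog
$SPLUNK_HOME/bin/splunk btool transforms list setnull-test --debug

The --debug flag shows which app and file each setting is being read from.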

FrankVl
Ultra Champion

Try:

[source::/vcaclog/...]

* doesn't match across / characters in source.
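
For context, with that change the props.conf pushed to the heavy forwarders would look something like this (keeping the transform name setnull-test from the question):

props.conf

[source::/vcaclog/...]
TRANSFORMS-null = setnull-test

transforms.conf stays as posted. The ... wildcard matches any number of characters including /, so it covers the dynamic subdirectories under /vcaclog/, whereas * stops at the next /.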



iatwal
Path Finder

Thank you, this worked!


FrankVl
Ultra Champion

Glad to hear that 🙂

Please mark the answer as accepted, so others who stumble upon the same question can quickly find the correct answer 🙂


iatwal
Path Finder

So after /vcaclog/ I should have 3 dots (/vcaclog/...)?

