Splunk Search

How to Silently Drop Events to nullQueue While Logging Skipped Event Metadata to a File in a Custom TA

asees
Explorer

I am building a custom Technology Add-on (TA) where I need to silently drop specific events using nullQueue but also log metadata about those dropped events to a separate log file for auditing purposes.

Here’s my scenario:

My Current Setup

  1. props.conf:

    [custom:app]
    TRUNCATE = 0
    TRANSFORMS-routing = route_network, route_app_events
  2. transforms.conf:

    # Drop all network heartbeat events
    [route_network]
    REGEX = .*CEF:0\|MyCompany\|NetworkMonitor\|[^|]+\|[^|]+\|Heartbeat\|
    DEST_KEY = queue
    FORMAT = nullQueue

    # Drop specific Windows events coming in CEF
    [route_app_events]
    REGEX = .*CEF:0\|Microsoft\|Windows\|[^|]+\|[^|]+\|(AppCrash|UpdateService|Security-Auditing|LicensingService)\|
    DEST_KEY = queue
    FORMAT = nullQueue

With the above configuration:

  • Any events matching these rules are discarded silently — which works perfectly.

  • However, I also need to log each dropped event type to a file like this:

    [2025-09-08 14:05:22] Network heartbeat event skipped
    [2025-09-08 14:10:37] Windows AppCrash event skipped

My Requirement

I need to:

  1. Continue silently dropping these events using nullQueue (no indexing or storage in Splunk index).

  2. Simultaneously write a small log entry to a file (e.g., $SPLUNK_HOME/var/log/splunk/skipped_events.log) whenever an event is skipped, for operational tracking.


PickleRick
SplunkTrust

There's no way to do it using built-in props/transforms functionality. Yes, you can filter out events. Yes, you could strip them to some minimal version and redirect to another index. No, you cannot write to a text file.
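To illustrate the second option (strip and reroute rather than drop): instead of sending matches to nullQueue, a transform can override the target index via `_MetaData:Index`. This is a hedged sketch only; the `skipped_audit` index name is hypothetical and would have to exist on the indexers, and the SEDCMD that trims the event body down to a short audit line is illustrative:

```
# transforms.conf -- hypothetical alternative: reroute instead of dropping
[route_network_audit]
REGEX = CEF:0\|MyCompany\|NetworkMonitor\|[^|]+\|[^|]+\|Heartbeat\|
DEST_KEY = _MetaData:Index
FORMAT = skipped_audit

# props.conf -- illustrative: shrink the rerouted event to a stub
[custom:app]
SEDCMD-audit_stub = s/^.*Heartbeat.*$/Network heartbeat event skipped/
```

The audit entries then live in a (presumably cheap, short-retention) index rather than a flat file, which is the closest built-in approximation of the requirement.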

A very very very ugly workaround would be to reroute such events to syslog and set up a local syslog receiver, but this is a Very Very Bad Idea (tm).


asees
Explorer

@PickleRick 
Is there any way we can use a Python script to achieve this?


PickleRick
SplunkTrust

If the data is already ingested into Splunk's "pipeline" - no.

You could use Python to create a modular input, but that would work at an earlier step - before the data is injected into the input queue.

richgalloway
SplunkTrust

You'll need to create a modular input to do that. Use regular expressions to test the incoming data, discard matches, and log the activity.
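The filtering core of such an input might look like the sketch below. This is not a complete modular input (it omits the Splunk SDK scaffolding, scheme declaration, and event output via `splunklib`); it only shows the decide-drop-and-log step. The rule labels and the `skip_log` file handle are illustrative names, and the patterns mirror the transforms.conf regexes from the question:

```python
import re
from datetime import datetime

# Hypothetical drop rules: audit label -> compiled pattern
# (mirrors the [route_network] / [route_app_events] regexes above).
DROP_RULES = {
    "Network heartbeat": re.compile(
        r"CEF:0\|MyCompany\|NetworkMonitor\|[^|]+\|[^|]+\|Heartbeat\|"),
    "Windows AppCrash": re.compile(
        r"CEF:0\|Microsoft\|Windows\|[^|]+\|[^|]+\|AppCrash\|"),
}

def process_event(raw_event, skip_log):
    """Return the event for indexing, or None if it was dropped.

    When an event matches a drop rule, a timestamped audit line is
    written to skip_log (any writable file-like object) instead.
    """
    for label, pattern in DROP_RULES.items():
        if pattern.search(raw_event):
            stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
            skip_log.write(f"[{stamp}] {label} event skipped\n")
            return None
    return raw_event
```

In a real modular input you would open something like `$SPLUNK_HOME/var/log/splunk/skipped_events.log` for appending, and forward every non-`None` return value to Splunk as an event; because the filtering happens before the input queue, nullQueue transforms are no longer needed for these rules.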

---
If this reply helps you, Karma would be appreciated.

asees
Explorer

@richgalloway Hey, can you please explain how to do it?
