Getting Data In

Windows Event filtering on Heavy-Forwarder

splunkcol
Builder

I am ingesting data from 100 Windows machines, and the events affecting my license consumption the most are 5156, 5157, 5158, 4658, 4663, 4656, and 4690.

I'm not sure whether I should filter them out or whether they could still yield valuable correlation events.

I have already filtered the first two following the Splunk documentation.

But my client doesn't want me to filter on the EventCode; they want me to filter on the "Application Name" instead.

The difference I see is that "EventCode" is written as a single word, while "Application Name" contains a blank space, and I don't know how to write the regular expression if I want to filter only on "Application Name".

 

For example:
Application Name: \device\harddiskvolume2\windows\system32\svchost.exe
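A transforms.conf stanza keyed on the "Application Name" field could look like the sketch below (the stanza name is my own, and the path is copied from the example above; treat it as an assumption, not a tested config). The space inside "Application Name" is just a literal character in the regex, so it needs no special handling, but the backslashes in the path must be escaped:

```
# Hypothetical stanza - name and path are assumptions from the example above
[null_svchost]
REGEX = (?m)^Application Name:\s+\\device\\harddiskvolume2\\windows\\system32\\svchost\.exe
DEST_KEY = queue
FORMAT = nullQueue
```

The `(?m)` flag makes `^` match at the start of each line of the multi-line event, so the regex anchors on the field line itself. The stanza would then be referenced from a `TRANSFORMS-` setting under `[WinEventLog:Security]` in props.conf, as in the example below.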

props.conf

[WinEventLog:Security]
TRANSFORMS-wmi=wminull

transforms.conf

[wminull]
REGEX=(?m)^EventCode=(592|593)
DEST_KEY=queue
FORMAT=nullQueue
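As a quick way to check the regex behavior outside Splunk, the same pattern can be tried against sample raw event text in Python (the event bodies below are illustrative, not taken from real logs):

```python
import re

# Illustrative multi-line raw events (not real Windows event payloads)
event_592 = "LogName=Security\nEventCode=592\nMessage=A new process has been created."
event_4663 = "LogName=Security\nEventCode=4663\nMessage=An attempt was made to access an object."

# Same pattern as in the transforms.conf stanza;
# (?m) lets ^ match at the start of every line, not just the event start
pattern = re.compile(r"(?m)^EventCode=(592|593)")

print(bool(pattern.search(event_592)))   # True  -> would be routed to nullQueue
print(bool(pattern.search(event_4663)))  # False -> would be indexed
```

Splunk applies the regex with PCRE rather than Python's `re`, but for a simple anchored alternation like this the behavior is the same.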

https://docs.splunk.com/Documentation/Splunk/6.6.2/Forwarding/Routeandfilterdatad


 

1 Solution

splunkcol
Builder

 

I was able to achieve the filtering like this:

transforms.conf

[wminull]
REGEX=(svchost\.exe)
DEST_KEY=queue
FORMAT=nullQueue
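One caution with this stanza: the regex is applied to the whole raw event, so a bare `svchost\.exe` will drop any Security event whose text mentions that string anywhere, not only events whose "Application Name" field points at svchost.exe. The Python sketch below contrasts the loose pattern with a field-anchored one (both events are illustrative examples, not real payloads):

```python
import re

# Illustrative raw events (not real Windows event payloads)
filtered = ("EventCode=5158\n"
            "Application Name: \\device\\harddiskvolume2\\windows\\system32\\svchost.exe")
bystander = ("EventCode=4688\n"
             "Message: Parent process svchost.exe started a child process.")

loose = re.compile(r"svchost\.exe")                              # matches anywhere in the event
strict = re.compile(r"(?m)^Application Name:.*\\svchost\.exe$")  # matches only the field line

print(bool(loose.search(filtered)), bool(loose.search(bystander)))    # True True
print(bool(strict.search(filtered)), bool(strict.search(bystander))) # True False
```

If dropping every event that mentions svchost.exe is acceptable, the simple pattern is fine; otherwise anchoring on the `Application Name:` line keeps bystander events from being discarded.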

 



splunkcol
Builder

 

I was finally able to filter with this stanza:

[screenshot of the transforms.conf stanza]

Note: the opening bracket is missing at the beginning of the stanza in the screenshot.

