Building for the Splunk Platform

My Indexer is not processing "Keep specific events and discard the rest"

GersonGarcia
Path Finder

Hello all,
I have an app that generates a lot of data, and it is eating into my license. We only need this data for one sensitive customer.
So I followed "Keep specific events and discard the rest" from
https://docs.splunk.com/Documentation/Splunk/6.5.2/Forwarding/Routeandfilterdatad
My props.conf is:

[metrics-app-gateway]
TRANSFORMS-set= setnull,setparsing

And transforms.conf:

[setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

[setparsing]
REGEX=lasconpr01vmw08.las.ssnsgs.net
DEST_KEY = queue
FORMAT = indexQueue

But for some reason it is not working. When I comment out the TRANSFORMS-set line in props.conf, the file is ingested properly.
How can I troubleshoot this? Why is the REGEX not matching?

Thank you
Gerson Garcia.

1 Solution

chrisyounger
SplunkTrust

Hi @GersonGarcia

Does the string lasconpr01vmw08.las.ssnsgs.net exist in the _raw message? By default the REGEX is applied to _raw, so it won't match the metadata fields.

Secondly, these props and transforms are parse-time settings (unless you are using indexed extractions), which means they need to go on the indexer, or on the heavy forwarder if the data passes through one.

Hope this helps.
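As a troubleshooting aid, btool can confirm that the stanzas are actually being loaded on the box doing the parsing. The commands below assume CLI access to that Splunk instance (indexer or heavy forwarder):

splunk btool props list metrics-app-gateway --debug
splunk btool transforms list setnull --debug
splunk btool transforms list setparsing --debug

The --debug flag shows which .conf file each setting comes from, which helps catch a stanza sitting on the wrong instance.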



GersonGarcia
Path Finder

Ah, that may be the reason. Is there any way I can use a metadata field? Something like:
REGEX=host::lasconpr01vmw08.las.ssnsgs.net

The props.conf and transforms.conf are in the Indexer

Thank you.


chrisyounger
SplunkTrust

This should work:

[host::lasconpr01vmw08.las.ssnsgs.net]
TRANSFORMS-set= setparsing

[host::<other hosts>]
TRANSFORMS-set= setnull

Note that you will have to fill in the other hosts yourself.
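A concrete sketch of this approach (the second hostname is hypothetical; each excluded host needs its own stanza, and props.conf also accepts wildcards in host:: stanza names):

[host::lasconpr01vmw08.las.ssnsgs.net]
TRANSFORMS-set = setparsing

[host::otherhost01.example.net]
TRANSFORMS-set = setnull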


GersonGarcia
Path Finder

But hold on, if I do that, I will exclude everything else (all apps and logs):

[host::<other hosts>]
TRANSFORMS-set= setnull

Right?


chrisyounger
SplunkTrust

Yes, that's right, so you should set it very carefully. Sorry, I should have made that clearer.


GersonGarcia
Path Finder

I can't do that. I just need to stop indexing this particular sourcetype, [metrics-app-gateway], for all hosts except lasconpr01vmw08.las.ssnsgs.net.


chrisyounger
SplunkTrust

OK, give this a try. Use the configs from your original message, but add SOURCE_KEY like so:

[setparsing]
REGEX=lasconpr01vmw08.las.ssnsgs.net
DEST_KEY = queue
SOURCE_KEY = MetaData:Host
FORMAT = indexQueue
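
For completeness, the props.conf side stays as in the original post. Note that the order in TRANSFORMS-set matters: setnull must come first so that setparsing can re-route matching events back to the index queue afterwards (the routing docs linked above call this out):

[metrics-app-gateway]
TRANSFORMS-set = setnull, setparsing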

GersonGarcia
Path Finder

It didn't work; it is not matching the REGEX. In the documentation I found:

MetaData:Host : The host associated with the event.
The value must be prefixed by "host::"

What does it mean?

In my inputs.conf I have:

[monitor:///usr/ssn/gateway/logs/metrics.log]
host = lasconpr01vmw08.las.ssnsgs.net
sourcetype = metrics-app-gateway
_meta = ssnservice::CONED-PROD01
index = ssn

Should I add:

[monitor:///usr/ssn/gateway/logs/metrics.log]
host = lasconpr01vmw08.las.ssnsgs.net
sourcetype = metrics-app-gateway
_meta = ssnservice::CONED-PROD01 host::lasconpr01vmw08.las.ssnsgs.net
index = ssn
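
As an editorial aside, the regex behaviour itself can be sanity-checked outside Splunk. Assuming the transforms.conf REGEX behaves like an unanchored PCRE-style search against the SOURCE_KEY value (here, the host value with its literal "host::" prefix, as the docs quoted above describe), the prefix by itself should not prevent a substring match:

```python
import re

# Value Splunk stores under MetaData:Host, per the docs quoted above
# (host values carry a literal "host::" prefix).
meta_host = "host::lasconpr01vmw08.las.ssnsgs.net"

# Regex from the thread; the unescaped dots match any character,
# so escaping them is safer in general.
loose = re.compile(r"lasconpr01vmw08.las.ssnsgs.net")

# Stricter variant that includes the prefix and escapes the dots.
strict = re.compile(r"host::lasconpr01vmw08\.las\.ssnsgs\.net")

# An unanchored search finds the hostname in both cases.
print(bool(loose.search(meta_host)))   # True
print(bool(strict.search(meta_host)))  # True
```

If both match here but the transform still misroutes events, the problem is more likely where the config lives (indexer vs. forwarder) than the pattern itself.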