Getting Data In

Sending specific events to nullQueue using props & transforms

Sid
Explorer

I am trying to set up props & transforms to send DEBUG events to the null queue.
I tried the regex below, but it doesn't seem to work.

transforms.conf:

[setnull]
REGEX = .+(DEBUG...).+$
DEST_KEY = queue
FORMAT = nullQueue
props.conf:
[sourcetype::risktrac_log]
TRANSFORMS-null=setnull

I used 

REGEX=\[\d{2}\/\d{2}\/\d{2}\s\d{2}:\d{2}:\d{2}:\d{3}\sEDT]\s+DEBUG\s.*

as well, but that doesn't drop the DEBUG messages either.

I just tried DEBUG as the regex too, with no luck. Can someone help me here, please?

Sample event:
[10/13/23 03:46:48:551 EDT] DEBUG DocumentCleanup.run 117 : /_documents document cleanup complete.
How does REGEX pick the pattern? I can see that both regexes are able to match the whole event.

We can't turn DEBUG off for the application.

1 Solution

gcusello
SplunkTrust

Hi @Sid,

First of all, if the stanza header refers to a sourcetype, you don't need to specify the sourcetype:: prefix:

[risktrac_log]
TRANSFORMS-null=setnull

Then use a simpler regex:

[setnull]
REGEX = DEBUG
DEST_KEY = queue
FORMAT = nullQueue

Lastly, where have you located these conf files?

They must be on the first full Splunk instance that the logs pass through, in other words on the first Heavy Forwarders or, if there are none, on the Indexers, not on the Universal Forwarders.

Ciao.

Giuseppe


Sid
Explorer

@gcusello thank you, yes, I am keeping it on the indexer.

Regarding that, a quick query: it's a Splunk Cloud environment (Classic Experience), and I am keeping props & transforms on the Splunk Cloud indexers. If we drop these events on the Splunk Cloud indexers using props & transforms, would they still count against SVCs? I am asking this because the nullQueue routing happens after parsing, so the processing still takes place.
On-prem, as far as I know, it won't count against licensing because indexing won't happen. How does it work in Splunk Cloud?


PickleRick
SplunkTrust

There are two types of licensing for the Cloud.

One is ingest pricing, where you pay for the amount of data that gets indexed. In this model, if you drop events before indexing, they don't get written to the index, so they don't count against your license.

The other model is workload licensing, where you pay for the "computing power" used to process your data and for the allocated storage. In this case, dropping events will not affect license usage directly.

gcusello
SplunkTrust

Hi @Sid,

When I use Splunk Cloud, I usually have one or two Heavy Forwarders on premise that I use to concentrate all the logs from my on-premise infrastructure, so I can apply these configurations to the HFs.

If you don't have an on-premise HF and send logs directly to Splunk Cloud, you have to upload these two conf files in an add-on.
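
For reference, a minimal private app carrying these settings might be laid out like the following sketch (the app name TA-nullqueue-debug is only a placeholder):

TA-nullqueue-debug/
    default/
        app.conf
        props.conf       (the [risktrac_log] stanza with TRANSFORMS-null)
        transforms.conf  (the [setnull] stanza with REGEX, DEST_KEY and FORMAT)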

Ciao.

Giuseppe

Sid
Explorer

@gcusello 

It's not dropping those DEBUG events.

props.conf:

[risktrac_log]
LINE_BREAKER = ([\r\n]+)\[\d{2}.\d{2}.\d{2}.\d{2}.\d{2}.\d{2}.\d+.\w+
MAX_TIMESTAMP_LOOKAHEAD = 35
SHOULD_LINEMERGE = 0
TIME_PREFIX = ^
TRUNCATE = 99999
pulldown_type = 1
TRANSFORMS-null = setnull
 
transforms.conf:
[setnull]
REGEX = DEBUG
DEST_KEY = queue
FORMAT = nullQueue

gcusello
SplunkTrust

Hi @Sid ,

Where have you located the conf files: on Splunk Cloud or on an on-premise system?

As I said, they must be located on the first full Splunk instance that the data passes through.

Ciao.

Giuseppe


Sid
Explorer

I don't have any HF in my environment, so I kept it as a custom app on the Cloud indexers.


gcusello
SplunkTrust

Hi @Sid ,

How did you push the two conf files to Splunk Cloud?

The conf files are correct.

Ciao.

Giuseppe

 


Sid
Explorer

Hi @gcusello ,
I created the add-on using Add-on Builder and uploaded it on the Splunk Cloud SH.

In Classic Experience it is deployed on the indexers as well.


gcusello
SplunkTrust

Hi @Sid ,

I'm not sure that an app uploaded to Splunk Cloud is also installed on the Indexers.

Open a case with Splunk Support for this.

Ciao.

Giuseppe


Sid
Explorer

Hi @gcusello ,

It does; we are doing indexer-level props & transforms for other sourcetypes as well, and it is working fine.
The documentation also says the same:
Manage private apps on your Splunk Cloud Platform deployment - Splunk Documentation

"When you install an app using self-service app installation on Classic Experience, the app is automatically installed on all regular search heads and search head cluster members across your deployment. The app is also installed on indexers"


gcusello
SplunkTrust

Hi @Sid ,

Open a case with Splunk Support to be sure.

Ciao.

Giuseppe

Sid
Explorer

Hi @gcusello ,

It was the setnull stanza: it was also being used by another app, and that copy was taking precedence over this one, which is why mine was not being taken into consideration.

I changed the setnull stanza in transforms.conf to a more meaningful and unique name, and that worked.
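
For reference, the renamed pair might look something like this (the stanza and class names below are only an example of a unique naming choice):

transforms.conf:

[setnull_risktrac_debug]
REGEX = DEBUG
DEST_KEY = queue
FORMAT = nullQueue

props.conf:

[risktrac_log]
TRANSFORMS-null_risktrac_debug = setnull_risktrac_debug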

Thanks a lot for your help.


PickleRick
SplunkTrust

Lessons learned:

1) Use btool (or REST in the case of Cloud) to see the effective config (see the example commands below).

2) Use a unique naming scheme so you don't accidentally clash with settings from other chunks of config.
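
As a sketch of what that check can look like (using the setnull stanza name from this thread): on an instance where you have CLI access, something like

$SPLUNK_HOME/bin/splunk btool transforms list setnull --debug

shows the effective stanza and which file each setting comes from, while from a search head an SPL search such as

| rest /services/configs/conf-transforms/setnull

returns the merged stanza as Splunk sees it.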

Sid
Explorer

@PickleRick 
How would you use REST on Splunk Cloud indexers? Isn't it restricted to the SH only?


PickleRick
SplunkTrust

True. But the same app is getting pushed to the indexers and to the SHs, so a REST query for the transform definition should return the same result regardless of whether it's run against a SH or an indexer.


gcusello
SplunkTrust

Hi @Sid,

How are you collecting these logs, from a Universal Forwarder?

Ciao.

Giuseppe


Sid
Explorer

Hi @gcusello ,
Yes, UFs.
