Getting Data In

Can't get filter to work on windows

oilmouse
New Member
Hello,

I have a test script that writes out hello_d01 to hello_d10 every 5 seconds... for instance:

16:04:14.36 hello_d01
16:04:14.36 hello_d02
16:04:14.36 hello_d03
16:04:14.36 hello_d04
16:04:14.36 hello_d05
16:04:14.36 hello_d06
16:04:14.36 hello_d07
16:04:14.36 hello_d08
16:04:14.36 hello_d09
16:04:14.36 hello_d10

A Splunk forwarder is set up to feed this log file to the indexer.

On the indexer, I have the following props.conf and transforms.conf, and I only want to keep the hello_d03 and hello_d04 events:

props.conf
  [source::c:\\a\\]
  TRANSFORMS-set=sco_setnull,sco_setparsing

transforms.conf
  [sco_setnull]
  REGEX = .
  DEST_KEY = queue
  FORMAT = nullQueue

  [sco_setparsing]
  REGEX = (hello_d03|hello_d04)
  DEST_KEY = queue
  FORMAT = indexQueue


Upon splunkd restart, I'm still seeing all hello_d01 to hello_d10 events as if there were no props.conf and transforms.conf.

What am I missing please?


Thanks.


Cheers,
Jack

msettipane
Splunk Employee

If you are collecting data from a Universal Forwarder and want to filter at the indexer level, the following should work.

# ON UNIVERSAL FORWARDER

inputs.conf
[monitor://c:\program files\directory\log.txt]
sourcetype = xyz
index = abc

# ON INDEXER

# props.conf
# USING SOURCETYPE STANZA
[xyz]
TRANSFORMS-set= sco_setnull,sco_setparsing

# transforms.conf
# SENDS EVERYTHING TO NULLQUEUE FIRST
[sco_setnull]
REGEX = .
DEST_KEY = queue
FORMAT = nullQueue

# INDEX EVENTS THAT CONTAIN REGEX
[sco_setparsing]
REGEX = (hello_d03|hello_d04)
DEST_KEY = queue
FORMAT = indexQueue

Make sure you are not sending from a heavy (full) forwarder, where parsing may already have been done before the data reaches the indexer. Also make sure you do not have competing sourcetype or source stanzas applying to the same data source.
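For intuition, the routing semantics above (everything to nullQueue first, then matching events rerouted to indexQueue) can be sketched as a small Python simulation. This is only an illustration of the documented last-match-wins ordering of TRANSFORMS, not Splunk internals:

```python
import re

# Transforms are applied in the order listed in TRANSFORMS-set; each one
# whose REGEX matches the event overwrites the routing queue, so the last
# matching transform wins.
TRANSFORMS = [
    ("sco_setnull", re.compile(r"."), "nullQueue"),
    ("sco_setparsing", re.compile(r"(hello_d03|hello_d04)"), "indexQueue"),
]

def route(event: str) -> str:
    """Return the final queue for an event after applying all transforms."""
    queue = "indexQueue"  # default: events are indexed unless rerouted
    for _name, regex, dest in TRANSFORMS:
        if regex.search(event):
            queue = dest
    return queue

events = [f"16:04:14.36 hello_d{i:02d}" for i in range(1, 11)]
kept = [e for e in events if route(e) == "indexQueue"]
```

Run against the ten hello_d lines from the question, only the hello_d03 and hello_d04 events end up in indexQueue; everything else stays in nullQueue and is dropped.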


Runals
Motivator

Not sure where you stand on this but it showed up in the weekly newsletter.

The way I would probably try to tackle this is with the forwarder's inputs.conf file, something along the lines of:

[monitor://<path>]
whitelist = (d03|d04)
index = yourIndex
sourcetype = yourSourceType

However, you've indicated you simply want to forward it all and let the indexer deal with it. I had more written out, but while looking at some documentation (which turns out to be the same page MHibbin references above) I realized your example lines up with the example given there. I don't know why it wouldn't be working, other than perhaps your source:: path being off just a touch. Hopefully someone else will chime in.


kristian_kolb
Ultra Champion

The events you are seeing are not the old events, right? You know that this will not alter anything that has already been indexed.

Well, this was probably not news to you, but I thought I'd mention it...
/k


oilmouse
New Member

I changed my test environment such that there is now no forwarder involved. It is now using Files & Directories Data Input.

It is still not working, so I believe the problem is in props.conf or transforms.conf.

Any idea on how to debug this is appreciated.

Cheers,
Jack


oilmouse
New Member

I've updated my forwarder's inputs.conf and added sourcetype = dlog.

In my props.conf, I now have:
[dlog]
TRANSFORMS-set=sco_setnull,sco_setparsing

Unfortunately it is still not working.

Any more clues, please? Is there a way to verify transforms.conf?

Thanks.

Cheers,
Jack
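On verifying what splunkd actually loads: Splunk ships a btool utility for exactly this. A sketch, assuming the sourcetype is dlog as in the props.conf above and that you run it from $SPLUNK_HOME/bin on the indexer:

```shell
# Show the props.conf settings splunkd resolves for the stanza,
# with the file each setting came from
./splunk btool props list dlog --debug

# Same for the transforms referenced by TRANSFORMS-set
./splunk btool transforms list sco_setnull --debug
./splunk btool transforms list sco_setparsing --debug
```

If your stanzas don't appear in the output, splunkd never loaded them (wrong app directory, typo in the stanza name, or an overriding config elsewhere).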


kristian_kolb
Ultra Champion

I suggest that you use the sourcetype of the data instead of the source. That removes one level of confusion regarding forward/backslashes and the need to escape them.

In your inputs.conf you probably have a [monitor://c:\a\d.log] or similar. Underneath that you should specify a sourcetype, sourcetype=blah.

Then you can use that ([blah]) in your props.conf instead of [source::something_with_slashes].

NB: you don't have to create the sourcetype; Splunk will already have created one for you (probably based on the file name).
/k


oilmouse
New Member

I tried the variations below and couldn't get any of them to work:

props.conf
[source::c:\a\]
TRANSFORMS-set=sco_setnull,sco_setparsing

props.conf
[source::c:\\a\\]
TRANSFORMS-set=sco_setnull,sco_setparsing

props.conf
[source::c:\a\d.log]
TRANSFORMS-set=sco_setnull,sco_setparsing

props.conf
[source::c:\\a\\d.log]
TRANSFORMS-set=sco_setnull,sco_setparsing


jonuwz
Influencer

Your source filter in props.conf doesn't match that.


oilmouse
New Member

The source is c:\a\d.log


MHibbin
Influencer

Hi oilmouse,

Based on the tags you've used, I'm assuming you are running a Universal Forwarder (if you have a heavy forwarder, I apologize). You require a heavy forwarder to route or filter data at the event level. Please see the following extract from the documentation:

Important: Only heavy forwarders can route or filter data at the event level. Universal forwarders and light forwarders do not have the ability to inspect individual events, but they can still forward data based on a data stream's host, source, or source type. They can also route based on the data's input stanza, as described below, in the subtopic, "Route inputs to specific indexers based on the data's input".

Ref: http://docs.splunk.com/Documentation/Splunk/5.0.2/Deploy/Routeandfilterdatad


oilmouse
New Member

Thank you for your reply.
My goal is to set up a universal forwarder that forwards everything, and let the indexer do the event filtering and discard unwanted events.
Is this scenario supported? And do I have the right settings in the config files?
Thanks.
Cheers,
Jack


jonuwz
Influencer

What's the source on the events you receive? c:\a\ seems a bit strange...
