Getting Data In

Why would a universal forwarder be needed if it is unable to restrict or filter data?

vikas_gopal
Builder

Hi Experts,

Please clarify my doubts regarding the Universal Forwarder:
1) Is installing the UF on 60 machines (a mix of Linux/Windows) a good option, or is pulling the data remotely a better option?
2) Since the UF does not support filtering, what would be the best way to do it? I am aware I can achieve it with a heavy forwarder, but I do not want to do that.

Thanks
VG

1 Solution

jkat54
SplunkTrust
SplunkTrust

Actually I don't need the answer to that to answer your question.

If you just want to discard the data once it reaches the indexers, you can use SEDCMD in props.conf or even a TRANSFORM in props and some magic in transforms.conf.
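For instance, a SEDCMD rule that masks data at index time might look like the following sketch; the sourcetype name and the SSN-style pattern are illustrative assumptions, not from this thread:

```ini
# props.conf on the indexers (or a heavy forwarder)
# "my_sourcetype" and the SSN-style pattern are assumed examples
[my_sourcetype]
SEDCMD-mask_ssn = s/\d{3}-\d{2}-\d{4}/xxx-xx-xxxx/g
```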

If you want to discard the data before it reaches the indexers, you will need the heavy forwarders. This is usually the case when you have a remote location that is sending sensitive data through other networks on its way to the indexers. Instead of discarding or redacting the sensitive data after it's been transmitted through other networks, you redact it at the heavy forwarder and then send it over the other networks to your indexers.
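The props/transforms pairing for discarding events routes matches to the nullQueue. A minimal sketch, assuming a hypothetical sourcetype and regex:

```ini
# props.conf on the heavy forwarder (or indexers)
# "my_sourcetype" and the DEBUG regex are assumed examples
[my_sourcetype]
TRANSFORMS-drop_noise = drop_debug_events

# transforms.conf
[drop_debug_events]
REGEX = \bDEBUG\b
DEST_KEY = queue
FORMAT = nullQueue
```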

So are you sending sensitive data over public networks, or do you just need to discard some of the data before it's indexed?


wyfwa4
Communicator

A Windows UF does support filtering (at least from v6.3), as you can define the specific event codes you want to send.
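For example, a UF-side inputs.conf stanza that only forwards selected event codes might look like this sketch; the Security log and codes 4624/4625 are assumed examples:

```ini
# inputs.conf on the universal forwarder
# the Security log and the logon event codes are illustrative assumptions
[WinEventLog://Security]
whitelist = 4624,4625
```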

In addition, it reduces network traffic compared to remote data collection and offloads some of the processing load to the client (it will send cooked data to the indexer). When trying to collect data remotely, you have to manage remote access rights, such as needing a domain account with administration rights on the remote Windows box, or having to manage remote credentials for lots of client boxes. For Windows, remote WMI calls can be heavy and will likely result in both the client and server being more heavily loaded. Linux may be better able to allow remote data collection without the same impact.

A heavy forwarder has all the benefits of a UF, but requires more management, as you need to maintain your parsing rules in multiple places, and it takes more local resources to run. The only reason to use a heavy forwarder rather than a UF is if you have considerable network bandwidth issues and need to perform a lot of complex filtering or routing on the data.


jkat54
SplunkTrust
SplunkTrust

You will want to blacklist by key=value pair in your inputs.conf on the forwarders in this case...

[WinEventLog://Application]
...
blacklist1 = Type="Information"
blacklist2 = Type="Warning"


vikas_gopal
Builder

Cool, thanks man, will start working on it.


vikas_gopal
Builder

Well, I just want to discard some of the data before it's indexed, using the UF only. For example, in the Windows event log I do not want to index everything. So something like filtering by event level (Critical, Error, Warning, Informational, Verbose): out of these I just need Critical, Error, and Verbose; the rest (Warning and Informational) I want to discard.


jkat54
SplunkTrust
SplunkTrust

What filtering did you want to do?
