Deployment Architecture

Splunk Universal Forwarder: Sending data to two sets of indexers

koshyk
Super Champion

hi folks,

The scenario is as below:
1. Got a single deployment server
2. Got two indexer clusters, each with about 20+ indexers. One is exclusively for security purposes and the other indexer cluster is for performance/capacity/metrics/app-support etc.
3. Need only a single Splunk UF instance installed on the end-points/clients

The requirement is:
1. From the same Splunk UF, security information (e.g. wineventlogs, secure, auth logs) needs to be sent to indexer1_group, and the perfmon/metrics/application datasets etc. need to be sent to indexer2_group. So not data cloning, but routing of specific datasets.
2. Load balancing should happen only across the indexers within the same group/cluster.

I'm well aware of the data cloning capability, but my requirement above is slightly different. Can this be achieved?
We already manage inputs.conf in a modular fashion, so a particular sourcetype/source can be sent to the relevant index (rough illustration below).
But I'm trying to find a way to redirect specific data to the relevant indexer cluster without needing Heavy Forwarders.
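
For illustration, a trimmed-down version of one of our modular inputs.conf files looks roughly like this (sourcetype and index names are just placeholders):

[monitor:///var/log/secure]
sourcetype = linux_secure
index = os_security

[perfmon://CPU Load]
index = perf_metrics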

0 Karma
1 Solution

FrankVl
Ultra Champion

Edit: I forgot about the option mentioned by @vinod94: https://docs.splunk.com/Documentation/Splunk/latest/Forwarding/Routeandfilterdatad#Route_inputs_to_s...
That would indeed be the way to go, assuming the separate data sets are indeed produced by separate input configurations.

If the data does not come from separate inputs, then the only option is transforms-based routing and filtering, which is something for Heavy Forwarders.

So there are 3 options to solve this within Splunk:

1: Route by input, using the UF, as explained in the link above (a rough sketch follows below).
2: Clone the data to both clusters and drop the unwanted part at the indexers (if the additional network bandwidth is acceptable).
3: Send to a set of intermediate Heavy Forwarders that perform the routing and filtering (be careful not to create a bottleneck that causes poor data distribution across your indexers).
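
For option 1, a rough sketch of what that could look like (group names, hosts and ports are placeholders, not from your environment): define one tcpout group per indexer cluster in outputs.conf and pin each input to a group via _TCP_ROUTING in inputs.conf. Splunk then load balances automatically across the servers listed within each group, which also covers your second requirement.

outputs.conf:

[tcpout]
defaultGroup = ops_indexers

[tcpout:security_indexers]
server = sec-idx01:9997, sec-idx02:9997

[tcpout:ops_indexers]
server = ops-idx01:9997, ops-idx02:9997

inputs.conf:

[WinEventLog://Security]
_TCP_ROUTING = security_indexers

[perfmon://CPU Load]
_TCP_ROUTING = ops_indexers

Anything without an explicit _TCP_ROUTING falls back to the defaultGroup.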

koshyk
Super Champion

Thanks for the different options and for the link on the _TCP_ROUTING bit.

0 Karma

vinod94
Contributor

In the link it says it can be done through the UF... I thought that could be a workaround! But thanks for clarifying it 🙂

0 Karma

FrankVl
Ultra Champion

You were correct. It can be done through a UF as long as the separate data sets come from separate inputs. Let me clarify that a bit more in my answer; I actually missed that myself when I originally wrote it. Sorry for the confusion 🙂

0 Karma

koshyk
Super Champion

upvoted for your help

0 Karma

koshyk
Super Champion

Transforms-based routing is unfortunately only available on Heavy Forwarders (or full Splunk Enterprise), not on the UF.
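
For reference, on a Heavy Forwarder that transforms-based routing would look roughly like this (stanza and group names are placeholders; the tcpout groups themselves are defined in outputs.conf):

props.conf:

[WinEventLog:Security]
TRANSFORMS-route_security = route_to_security

transforms.conf:

[route_to_security]
REGEX = .
DEST_KEY = _TCP_ROUTING
FORMAT = security_indexers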

0 Karma

nareshinsvu
Builder

Do you have the option to create file shares for your wineventlogs, secure, and auth logs and give access only to indexer1_group? (through firewalls, secured permissions, etc.)

This way, you can read the logs directly from your indexers, with no need for any forwarder.

eg: inputs.conf

[monitor://\\remote-file-share-name\your-auth-log-file]

0 Karma

FrankVl
Ultra Champion

You can't read Windows event logs from a file (unless you install some tool that actually reads them from the API and writes them to a file first, but that seems overly complicated).

You can read them remotely using WMI, but that performs and scales very poorly and is not recommended.
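
For completeness, if you did accept those limitations, remote WMI collection is configured in wmi.conf on a full Splunk instance, roughly like this (host names and interval are placeholders):

wmi.conf:

[WMI:remote_security_logs]
server = winhost01, winhost02
event_log_file = Security
interval = 10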

0 Karma